Compare commits

...

132 Commits

Author SHA1 Message Date
Joona Hoikkala
ff0d851a1e WIP 2019-08-05 21:01:18 +03:00
Joona Hoikkala
5cfdfc80bd Basic assertions 2019-08-05 15:57:53 +03:00
Joona Hoikkala
f4c962073e Merge remote-tracking branch 'origin/apache-parser-v2' into parser_refactor 2019-08-05 15:34:12 +03:00
Brad Warren
31a8d086fc Merge pull request #7289 from certbot/fix-apache-parser-v2
Fix AppVeyor on the apache-parser-v2 branch
2019-08-02 11:22:53 -07:00
Adrien Ferrand
36b4c312c6 Upgrade virtualenv in dev/tests environments (#7287)
AppVeyor recently upgraded the Python 3.7.x installed in their VM to 3.7.4. However, virtualenv 16.6.1 is broken on that specific version of Python for Windows.

This PR upgrades the virtualenv installed for a dev/test environment from 16.6.1 to 16.6.2 in order to fix this issue, and repairs the CI jobs executed by AppVeyor on PRs.
2019-08-02 09:47:36 -07:00
Joona Hoikkala
973c521c3e Merge remote-tracking branch 'origin/master' into parser_refactor 2019-08-02 18:53:39 +03:00
Joona Hoikkala
526997107b Merge branch 'parser_refactor1' into parser_refactor 2019-08-02 18:53:08 +03:00
Joona Hoikkala
c1362b7437 Add more context to comment property docstring 2019-08-02 18:11:37 +03:00
Joona Hoikkala
371ece9ea6 Fix docstrings and mypy 2019-08-02 15:55:46 +03:00
Adrien Ferrand
56f609d4f5 Fix unit tests on Windows (#7270)
Fixes #6850

This PR makes the last corrections needed to run all unit tests on Windows:

* add a function to check whether a hook is executable in a cross-platform compatible way
* correctly handle the PATH surgery for Windows during hook execution
* correctly handle account compatibility over both ACMEv1 and ACMEv2
* remove (finally!) the @broken_on_windows decorator.

* Fix account_tests

* Fix hook executable test

* Remove the temporary decorator @broken_on_windows

* Fix util_test

* No broken unit test on Windows anymore

* More elegant mock

* Fix context manager

* Adapt coverage

* Corrections

* Adapt coverage

* Forbid os.access
2019-08-01 10:39:46 -07:00
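The cross-platform hook executability check mentioned in #7270 above could look roughly like the sketch below. This is illustrative only: the function name and heuristics are assumptions, not the code merged in that PR.

```
import os
import sys


def is_hook_executable(path):
    """Return True if the file at ``path`` can plausibly be executed."""
    if not os.path.isfile(path):
        return False
    if sys.platform == "win32":
        # Windows has no executable bit; a common heuristic is to check the
        # file extension against the ones listed in PATHEXT.
        pathext = os.environ.get("PATHEXT", ".COM;.EXE;.BAT;.CMD")
        allowed = {ext.upper() for ext in pathext.split(";") if ext}
        return os.path.splitext(path)[1].upper() in allowed
    return os.access(path, os.X_OK)
```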
Joona Hoikkala
dd762f3c79 Make coverage happy 2019-08-01 17:05:13 +03:00
Joona Hoikkala
8cfd228eee Add parameters, fix mypy and lint issues 2019-08-01 16:18:43 +03:00
Mikel Kew
2d3f3a042a Update dns-cloudflare docs regarding API Tokens (#7285)
A quick update to the docs to explicitly mention that the Cloudflare Global API Key must be used instead of an API Token.
2019-07-31 10:31:05 +02:00
Brad Warren
bfd4955bad Bump timeout waiting for ACME server to 4 minutes. (#7284)
* Bump timeout to 4 minutes.

* address review comments
2019-07-30 21:28:18 +02:00
Adrien Ferrand
9174c631d9 Disable TLS session tickets for Apache 2.4.11+ (#7191)
* Implement the logic

* Update tests

* Fix lint and changelog

* Update configurator.py

* Move the TLS configs in a dedicated folder. Fix the formalism of their naming and location.

* Improve existing test to check that all TLS configs have their hashes registered in Certbot

* Corrections after review

* Improve a test

* Remove commented useless lines in TLS configs

* Add a nice warning. Because I am nice.

* Fix lint

* Add a test
2019-07-29 22:54:51 +03:00
Joona Hoikkala
0aef744f29 Add dummy tests and change arguments to properties 2019-07-29 22:01:13 +03:00
Adrien Ferrand
81e0b92b43 Refer to ubuntu in install.rst (#6986)
Fixes #5758
2019-07-29 10:27:09 -07:00
Joona Hoikkala
4b77350b0a Update certbot-apache/certbot_apache/interfaces.py
Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-07-29 18:52:14 +03:00
Joona Hoikkala
6b5c4c9f79 Update certbot-apache/certbot_apache/interfaces.py
Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-07-29 18:52:04 +03:00
Brad Warren
d3da19919f Remove duplicate, failing oldest tests. (#7272)
Nightly tests failed last night at https://travis-ci.com/certbot/certbot/builds/120816454.

The cause was that the oldest version of Ubuntu used in the tests suddenly changed from Trusty to Xenial. You can see Xenial being used in the failing test at https://travis-ci.com/certbot/certbot/jobs/219873088#L9 and Trusty being used in the last passing test at https://travis-ci.com/certbot/certbot/jobs/218936290#L9. The change in the default doesn't seem to be documented (yet) at https://docs.travis-ci.com/user/reference/overview/.

I started to pin Trusty in these tests; however, I noticed that we are running these same unit tests at e6bf3fe7f8/.travis.yml (L58). These other tests are still succeeding because it appears that including `sudo: required` causes Travis to still default to Trusty.

Deleting these duplicated tests fixes our Travis failures and speeds things up ever so slightly.

* Remove duplicate, failing oldest tests.

* pin trusty
2019-07-26 13:37:16 -07:00
Adrien Ferrand
e6bf3fe7f8 [Windows] Security model for files permissions - STEP 3f (#7233)
* Correct file permissions on TempHandler

* Forbid os.chown and os.geteuid, as these functions can be harmful to the security model on Windows.

* Implement copy_ownership

* Apply copy_ownership

* Correct webroot tests (and activate another broken test !)

* Correct lint and mypy

* Ensure to apply mode in makedirs

* Apply strict permissions on directories created with tempfile.mkdtemp(), like on Unix.

* Ensure streamHandler has 0600 on Windows

* Reactivate a test on windows

* Pin oldest requirements to current internal libraries (acme and certbot)

* Dynamically add pywin32 to dependencies: always except for certbot-oldest, to avoid breaking the relevant tests.

* Administrative privileges are always required.

* Correct security implementation (not the logic yet)

* First correction. Allow file permissions to be finely manipulated during their generation

* Align to master + fix lint + resolve correctly symbolic links

* Add a test for windows about default paths

* Strengthen the detection of Linux/Windows to check the standard files layout.

* Fix lint and mypy

* Reflect the non-usage of cache discovery from the dns-google plugin in its tests, fixing the Windows tests along the way

* Apply suggestions from code review

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Add more details in a comment

* Retrigger build.

* Add documentation.

* Fix a test

* Correct RW clear down

* Update util.py

* Remove unused code

* Fix code style

* Adapt certbot coverage threshold on Linux due to Windows specific LOC addition.

* Various optimizations around file owner and file mode

* Fix last error

* Fix copy_ownership_and_apply_mode

* Fix lint

* Correct mypy

* Extract out first part from windows-file-permissions

* Ignore new_compat in coverage for now

* Create test package for compat

* Add unit tests for security module.

* Add pywin32

* Adapt linux coverages to the windows-specific LOCs added

* Clean imports

* Correct import

* Trigger CI

* Reactivate a test

* Create the certbot.compat package. Move logic in certbot.compat.misc

* Clean comment

* Add doc

* Fix lint

* Correct mypy

* Add executable permissions

* Add the delegate certbot.compat.os module, add check coding style to enforce usage of certbot.compat.os instead of standard os

* Load certbot.compat.os instead of os

* Move existing compat test

* Update local oldest requirements

* Import sys

* Fix some mocks

* Update account_test.py

* Update os.py

* Update os.py

* Update local oldest requirements

* Implement the new linter_plugin

* Fix remaining linting errors

* Fix local oldest for nginx

* Remove custom check in favor of pylint plugin

* Remove check coding style

* Update linter_plugin.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Add several comments

* Update the setup.py

* Add documentation

* Update acme dependencies

* Update certbot/compat/os.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update certbot/compat/os.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update certbot/compat/os.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update linter_plugin.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update linter_plugin.py

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Corrections

* Handle os.path. Simplify checker.

* Add a comment to a reference implementation

* Update changelog

* Fix module registering

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update docs/contributing.rst

Co-Authored-By: adferrand <adferrand@users.noreply.github.com>

* Update config and changelog

* Correction

* Correct os

* Fix merge

* Disable pylint checks

* Normalize imports

* Simplify security

* Corrections

* Reorganize module

* Clean code

* Clean code

* Remove coverage

* No cover

* Implement security.chmod

* Disable a test for now

* Disable hard error for now

* Add a first test. Remove unused import

* Recalibrate coverage

* Modifications for misc

* Correct function call

* Add some types

* Remove newline

* Use os_rename

* Implement security.open

* Revert to windows-files-permissions approach

* Fix lint

* Implement security.mkdir and security.makedirs

* Fix lint

* Clean lint

* Clean lint

* Revert "Clean lint"

This reverts commit 83bf81960ac6bf3f76c286ca065a5ac850c6870b.

* Correct mock

* Conditionally add pywin32 on setuptools versions that support environment markers.

* Fix separator

* Fix separator

* Rename security into filesystem

* Change module security to filesystem

* Move rename into filesystem

* Rename security into filesystem

* Rename security into filesystem

* Rerun CI

* Fix import

* Fix pylint

* Implement copy_ownership_and_apply_mode

* Fix pylint

* Update certbot/compat/os.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Remove default values

* Rewrite a comment.

* Relaunch CI

* Pass as keyword arguments

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Make the private key permissions transfer platform specific

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Rename variable

* Fix comment0

* Add unit test for copy_ownership_and_apply_mode

* Adapt coverage

* Implement new methods.

* Remove the old method

* Reimplement make_or_verify_dir

* Finish migration

* Start to fix tests

* Fix ownership when creating a file with filesystem.open

* Fix security on TempHandler

* Fix validation path permissions

* Fix owner on mkdir

* Use a proper workdir for crypto tests

* Fix pylint

* Adapt coverage

* Update storage_test.py

* Update util_test.py

* Clean code

* Update certbot/compat/filesystem.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Add comment

* Update certbot/compat/filesystem.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Check permissions

* Change test mode

* Add unit test for filesystem.check_* functions

* Update filesystem_test.py

* Better logic for TempHandler

* Adapt coverage
2019-07-26 00:25:36 +02:00
Joona Hoikkala
267a281a10 Make linter happy 2019-07-25 19:34:10 +03:00
alexzorin
40da709792 docs: s/certbot_tests/certbot_test/ (#7267) 2019-07-25 10:23:28 +02:00
Brad Warren
bf9c681c4f fix backwards logic (#7265) 2019-07-25 10:20:52 +02:00
alexzorin
391f301dd8 acme: Implement authz deactivation (#7254)
Resolves #4945. First PR in order to address #5116.

* acme: Implement authz deactivation

Resolves #4945

* update AUTHORS and CHANGELOG

* typos in mypy annotations

* formatting: missing newline

* improve test_deactivate_authorization

* improve deactivate_authorization

* test: s/STATUS_INVALID/STATUS_DEACTIVATED/

* simplify dict to keyword argument

* acme: add UpdateAuthorization

* acme: use UpdateAuthorization in deactivate_authz

and add mypy annotation

This allows deactivate_authorization to succeed for both ACME v1
and v2 servers.
2019-07-24 18:04:59 -07:00
Joona Hoikkala
fbd7d20d47 Move parserassert to a different PR and address some of the comments 2019-07-24 17:22:39 +03:00
Joona Hoikkala
41f24c73f5 Move include handling to its own method 2019-07-24 16:35:19 +03:00
Brad Warren
06a0dae67f Fix test_symlink_resolution on macOS. (#7263)
This fixes the test failures which can be seen at
https://travis-ci.com/certbot/certbot/builds/120123338.

The problem here is the path returned by tempfile.mkdtemp() contains a symlink.
For instance, one run of the function produced
'/var/folders/3b/zg8fdh5j71x92yyzc1tyllfw0000gp/T/tmp3k9ytfj1' which is a
symlink to
'/private/var/folders/3b/zg8fdh5j71x92yyzc1tyllfw0000gp/T/tmp3k9ytfj1'.

Removing this symlink before testing filesystem.realpath solves the problem.

You can see the macOS tests passing with this change at https://travis-ci.com/certbot/certbot/builds/120250667.
2019-07-23 11:01:29 -07:00
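A small illustration of the macOS behavior described in #7263 (paths are examples; actual tempfile output varies):

```
import os
import tempfile

tmp_dir = tempfile.mkdtemp()
resolved = os.path.realpath(tmp_dir)
# On macOS, tmp_dir may look like "/var/folders/.../tmpXXXX" while resolved is
# "/private/var/folders/.../tmpXXXX" because /var is a symlink to /private/var.
# Tests comparing resolved and unresolved paths therefore fail unless the
# temporary directory is resolved (or the symlink removed) first.
print(tmp_dir)
print(resolved)
```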
Joona Hoikkala
f74b8ad2ad Update certbot-apache/certbot_apache/interfaces.py
Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-07-23 13:20:16 +03:00
Joona Hoikkala
d8806a4c05 Add the test assertions, and fix interface 2019-07-22 16:18:08 +03:00
Adrien Ferrand
a35470292e Remove Dockerfiles (#7257) 2019-07-22 13:43:58 +03:00
Brad Warren
47f64c7280 Remove list of packaging efforts. (#7258)
I think this list maybe had value when distros were first starting to package Certbot, but now I don't think it does. What function does this list serve? The instruction generator at https://certbot.eff.org/instructions does a much better job telling users how to use these packages. On the packaging side, I think anyone capable of packaging Certbot at the various distros would be able to search their repositories to see if a Certbot package is available.

Since this list is hard to maintain as links semi-regularly break and keeping it up to date with all distros and all Certbot components is a fair bit of work, let's just remove it.

This PR was motivated by the Travis failures at https://travis-ci.com/certbot/website/builds/119588518 due to GNU Guix changing the layout of their site.
2019-07-19 10:44:17 -07:00
Joona Hoikkala
fd3644a4d7 Address some of the review comments 2019-07-19 18:48:50 +03:00
Brad Warren
f7c736da6f Update pexpect to fix Python 3.7 dev venvs. (#7259) 2019-07-18 15:44:01 -07:00
Adrien Ferrand
71ff47daad Implement a consistent realpath function in certbot.compat.filesystem (#7242)
Fixes #7115 

This PR creates a `realpath` method in `filesystem`, whose goal is to replace any call to `os.path.realpath` in Certbot. The reason is that `os.path.realpath` is broken on some versions of Python for Windows. See https://bugs.python.org/issue9949. The function created here works consistently across Linux and Windows.

As with the other forbidden functions in the `os` module, our `certbot.compat.os` will raise an exception if its `path.realpath` function is invoked, and using the `os` module from Python directly is forbidden by the pylint check implemented in our CI.

Every call to `os.path.realpath` is corrected in `certbot` and `certbot-apache` modules.

* Forbid os.path.realpath

* Finish implementation

* Use filesystem.realpath

* Control symlink loops also for Linux

* Add a test for forbidden method

* Import a new object from os.path module

* Use same approach of wrapping than certbot.compat.os

* Correct errors

* Fix dependencies

* Make path module internal
2019-07-18 14:31:39 -07:00
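A simplified sketch of the kind of realpath wrapper described in #7242, assuming manual symlink resolution with a loop guard (unlike a full implementation, it only follows links on the final path component):

```
import os

_MAX_LINK_DEPTH = 30  # arbitrary bound used here to detect symlink loops


def realpath(file_path):
    """Resolve symlinks in ``file_path`` and return an absolute path."""
    path = os.path.abspath(file_path)
    for _ in range(_MAX_LINK_DEPTH):
        if not os.path.islink(path):
            return path
        link_target = os.readlink(path)
        if not os.path.isabs(link_target):
            link_target = os.path.join(os.path.dirname(path), link_target)
        path = os.path.abspath(link_target)
    raise RuntimeError("Too many levels of symbolic links: %s" % file_path)
```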
J0WI
41a17f913e Use Buster as base image (#7251) 2019-07-17 13:05:02 -07:00
Joona Hoikkala
0b545ab443 New ParserNode interface abstraction 2019-07-15 18:33:19 +03:00
Po-Chuan Hsieh
750d6a9686 Unify license filename (LICENSE.txt) (#7239)
* Unify license filename (LICENSE.txt)
2019-07-12 22:53:43 +03:00
Adrien Ferrand
c4684f187a Add a test for the default directories on Windows (#7238)
There is a unit test to check that the default directories for Certbot are not diverging, in certbot.tests.cli_test:FlagDefaultTests:test_linux_directories.

But this test is not done on Windows.

This PR fixes that.
2019-07-11 17:49:52 -07:00
Lucid One
82ad736120 Fixes #7220 to allow config to be loaded from <(envsubst < template) (#7221)
* Fixes #7220 to allow config to be loaded from <(envsubst < template)
2019-07-11 14:40:24 -07:00
Brad Warren
ca893bd836 Merge pull request #7236 from certbot/candidate-0.36.0
Release 0.36.0
2019-07-11 14:00:49 -07:00
Erica Portnoy
d1934e36fe Bump version to 0.37.0 2019-07-11 12:31:53 -07:00
Erica Portnoy
15b1d8e5a7 Add contents to CHANGELOG.md for next version 2019-07-11 12:31:53 -07:00
Erica Portnoy
cbd0a37c7a Release 0.36.0 2019-07-11 12:31:51 -07:00
Erica Portnoy
13c44a0595 Update changelog for 0.36.0 release 2019-07-11 12:12:24 -07:00
Brad Warren
89f52ca9f9 Add mypy to contributing checklist. (#7224) 2019-07-10 18:14:12 -07:00
Brad Warren
d0a9695b09 Make PR template a checklist and suggest mypy. (#7223) 2019-07-10 18:14:01 -07:00
Brad Warren
add24d4861 Run tests on apache-parser-v2 (#7231)
We're planning on using the branch apache-parser-v2, allowing us to work incrementally on the new Apache parser and feel comfortable landing temporary test code that we don't really want in master.

The apache-parser-v2 branch is created and locked down, but neither Travis nor AppVeyor is configured to run tests on it. See #7230. This PR fixes that problem.

This could probably just land in the apache-parser-v2 branch, but why unnecessarily deviate the branch from master? It doesn't hurt anything there. Once it lands, I'll get this added to the apache-parser-v2 branch too.

* Run tests on apache-parser-v2.

* add comment

* Don't run full test suite on apache-parser-v2.
2019-07-10 16:30:06 -07:00
Adrien Ferrand
74292a10f5 [Windows] Security model for files permissions - STEP 3e (#7182)
This PR implements the filesystem.copy_ownership_and_apply_mode method from #6497.

This method is used in two places in Certbot, replacing os.chown, to copy the owner and group from one file to another and apply the given POSIX mode to the latter.

* Implement copy_ownership_and_apply_mode

* Update certbot/compat/os.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Remove default values

* Rewrite a comment.

* Relaunch CI

* Pass as keyword arguments

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Make the private key permissions transfer platform specific

* Update certbot/compat/filesystem.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Rename variable

* Fix comment0

* Add unit test for copy_ownership_and_apply_mode

* Adapt coverage

* Execute unconditionally chmod with copy_ownership_and_apply_mode. Improve doc.
2019-07-10 16:26:30 -07:00
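A POSIX-only sketch of what a helper like the one described in #7182 might do (parameter names are illustrative; the real function lives in certbot.compat.filesystem and has Windows-specific behavior):

```
import os


def copy_ownership_and_apply_mode(src, dst, mode):
    """Copy owner and group from ``src`` to ``dst``, then chmod ``dst``."""
    stats = os.stat(src)
    # os.chown typically requires elevated privileges to change the owner.
    os.chown(dst, stats.st_uid, stats.st_gid)
    os.chmod(dst, mode)
```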
Brad Warren
74bf9ef46a Remove test symlink. (#7232) 2019-07-10 23:48:34 +02:00
Adrien Ferrand
2ac99fefe0 [Windows|Linux] Launch integration tests on Pebble without Docker (#7157)
This PR is a part of the actions necessary to make Certbot-CI work on Windows, in order to execute the integration tests on this platform.

Following #7156, this PR changes how the integration tests are set up against Pebble so they no longer need Docker.

As a reminder, one can check #7156 and letsencrypt/pebble#240 to see the rationale for why using Docker is a problem for running the integration tests on Windows.

Basically, this PR executes Pebble directly using its executable: since Pebble is built with Go, and Go produces self-contained executables, it can run without any installation on Linux and on Windows. During the integration test setup, Certbot-CI will fetch the Pebble (and Challtestsrv) executables for the defined target version from the GitHub releases. The binaries are persisted on the filesystem, so they do not need to be downloaded again on subsequent integration test runs. Nonetheless, we are talking about 20MB of executables.

Since the setup needs to hold state, I also took this occasion to refactor acme_server to use an object-oriented approach and improve readability/maintainability.

Once this PR and #7156 are merged, Docker will not be needed anymore for the main integration test use case, that is, using Pebble.

* Complete process

* Fix nginx cert path

* Check conditionnally docker

* Update gitignore, fix apacheconftest

* Full object

* Carriage return

* Move to official v2.1.0 of pebble

* Fix name

* Update acme_server.py

* Relaunch CI

* Update certbot-ci/certbot_integration_tests/utils/acme_server.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-ci/certbot_integration_tests/utils/acme_server.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update docstring

* Update documentation

* Configure a stdout to ACMEServer

* Map all process through defined stdout

* Remove unused variable

* Handle using signals

* Use failsafe entering context

* Remove failsafe rmtree, that is not needed anymore
2019-07-10 14:29:57 -07:00
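A rough sketch of the fetch-once-and-cache approach described in #7157 (the release asset name and cache path are assumptions for illustration, not certbot-ci's actual code):

```
import os
import stat
import urllib.request

PEBBLE_VERSION = "v2.1.0"  # version referenced in the commits above
# The asset name below is an assumption for illustration purposes.
ASSET_URL = ("https://github.com/letsencrypt/pebble/releases/download/"
             "{0}/pebble_linux-amd64".format(PEBBLE_VERSION))


def fetch_pebble(cache_dir="assets"):
    """Download the Pebble binary once and reuse it on later runs."""
    if not os.path.isdir(cache_dir):
        os.makedirs(cache_dir)
    path = os.path.join(cache_dir, "pebble_{0}".format(PEBBLE_VERSION))
    if not os.path.exists(path):
        urllib.request.urlretrieve(ASSET_URL, path)
        os.chmod(path, os.stat(path).st_mode | stat.S_IEXEC)
    return path
```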
Brad Warren
43f58ca803 Document pytest packaging problems. (#7226)
This is probably unlikely to come up again, but this documents that people should run our tests using setuptools rather than calling something like pytest directly. See https://opensource.eff.org/eff-open-source/pl/wdrky4uyzjguppgch3r7t7qjmc for more info.
2019-07-09 15:07:33 -07:00
Brad Warren
17f2cabbbf Replace broken link with archive link. (#7222) 2019-07-08 10:27:25 -07:00
Adrien Ferrand
7d61e9ea56 [Windows] Security model for files permissions - STEP 3d (#6968)
* Implement security.mkdir and security.makedirs

* Fix lint

* Correct mock

* Rename security into filesystem

* Update apache and nginx plugins requirements

* Update certbot/plugins/webroot.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Reenable pylint here

* Move code

* Reimplement mkdir

* Control errors on eexist, remove superfluous chmod for makedirs

* Add proper skip for windows only tests

* Fix lint

* Fix mypy

* Clean code

* Adapt coverage threshold on Linux with addition of LOC specific to Windows

* Add forbidden functions to tests

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Simplify code

* Sync _get_current_user with part3c

* Use the simplest implementation

* Remove exist_ok, simplify code.

* Simplify inline comment

* Update filesystem_test.py

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/plugins/webroot.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/plugins/webroot.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Add a test to check we set back os.mkdir correctly after filesystem.makedirs is called.

* Fix lint, adapt coverage
2019-07-03 16:20:43 -07:00
Brad Warren
20b595bc9e Simplify and deprecate viewing config changes (#7198)
* Remove apache and nginx from config_changes help

* Deprecate certbot_config changes.

* Document config_changes deprecation.

* Remove view_config_changes as IInstaller method.

* Remove view_config_changes from plugins.

* Add view_config_changes warnings.

* simplify test_config_changes_deprecation
2019-07-02 17:20:12 -07:00
Adrien Ferrand
88876b9901 [Windows] Security model for files permissions - STEP 3c (#6967)
* Implement security.open

* Clean lint

* Rename security into filesystem

* Update certbot/compat/filesystem.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/util.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/lock.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/lock.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Simplify and make more clear comment on os.open.

* Secure implementation preventing race conditions

* Revert "Secure implementation preventing race conditions"

This reverts commit dbb85492195122020ca0b4a685ddb4836fdc6d12.

* Simplify the logic on Windows.

* Implement os.open to prevent race conditions

* Add unit tests

* Handle os.O_CREAT and os.O_EXCL directly from the Windows APIs

* Improve comments

* Use CREATE_ALWAYS

* Adapt coverage threshold to new Windows specific LOCs.

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/compat/os.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/compat/filesystem.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Add some comments

* Fix pylint

* Improve docstring

* Added test cases

* Improve docstring

* Update certbot/lock.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot/lock.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Fix lint

* Adapt coverage

* Adapt coverage
2019-07-02 16:21:24 -07:00
Brad Warren
448d159223 Install Python3 only dev tools with tools/venv3.py (#7215)
These packages can be useful and I found that they aren't being installed in our Python 3 development environment. Let's fix that.
2019-07-02 13:45:57 -07:00
Brad Warren
3e872627d8 Pin/upgrade virtualenv in our tests (#7211)
* Update virtualenv to the latest version.

* Use venv from pip and pin more packages.

* Pin codecov.

* update appveyor config

* Write the path separator backwards.

* s/pip_install.py install/pip_install.py

* Prefix tools\\pip_install.py with python exe.

* Upgrade py to fix AppVeyor failures.

* add back comment

* Update virtualenv with CERTBOT_NO_PIN.

* Pass -U to upgrade tox and deps.

* Upgrade virtualenv.
2019-07-02 20:02:00 +03:00
Brad Warren
76b7eb0628 Document certbot-auto's code freeze. (#7207)
Inspired by #7194, this PR adds a note to our documentation that we're not accepting most changes to certbot-auto.
2019-06-28 15:53:56 -07:00
dkp
4fc30f2ecb Replace Some Platform-Specific Line Separation (#7203)
os.linesep isn't supposed to be used when writing to files opened in
text mode, where '\n' is translated to the platform-specific ASCII
sequence.  For example, on Windows, os.linesep is '\r\n' and in text
mode is translated to the ASCII sequence CR CR LF rather than just CR LF.
This is also true for the default logger and IDisplay notifications.

Replacing os.linesep with '\n' ensures the right sequence is written.

Resolves: 6899
2019-06-28 13:06:52 -07:00
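A small example of the translation issue described in #7203 (text mode on Windows):

```
import os

with open("example.txt", "w") as handle:      # text mode: newline translation on
    handle.write("first line\n")              # correct: '\n' becomes CR LF on Windows
    handle.write("second line" + os.linesep)  # wrong on Windows: becomes CR CR LF
```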
sydneyli
1c75b6dacd Update Nginx conf file to match Mozilla's security recommendations (#7163)
Fixes #7089
2019-06-28 12:16:51 -07:00
Joona Hoikkala
c08a4dec2d Refactor augeas_configurator.py functionality to configurator.py and parser.py accordingly. (#7181)
This pull request moves the functionality within `AugeasConfigurator` that previously existed as a parent class of `ApacheConfigurator` to `ApacheConfigurator` and `ApacheParser` accordingly.

Most of the methods were moved as-is, and one (`recovery_routine()`) was completely removed. A few of the methods had to be split between the configurator and parser; a good example of this is `save()`.

The Augeas object now lives completely within the `ApacheParser`.

* Remove augeasconfigurator

* Fix references

* Adjust tests accordingly

* Simplify test

* Address review comments

* Address review comments

* Move test_recovery_routine_reload
2019-06-28 08:39:13 -07:00
Brad Warren
4fc0ef0fbe certbot-plugin-gandi is not an installer. (#7201)
This [plugin](https://github.com/obynio/certbot-plugin-gandi) is an authenticator but not an installer. It's a DNS authenticator plugin.
2019-06-27 15:17:45 -07:00
Brad Warren
26a1eddd89 Remove plesk from the list of 3rd party plugins. (#7200)
Our link for the Plesk plugin goes to https://github.com/plesk/letsencrypt-plesk which refers you to https://ext.plesk.com/packages/f6847e61-33a7-4104-8dc9-d26a0183a8dd-letsencrypt and in their changelog for 2.0.0 it says "Replaced Python-based certbot with PHP-based client".
2019-06-27 15:17:31 -07:00
Brad Warren
1c6210ee00 Fix certbot config_changes (#7197)
* Remove for_logging parameter.

* Remove broken/unused --num parameter.

* update changelog
2019-06-26 17:46:51 -07:00
Brad Warren
a27f3ebd4f s/for for/for (#7196) 2019-06-26 17:24:04 -07:00
Brad Warren
a778b50403 Run le_auto_xenial on every PR. (#7195)
https://github.com/certbot/certbot/pull/7190/files removed our only le_auto_* tests on PRs. This PR fixes that by running le_auto_xenial on every PR which also includes running modification-check.py like we used to for Trusty.
2019-06-26 14:54:08 -07:00
Brad Warren
f2ab6a338c Remove files for old Docker image. (#7188) 2019-06-26 11:54:02 +02:00
Hunter
0d5bad6c8c dns-cloudflare: update URL for obtaining API keys (#7052)
Updated the ACCOUNT_URL in the Cloudflare-DNS plugin.
This uses the new "dash.cloudflare.com" scheme and future-proofs this URL for an upcoming change to Cloudflare API keys (this is not public yet, so no other changes related to this).
2019-06-25 17:53:31 -07:00
Brad Warren
dc0cfa21c9 Drop support for Ubuntu Trusty (#7190)
* Remove references and tests for Ubuntu Trusty.

* Mention dropped support in changelog.
2019-06-25 14:04:25 -07:00
Brad Warren
a37a4486cf Add Debian ARM AMI. (#7189)
Inspired by the number of ARM users we have (and because I want to rip out the only 32-bit test we have, which without this PR would remove all tests we have on non-x86_64 architectures), this PR adds an ARM image to the test farm tests. The image ID was taken from https://wiki.debian.org/Cloud/AmazonEC2Image/Stretch, you can see tests passing at https://travis-ci.com/certbot/certbot/builds/116857897, and I ran test_tests.sh locally and it passed.
2019-06-25 14:03:45 -07:00
Brad Warren
776e939a4c Drop support for quay.io. (#7187)
In the spirit of cleaning up some low-hanging cruft, this fixes #4343.

There are no (recent) release tags on quay.io and the builds are just following master. See https://quay.io/repository/letsencrypt/letsencrypt?tab=tags.

Once this lands, I can disable the automated builds on quay.io and we can delete Dockerfile-old and tools/docker-warning.sh.
2019-06-25 11:05:28 -07:00
Brad Warren
69cf64079c Mention dropped support in changelog. 2019-06-25 10:18:39 -07:00
Brad Warren
9962cf0b8e Upgrade compatibility tests to stretch. (#7185)
Inspired by #7180, there's no reason for these tests to be running on old stable. This upgrades them to the latest stable version of Debian.

You can see tests passing with these changes at https://travis-ci.com/certbot/certbot/builds/116844923.
2019-06-25 10:13:57 -07:00
Brad Warren
4c95b687ae Remove references and tests for Ubuntu Trusty. 2019-06-25 10:10:14 -07:00
Brad Warren
a3bbdd52e7 Improve issue closing behavior. (#7178) 2019-06-24 16:39:45 -07:00
Siilike
2e3c1d7c77 Add reference to the Standalone DNS Authenticator (#7137)
Updated documentation to add a reference to the Standalone DNS Authenticator, https://github.com/siilike/certbot-dns-standalone
2019-06-24 12:47:50 -07:00
Adrien Ferrand
249af5c4cd Fix integration tests with Pebble v2.1.0 + (#7175)
Since Pebble v2.1.0, new checks on ACME spec compliance have been added to Pebble when strict mode is enabled. These checks are described here: letsencrypt/pebble@3a2ce1c

Currently Certbot is not compliant enough to pass these new checks. One part of the work to do is described here: #7171

As a consequence, our CI is currently broken, both on PR builds and nightly builds.

This PR disables the strict mode during integration tests, fixing our CI temporarily. This will give us some time to fix these deviations, and add back the strict mode in a future PR once that work is merged.

* Remove -strict mode on Pebble for now.

* Refer to relevant Certbot PR

* Clean code
2019-06-24 12:03:24 -07:00
Adrien Ferrand
9a60f6df78 Fix codecov quality gate since flags have been removed (#7173)
Because some users were complaining about stale workflows when flags (https://docs.codecov.io/docs/flags) are enabled, Codecov decided to stop taking them into account when calculating the coverage on branches until they improve this functionality.

See: https://docs.codecov.io/docs/flags#section-flags-in-the-codecov-ui

The flags are still taken into account on PR builds, but not on the base branch.

This is a problem for us, because we use the flags to compare the coverage of a PR against its base branch separately for Windows on one side and Linux on the other. Without flags taken into account on the base branch, the CI fails because the coverage on Windows is too low.

As a temporary fix until the situation is clarified on Codecov's side, this PR replaces the validation condition, which was a comparison against the base branch, with a fixed coverage target registered in the local .codecov.yml file in the Certbot repository.

This way, the coverage on PR builds, which does take the flags into account, is validated against an appropriate value.

This is a temporary solution that will require explicit updates of .codecov.yml in the meantime if the coverage significantly increases or decreases with some changes. But until the situation is fixed, this will give us a functional quality gate.
2019-06-21 12:00:03 -07:00
Adrien Ferrand
e9bcaaa576 [Windows] Security model for files permissions - STEP 3a (#6964)
This PR implements the filesystem.chmod method from #6497.

* Implement filesystem.chmod

* Conditionally add pywin32 on setuptools versions that support environment markers.

* Update apache plugin requirements

* Use a try/except import approach similar to lock

* Add comments about well-known SIDs

* Add main command

* Call filesystem.chmod in tests, remove one test

* Add test for os module

* Update environment marker

* Ensure we are not building wheels using an old version of setuptools

* Added a link to list of NTFS rights

* Simplify sid comparison

* Enable coverage

* Sometimes, double-quote is the solution

* Add entrypoint

* Add unit tests to filesystem

* Resolve recursively the link, add doc

* Move imports to the top of the file

* Remove string conversion of the ACL, fix setup

* Ensure admins have all permissions

* Simplify dacl comparison

* Conditionally raise for windows temporary workaround

* Add a test to check filesystem.chown is protected against symlink loops
2019-06-20 10:52:43 -07:00
Brad Warren
5078b58de9 Upgrade to the latest macOS image (#7167)
This fixes the test failures we saw last night at https://travis-ci.com/certbot/certbot/builds/116073070.

The problem is that the Homebrew installation included in the Travis image is outdated and when it tries to install packages, it fails. You can see this at https://travis-ci.com/certbot/certbot/jobs/209185570#L83. There is a thread in Travis' community forum about this at https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.

To fix this, we could either upgrade Homebrew, which can be a slow process according to both Travis and the original poster of the issue, or upgrade to a newer version of macOS. I chose the latter to avoid the speed problems and picked the latest version available.

You can see tests passing with these changes at https://travis-ci.com/certbot/certbot/builds/116186095.
2019-06-19 14:09:30 -07:00
schoen
03cf5d15a6 Merge pull request #6894 from suqld/fix-google-dns-private-zones
Detect private DNS zones in Google and skip them
2019-06-19 13:21:22 -07:00
David Drobner
8efe3fb19a RFC8555 Messages (#7131)
Add new error types and descriptions from RFC 8555 to acme (#7116)
2019-06-18 17:29:53 -07:00
Brad Warren
9863c2d18e Update Ubuntu 18.04 AMI to fix blocking on input (#7166) 2019-06-18 12:07:45 -07:00
timwsuqld
6172821d90 Merge branch 'master' into fix-google-dns-private-zones 2019-06-18 14:04:21 +10:00
Brad Warren
dde16df778 Fixes #3400. (#7162)
The person who wrote this code no longer works on Certbot and regardless of
what the intended behavior was, let's document the actual behavior.
2019-06-17 15:56:06 -07:00
Adrien Ferrand
1df778859b [Windows|Linux] Use builtin Python proxy capabilities for Certbot-CI (#7156)
This PR is a part of the actions necessary to make Certbot-CI work on Windows, in order to execute the integration tests on this platform.

I initially used the fully-fledged HTTP proxy [Traefik](https://docs.traefik.io/) to distribute HTTP challenges among several pytest nodes, and so parallelize the integration tests. Traefik is overkill for this purpose: we just want to redirect the ACME server to a pytest node depending on the `Host` header, yet we were using a production-grade HTTP proxy for that.

However, this was not a problem on Linux, as long as Docker is available, because the Traefik instance is deployed through it.

But it becomes a problem on Windows, where Docker is not available everywhere, is quite demanding to set up, and is limited by the implemented network drivers. See my comments here https://github.com/letsencrypt/pebble/pull/240 for more details.

Fortunately, Python ships with everything needed to implement a simple HTTP proxy, with exactly what we need for the parallelization of integration tests.

This PR implements this kind of HTTP proxy and removes the coupling to Traefik.

This PR has been tested successfully with integration tests on Pebble under Linux for Python 2.x and Python 3.x, and the proxy alone has also been tested successfully on Windows (no integration tests can be run for now on this platform).

* Create a python proxy

* Refactor proxy config

* Working logic

* Resolve from the path

* Give proxy process to the ACMEServer context manager
2019-06-14 16:28:14 -07:00
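A minimal standard-library sketch of the kind of Host-header-based proxy described in #7156 (hostnames, ports, and class names are hypothetical; only GET forwarding is shown):

```
import http.client
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical mapping from the Host header to the local port of a pytest node.
HOST_TO_PORT = {"node1.example.test": 5001, "node2.example.test": 5002}


class HostHeaderProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        host = self.headers.get("Host", "").split(":")[0]
        port = HOST_TO_PORT.get(host)
        if port is None:
            self.send_error(502, "No backend registered for host %r" % host)
            return
        # Forward the request to the selected node and relay its response.
        backend = http.client.HTTPConnection("127.0.0.1", port)
        backend.request("GET", self.path, headers=dict(self.headers))
        response = backend.getresponse()
        body = response.read()
        self.send_response(response.status)
        for name, value in response.getheaders():
            if name.lower() not in ("transfer-encoding", "connection"):
                self.send_header(name, value)
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), HostHeaderProxy).serve_forever()
```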
Brad Warren
20ca47dec6 Bump stale threshold to 1 year. (#7149)
While I expect stale bot will close out 150 - 250 issues, that'll still leave us with 400+ open issues. My concern is that with a threshold of 6 months, most of these 400 issues will be in the same state 6 months from now and stale bot will annoy people by asking them if their issue is still valid too frequently.

Doubling the stale threshold to 1 year should mitigate this problem a bit I think.
2019-06-14 15:51:15 -07:00
sydneyli
6c53f5d8ed Turn off session tickets for versions of Nginx that support it (#7092)
* Turn off session tickets for versions of Nginx that support it

In line with Mozilla's security recommendations.

* Changelog.

* Set version before installing config files

* lint: remove unused import

* windows testfix

* another windows testfix?

* Testing path of updating src file with old nginx

* Fix windows, and make config update tests fail if update doesn't happen
2019-06-14 13:44:50 -07:00
Brad Warren
add90cef32 Tell people they can add their name to AUTHORS.md. (#7155) 2019-06-14 00:38:39 +02:00
Adrien Ferrand
1b54c74621 Remove the remaining integration tests bash scripts (#7153)
Since #7073 for Certbot and letsencrypt/boulder@3918714 for Boulder have landed, the bash scripts that remained after certbot-ci are not useful anymore outside of Certbot.

The only remaining place is the apacheconftest-with-pebble tox target, which leverages the pebble-fetch.py script to expose a running ACME server to the apache-conf-test script.

This PR refactors apacheconftest-with-pebble to use certbot-ci instead. Finally, this PR removes the remaining integration test scripts, which are _common.sh, boulder-fetch.py and pebble-fetch.py.

* Disconnect common and boulder-fetch

* Prepare reconnection of apacheconftest to new pebble deployment logic

* Finish the configuration for apacheconftest

* Add executable flag to python script

* Fix shebang

* Delete pebble-fetch.sh
2019-06-13 14:09:09 -07:00
Adrien Ferrand
e60651057e Add a branch in acme_server to properly clean the boulder workspace (#7154)
Currently integration tests against Boulder fail during nightly tests. See https://travis-ci.com/certbot/certbot/builds/115373954.

This is due to a failure to clean up the workspace associated with the Boulder Docker container started during the integration tests. Indeed, this container compiles several artifacts whose owner is root and whose permissions are 0744. These files are persisted in the workspace folder attached to the container.

Since tox is run as a non-root user (but this user still has access to the Docker daemon), everything works fine until the end of the test suite, when all resources are cleaned up. At this point, pytest raises a PermissionError when failing to delete these artifacts, returns with a non-zero exit code, and so fails the build.

Since this situation could happen outside of the CI, I made the appropriate corrections to allow the integration tests to be run as a non-root user, instead of changing Travis to execute the tests as root.

The correction is to add a step to the cleanup process: the deletion of these artifacts through an ad-hoc Docker instance.
2019-06-13 13:27:06 -07:00
Adrien Ferrand
e394889864 Add executable scripts to start certbot and acme server in certbot-ci (#7073)
During review of #6989, we saw that some of our test bash scripts were still used, in the Boulder project in particular. This concerns `tests/integration/_common.sh` specifically, which exposes the `certbot_test` bash function, an appropriate way to execute a local version of certbot in test mode: define a custom server, remove several checks, enable full logging and so on.

This PR is an attempt to achieve this goal: exposing a new `certbot_test` executable for test purposes. More generally, this PR is about providing well-suited scripts to quickly run manual tests against certbot without launching the full automated pytest suite.

The idea here is to leverage the existing logic in certbot-ci and expose it as executable scripts. This is done thanks to the `console_scripts` entry of the setuptools entry point feature, which installs scripts on the `PATH` when `pip install` is invoked, delegating to specific functions in the installed packages.

Two scripts are defined this way:
* `certbot_test`: executes certbot in test mode in a very similar way to the original `certbot_test` in `_common.sh`, by delegating to `certbot_integration_tests.utils.certbot_call:main`. By default this execution targets the directory URL of a locally started Pebble instance. The URL, as well as the http-01/tls-alpn-01 challenge ports, can be configured using ad-hoc environment variables. All arguments passed to `certbot_test` are transferred to the underlying certbot command.
* `acme_server`: sets up a fully running instance of an ACME server, ready for tests (in particular, all FQDNs resolve to localhost in order to target a locally running `certbot_test` command), by delegating to `certbot_integration_tests.utils.acme_server:main`. The choice of the ACME server is given by the first parameter passed to `acme_server`; it can be `pebble`, `boulder-v1` or `boulder-v2`. The command keeps running in the foreground, displaying the logs of the ACME server on stdout/stderr. The server is shut down and its resources are cleaned up upon pressing CTRL+C.

These two commands can also be run through the underlying Python modules, which are executable.

Finally, a typical workflow on certbot side to run manual tests would be:
```
cd certbot
tools/venv.py
source venv/bin/activate
acme_server pebble &
certbot_test certonly --standalone -d test.example.com
```

On boulder side it could be:
```
# Follow certbot dev environment setup instructions, then ...
cd boulder
docker-compose run --use-aliases -e FAKE_DNS=172.17.0.1 --service-ports boulder ./start.py
SERVER=http://localhost:4001/directory certbot_test certonly --standalone -d test.example.com
```

* Configure certbot-ci to expose a certbot_test console script calling certbot in test mode against a local pebble instance

* Add a command to start pebble/boulder

* Use explicit start

* Add execution permission to acme_server

* Add a docstring to certbot_test function

* Change executable name

* Increase sleep to 3600s

* Implement a context manager to handle the acme server

* Add certbot_test workspace in .gitignore

* Add documentation

* Remove one function in context, split logic of certbot_test towards capturing non capturing

* Use an explicit and properly configured ACMEServer as handler.

* Add doc. Put constants.
2019-06-12 17:19:23 -07:00
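A sketch of how the two console scripts described in #7073 can be declared with setuptools entry points (other setup.py details are omitted; only the script names and target functions come from the description above):

```
from setuptools import find_packages, setup

setup(
    name="certbot-ci",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # Script names and target functions as given in the PR description.
            "certbot_test = certbot_integration_tests.utils.certbot_call:main",
            "acme_server = certbot_integration_tests.utils.acme_server:main",
        ],
    },
)
```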
Adrien Ferrand
d75908c645 [Windows] Security model for files permissions - STEP 3b (#6965)
* Modifications for misc

* Add some types

* Use os_rename

* Move rename into filesystem

* Use our os package

* Rename filesystem.rename to filesystem.replace

* Disable globally function redefined lint in os module
2019-06-11 17:08:48 -07:00
Adrien Ferrand
72e5d89e95 Move nginx_compat to nightly (#7001)
With the various optimizations already done and upcoming (certbot-ci), the execution time of the integration tests has decreased significantly, potentially allowing a complete Travis PR job to finish within 5min30s.

However, one job is significantly longer than the other ones after this migration: nginx_compat, which takes more than 11min to finish. I tried to split nginx_compat in terms of tested configurations and of tests to execute (auth, install, enhance). Neither approach is satisfactory:

* splitting by configuration may work, but adds significant complexity to the tests
* splitting by test type is supported almost out of the box, but fails to produce two fast jobs (see https://travis-ci.org/adferrand/certbot/builds/525892885?utm_source=github_status&utm_medium=notification for instance)

Since these tests are designed to check corner cases in the nginx parser, it is mostly useless to execute them on each PR, as the nginx parser is rarely updated.

After some discussion with @bmw, I think that we can just move nginx_compat from the PR tests to the nightly tests. This PR does that.
2019-06-11 14:54:36 -07:00
Brad Warren
0c5f526f8b Remove the Postfix plugin (#7097)
* Remove the postfix plugin.

* Remove references to postfix plugin in code.

* Remove reference to postfix plugin in docs.
2019-06-11 23:41:25 +02:00
Brad Warren
5385375571 Remove list of modified packages from changelog. (#7146) 2019-06-11 14:02:54 -07:00
Brad Warren
7b4201fbdb Merge pull request #7147 from certbot/candidate-0.35.1
Release 0.35.1
2019-06-11 12:39:09 -07:00
Erica Portnoy
8106f74dc0 Merge branch 'master' into candidate-0.35.1 2019-06-11 12:21:17 -07:00
Erica Portnoy
3bceae4a89 Bump version to 0.36.0 2019-06-10 15:25:16 -07:00
Erica Portnoy
f18143b117 Add contents to CHANGELOG.md for next version 2019-06-10 15:25:15 -07:00
Erica Portnoy
0cc56677e2 Release 0.35.1 2019-06-10 15:25:09 -07:00
Erica Portnoy
6334d065cf Update changelog for 0.35.1 release 2019-06-10 15:02:09 -07:00
Brad Warren
c3edc25fb7 Fix dns rfc2136 (#7142) (#7143)
* Revert "Add an option to dns_rfc2136 plugin to specify an authorative base domain. (#7029)"

This reverts commit 5ab6a597b0.

* Update changelog.

(cherry picked from commit 23b52ca1c8)
2019-06-10 14:12:59 -07:00
Brad Warren
23b52ca1c8 Fix dns rfc2136 (#7142)
* Revert "Add an option to dns_rfc2136 plugin to specify an authorative base domain. (#7029)"

This reverts commit 5ab6a597b0.

* Update changelog.
2019-06-10 13:56:57 -07:00
Brad Warren
02cf051e45 List Certbot package given #7127. (#7132) 2019-06-07 14:35:08 -07:00
Brad Warren
4d034122c6 Ask for updates, the issue isn't stale. (#7133)
This PR attempts to improve the behavior our "stale" bot by asking for updates instead of telling people that their issue is stale.
2019-06-07 14:34:40 -07:00
Brad Warren
391f742df7 List Certbot package given #7127. 2019-06-07 13:56:38 -07:00
Rob Stradling
5c663d4d97 Update the 'manage your account' help to be more generic. (#7127)
Resolves #7121.

* Update the 'manage your account' help to be more generic.

* Add changelog entry about #7127.
2019-06-07 13:03:35 -07:00
Brad Warren
89d907b182 Improve Apache error message when run with insufficient privileges (#7129)
* fixes #6369

* Add changelog entry.

* Improve error message again.
2019-06-07 19:57:21 +02:00
Brad Warren
d0f1a3e205 Merge pull request #7124 from certbot/candidate-0.35.0
Candidate 0.35.0
2019-06-05 14:51:17 -07:00
Erica Portnoy
f3b73c4d2a Bump version to 0.36.0 2019-06-05 14:00:54 -07:00
Erica Portnoy
f25a9b2004 Add contents to CHANGELOG.md for next version 2019-06-05 14:00:54 -07:00
Erica Portnoy
3568070c73 Release 0.35.0 2019-06-05 14:00:46 -07:00
Erica Portnoy
8e92577cb0 Update changelog for 0.35.0 release 2019-06-05 13:39:05 -07:00
Brad Warren
459ba89aef Add changelog entry about #7054. (#7122)
* Add changelog entry about #7054.

* Fix typo noticed by cpu

Co-Authored-By: Daniel McCarney <daniel@binaryparadox.net>
2019-06-04 14:17:49 -07:00
Adrien Ferrand
bfd1ce97ef Add Adrien Ferrand to the authors list (#7119) 2019-06-04 11:37:54 -07:00
Thue
419ad7df1e Fix typo cerbot->certbot. (#7118) 2019-06-04 14:46:40 +02:00
Adrien Ferrand
889aeb31df Validate OCSP responses in case an explicit responder is designated (#7054)
* Validate OCSP response for responders that are not the certificate's issuer.

* Improve OCSP tests using a issuer/responder pair for OCSP responses

* Clean code

* Update ocsp_test.py

* Add various comments

* Add several cases of ocsp responder. More factories for the resilience tests.

* Update ocsp_test.py
2019-06-03 22:55:26 +03:00
Brad Warren
09b7d2f461 Configure the stale bot (#7108)
* Configure the stale bot.

* Add top level comment.

* except assignees

* Give warning about closing issues.
2019-06-03 10:25:23 -07:00
Brad Warren
18797dca79 Remove scripts that are never run. (#7111)
* Remove scripts that are never run.

* Update example in multitester.py docstring.
2019-06-03 10:20:20 +03:00
Brad Warren
31e81e7ae0 Add explanation of the purpose of test_tests.sh. (#7112)
This is one of the two action items from the conversation at https://opensource.eff.org/eff-open-source/pl/rno49hd6q7ba7dr18ph11njc6o.

Just to make sure I didn't make a typo, I ran this script with these changes and the tests still pass.
2019-05-31 18:09:17 -07:00
Brad Warren
4b06eeae64 Update Fedora AMI (#7102)
Fixes #6955.

This updates the Fedora version used in our test farm tests to Fedora 30. The AMI ID comes from https://alt.fedoraproject.org/cloud/ where it is listed as their standard HVM AMI for the region we use, us-east-1 (US East (N. Virginia)).

Unfortunately, there were a lot of small changes required for this. The big reason is that on Fedora, there isn't a Python 2 executable installed. In fact, there's not even an executable named python. It's just python3. Rather than installing another Python in each test, I wrote a script that the test scripts can share to figure out the different paths and names that should be used in their script. (This isn't used in test_sdists.sh because the logic is a little different.)

Other changes here worth flagging are:

I changed the name of the variable RUN_PYTHON3_TESTS in test_leauto_upgrades.sh to RUN_RHEL6_TESTS. The tests that are run when this variable is set test the upgrade from Python 2 to Python 3 on RHEL 6. I think this new name is much better now that we also have Fedora running Python 3.
I made tools/simple_http_server.py work on Python 3.
You can see tests passing with these changes at https://travis-ci.com/certbot/certbot/builds/113821476. I also ran test_tests.sh and they passed.

* Update to Fedora 30 in test farm tests.

Fedora 28 is likely to reach its EOL soon.

* Add set_python_envvars.sh.

* Fix test_apache2.sh on python3 only distros.

* Fix test_leauto_upgrades.sh on python3 systems.

* Fix certonly_standalone tests with python3 only

* Fix test_sdists.sh on python3 only distros.

* Make simple_http_server.py work on Python 3.

* add comments
2019-05-31 18:08:52 -07:00
Felix Lechner
641aba68b1 Ignore editor backups when running hooks. (#7109)
* Ignore editor backups when running hooks.

When processing hooks, certbot also runs editor backup files even though
such files are outdated, clearly warrant correction, and may quite
possibly be defective.

That behavior could lead to unexpected breakage, and perhaps even pose
security risks---for example, if a previous script was careless with
file permissions. As an aggravating factor, the backup runs after the
corrected version and could unintentionally override a fix the user
thought was properly implemented.

This commit causes editor backup files ending in tilde (~) to be
excluded when running hooks.

Additional information can be found here:

https://github.com/certbot/certbot/issues/7107
https://community.letsencrypt.org/t/editor-backup-files-executed-as-renewal-hooks/94750

* Add unit test for hook scripts with filenames ending in tilde.

* Provide changelog entry for not running hook scripts ending in tilde.

* Add Felix Lechner to the list of contributors.
2019-05-30 15:02:15 -07:00
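An illustrative sketch of skipping tilde-suffixed editor backups when collecting hook files, as described in #7109 (the helper name is hypothetical, not Certbot's actual code):

```
import os


def list_hooks(dir_path):
    """Return hook file paths in ``dir_path``, ignoring editor backups ("~")."""
    names = sorted(os.listdir(dir_path))
    paths = (os.path.join(dir_path, name) for name in names)
    return [path for path in paths
            if os.path.isfile(path) and not path.endswith("~")]
```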
Adrien Ferrand
926c8c198c Remove dependency on acme in certbot-ci (#7055)
Following discussion in #6947 (comment), I have second thoughts about relying on acme in certbot-ci.

Indeed, I think it is good design not to rely, in tests, on the code you are testing. Obviously in unit tests this is very difficult, since most of the time the unit being tested needs input generated by other parts of the code. However, it is not really a problem in a unit test, as its purpose is to make assertions about a specific portion of the code, not the other parts.

In the scope of integration tests, the software under test is treated as a black box. In this case, having parts of the test logic that in fact use code inside that black box increases the risk that some assertions compare two results coming from the same flawed logic in the tested software.

Since using acme in certbot-ci only saves a few lines of code, I think it is not worth the risk and the added complexity of declaring acme as a dependency. I prefer to duplicate these lines and keep certbot-ci free of any dependency coming from the certbot project.
2019-05-30 07:09:09 -07:00
Pete Cooper
4c299be965 Update docs/cli-help.txt -- typo and formatting (#7105)
* Update docs/cli-help.txt -- typo and formatting

'areusing' -> 'are using'

* Update cli.py -- formatting

See https://github.com/certbot/certbot/pull/7105

Addresses https://github.com/certbot/certbot/pull/7105#issuecomment-497079342
2019-05-29 14:16:16 -07:00
Brad Warren
561534b754 Move IRC notifications to #certbot-devel. (#7098)
* Move IRC notifications to #certbot-devel.

* Don't use notice.
2019-05-29 09:54:26 +03:00
Adrien Ferrand
7d35f95293 Avoid to delete both webroot_map and webroot_path (#7095)
* Always restore webroot_path in renewal config.

* Add unit tests to ensure correct behavior

* Add changelog

* Add certbot as modified package
2019-05-28 15:16:12 -07:00
Brad Warren
d2a2b88090 Update Ubuntu AMI to 19.04. (#7099) 2019-05-28 23:36:10 +02:00
Brad Warren
bf818036eb Revert "Fix unpinned dependencies tests towards botocore and urllib3 (#7081)" (#7101)
This reverts commit 51a7e7cd19.
2019-05-25 00:20:54 +02:00
Tim White
352218510a Update changelog 2019-04-26 09:34:17 +10:00
Tim White
463d089407 Detect private DNS zones in Google and skip them until we get to a public zone 2019-04-26 09:31:39 +10:00
236 changed files with 4130 additions and 4280 deletions


@@ -4,11 +4,15 @@ coverage:
default: off
linux:
flags: linux
target: auto
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.5
threshold: 0.1
base: auto
windows:
flags: windows
target: auto
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.6
threshold: 0.1
base: auto

.github/stale.yml

@@ -0,0 +1,35 @@
# Configuration for https://github.com/marketplace/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30
# Ignore issues with an assignee (defaults to false)
exemptAssignees: true
# Label to use when marking as stale
staleLabel: needs-update
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
We've made a lot of changes to Certbot since this issue was opened. If you
still have this issue with an up-to-date version of Certbot, can you please
add a comment letting us know? This helps us to better see what issues are
still affecting our users. If there is no activity in the next 30 days, this
issue will be automatically closed.
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been closed due to lack of activity, but if you think it
should be reopened, please open a new issue with a link to this one and we'll
take a look.
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1
# Don't mark pull requests as stale.
only: issues

5
.gitignore vendored
View File

@@ -44,3 +44,8 @@ tests/letstest/venv/
# docker files
.docker
# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*

View File

@@ -16,6 +16,9 @@ before_script:
# is a cap on the number of simultaneous runs.
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
@@ -24,9 +27,10 @@ branches:
not-on-master: &not-on-master
if: NOT (type = push AND branch = master)
# Jobs for the extended test suite are executed for cron jobs and pushes on non-master branches.
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
if: type = cron OR (type = push AND branch != master)
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
matrix:
include:
@@ -41,12 +45,6 @@ matrix:
- python: "2.7"
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
<<: *not-on-master
- python: "2.7"
env: TOXENV=lint
<<: *not-on-master
@@ -57,7 +55,11 @@ matrix:
env: TOXENV=mypy
<<: *not-on-master
- python: "2.7"
env: TOXENV='py27-{acme,apache,certbot,dns,nginx,postfix}-oldest'
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
sudo: required
services: docker
<<: *not-on-master
@@ -79,10 +81,8 @@ matrix:
addons:
<<: *not-on-master
- sudo: required
env: TOXENV=le_auto_trusty
env: TOXENV=le_auto_xenial
services: docker
before_install:
addons:
<<: *not-on-master
- python: "2.7"
env: TOXENV=apacheconftest-with-pebble
@@ -94,6 +94,12 @@ matrix:
<<: *not-on-master
# Extended test suite on cron jobs and pushes to tested branches other than master
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-apache2
@@ -130,12 +136,6 @@ matrix:
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: TOXENV=py27-certbot-oldest
<<: *extended-test-suite
- python: "2.7"
env: TOXENV=py27-nginx-oldest
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
sudo: required
@@ -211,10 +211,6 @@ matrix:
sudo: required
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_xenial
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_jessie
services: docker
@@ -234,6 +230,9 @@ matrix:
- language: generic
env: TOXENV=py27
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
@@ -243,6 +242,9 @@ matrix:
- language: generic
env: TOXENV=py3
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
@@ -257,7 +259,6 @@ addons:
apt:
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- python-virtualenv
- gcc
- libaugeas0
- libssl-dev
@@ -267,7 +268,12 @@ addons:
- nginx-light
- openssl
install: "$(command -v pip || command -v pip3) install codecov tox"
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: "tools/pip_install.py -U codecov tox virtualenv"
script: tox
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
@@ -276,8 +282,11 @@ notifications:
email: false
irc:
channels:
- secure: "SGWZl3ownKx9xKVV2VnGt7DqkTmutJ89oJV9tjKhSs84kLijU6EYdPnllqISpfHMTxXflNZuxtGo0wTDYHXBuZL47w1O32W6nzuXdra5zC+i4sYQwYULUsyfOv9gJX8zWAULiK0Z3r0oho45U+FR5ZN6TPCidi8/eGU+EEPwaAw="
# This is set to a secure variable to prevent forks from sending
# notifications. This value was created by installing
# https://github.com/travis-ci/travis.rb and running
# `travis encrypt "chat.freenode.net#certbot-devel"`.
- secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
on_cancel: never
on_success: never
on_failure: always
use_notice: true

View File

@@ -5,6 +5,7 @@ Authors
* Aaron Zuehlke
* Ada Lovelace
* [Adam Woodbeck](https://github.com/awoodbeck)
* [Adrien Ferrand](https://github.com/adferrand)
* [Aidin Gharibnavaz](https://github.com/aidin36)
* [AJ ONeal](https://github.com/coolaj86)
* [Alcaro](https://github.com/Alcaro)
@@ -14,6 +15,7 @@ Authors
* [Alex Gaynor](https://github.com/alex)
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [Andrew Murray](https://github.com/radarhere)
* [Anselm Levskaya](https://github.com/levskaya)
@@ -75,6 +77,7 @@ Authors
* [Fabian](https://github.com/faerbit)
* [Faidon Liambotis](https://github.com/paravoid)
* [Fan Jiang](https://github.com/tcz001)
* [Felix Lechner](https://github.com/lechner)
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)

View File

@@ -2,13 +2,12 @@
Certbot adheres to [Semantic Versioning](https://semver.org/).
## 0.35.0 - master
## 0.37.0 - master
### Added
* dns_rfc2136 plugin now supports explicitly specifying an authoritative
base domain for cases when the automatic method does not work (e.g.
split-horizon DNS)
* Turn off session tickets for apache plugin by default
* acme: Authz deactivation added to `acme` module.
### Changed
@@ -18,6 +17,43 @@ Certbot adheres to [Semantic Versioning](https://semver.org/).
*
More details about these changes can be found on our GitHub repo.
## 0.36.0 - 2019-07-11
### Added
* Turn off session tickets for nginx plugin by default
* Added missing error types from RFC8555 to acme
### Changed
* Support for Ubuntu 14.04 Trusty has been removed.
* Update the 'manage your account' help to be more generic.
* The error message when Certbot's Apache plugin is unable to modify your
Apache configuration has been improved.
* Certbot's config_changes subcommand has been deprecated and will be
removed in a future release.
* `certbot config_changes` no longer accepts a --num parameter.
* The functions `certbot.plugins.common.Installer.view_config_changes` and
`certbot.reverter.Reverter.view_config_changes` have been deprecated and will
be removed in a future release.
### Fixed
* Replace some unnecessary platform-specific line separation.
More details about these changes can be found on our GitHub repo.
## 0.35.1 - 2019-06-10
### Fixed
* Support for specifying an authoritative base domain in our dns-rfc2136 plugin
has been removed. This feature was added in our last release but had a bug
which caused the plugin to fail so the feature has been removed until it can
be added properly.
Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
package with changes other than its version number was:
@@ -26,6 +62,37 @@ package with changes other than its version number was:
More details about these changes can be found on our GitHub repo.
## 0.35.0 - 2019-06-05
### Added
* dns_rfc2136 plugin now supports explicitly specifying an authoritative
base domain for cases when the automatic method does not work (e.g.
split-horizon DNS)
### Changed
*
### Fixed
* Renewal parameter `webroot_path` is always saved, avoiding some regressions
when `webroot` authenticator plugin is invoked with no challenge to perform.
* Certbot now accepts OCSP responses when an explicit authorized
responder, different from the issuer, is used to sign OCSP
responses.
* Scripts in Certbot hook directories are no longer executed when their
filenames end in a tilde.
Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being; however, the only
package with changes other than its version number was:
* certbot
* certbot-dns-rfc2136
More details about these changes can be found on our GitHub repo.
## 0.34.2 - 2019-05-07
### Fixed
@@ -68,6 +135,10 @@ More details about these changes can be found on our GitHub repo.
`malformed` error to be received from the ACME server.
* Linode DNS plugin now supports api keys created from their new panel
at [cloud.linode.com](https://cloud.linode.com)
### Fixed
* Fixed Google DNS Challenge issues when private zones exist
* Adding a warning noting that future versions of Certbot will automatically configure the
webserver so that all requests redirect to secure HTTPS access. You can control this
behavior and disable this warning with the --redirect and --no-redirect flags.

View File

@@ -1,35 +0,0 @@
FROM python:2-alpine3.9
ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
COPY CHANGELOG.md README.rst setup.py src/
# Generate constraints file to pin dependency versions
COPY letsencrypt-auto-source/pieces/dependency-requirements.txt .
COPY tools /opt/certbot/tools
RUN sh -c 'cat dependency-requirements.txt | /opt/certbot/tools/strip_hashes.py > unhashed_requirements.txt'
RUN sh -c 'cat tools/dev_constraints.txt unhashed_requirements.txt | /opt/certbot/tools/merge_requirements.py > docker_constraints.txt'
COPY acme src/acme
COPY certbot src/certbot
RUN apk add --no-cache --virtual .certbot-deps \
libffi \
libssl1.1 \
openssl \
ca-certificates \
binutils
RUN apk add --no-cache --virtual .build-deps \
gcc \
linux-headers \
openssl-dev \
musl-dev \
libffi-dev \
&& pip install -r /opt/certbot/dependency-requirements.txt \
&& pip install --no-cache-dir --no-deps \
--editable /opt/certbot/src/acme \
--editable /opt/certbot/src \
&& apk del .build-deps

View File

@@ -1,5 +1,5 @@
# This Dockerfile builds an image for development.
FROM ubuntu:xenial
FROM debian:buster
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443

View File

@@ -1,75 +0,0 @@
# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
# it is more likely developers will already have ubuntu:trusty rather
# than e.g. debian:jessie and image size differences are negligible
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt
WORKDIR /opt/certbot
# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.
ENV DEBIAN_FRONTEND=noninteractive
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible
COPY setup.py README.rst CHANGELOG.md MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/
# all above files are necessary for setup.py and venv setup, however,
# package source code directory has to be copied separately to a
# subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory
COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
RUN VIRTUALENV_NO_DOWNLOAD=1 virtualenv --no-site-packages -p python2 /opt/certbot/venv
# PATH is set now so pipstrap upgrades the correct (v)env
ENV PATH /opt/certbot/venv/bin:$PATH
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
/opt/certbot/venv/bin/pip install \
-e /opt/certbot/src/acme \
-e /opt/certbot/src \
-e /opt/certbot/src/certbot-apache \
-e /opt/certbot/src/certbot-nginx
# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.
# set up certbot/letsencrypt wrapper to warn people about Dockerfile changes
COPY tools/docker-warning.sh /opt/certbot/bin/certbot
RUN ln -s /opt/certbot/bin/certbot /opt/certbot/bin/letsencrypt
ENV PATH /opt/certbot/bin:$PATH
ENTRYPOINT [ "certbot" ]

View File

@@ -123,6 +123,21 @@ class ClientBase(object): # pylint: disable=too-many-instance-attributes
"""
return self.update_registration(regr, update={'status': 'deactivated'})
def deactivate_authorization(self, authzr):
# type: (messages.AuthorizationResource) -> messages.AuthorizationResource
"""Deactivate authorization.
:param messages.AuthorizationResource authzr: The Authorization resource
to be deactivated.
:returns: The Authorization resource that was deactivated.
:rtype: `.AuthorizationResource`
"""
body = messages.UpdateAuthorization(status='deactivated')
response = self._post(authzr.uri, body)
return self._authzr_from_response(response)
def _authzr_from_response(self, response, identifier=None, uri=None):
authzr = messages.AuthorizationResource(
body=messages.Authorization.from_json(response.json()),
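As a rough illustration of how callers might use the new method, here is a minimal, hypothetical sketch (the cleanup_authorizations helper and its arguments are made up for this example, assuming client is an acme ClientBase instance and authzrs is a list of messages.AuthorizationResource objects):

from acme import messages

def cleanup_authorizations(client, authzrs):
    # Ask the ACME server to deactivate every authorization that is still
    # pending, using the deactivate_authorization method added in this change.
    for authzr in authzrs:
        if authzr.body.status == messages.STATUS_PENDING:
            updated = client.deactivate_authorization(authzr)
            assert updated.body.status == messages.STATUS_DEACTIVATED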

View File

@@ -637,6 +637,14 @@ class ClientTest(ClientTestBase):
errors.PollError, self.client.poll_and_request_issuance,
csr, authzrs, mintime=mintime, max_attempts=2)
def test_deactivate_authorization(self):
authzb = self.authzr.body.update(status=messages.STATUS_DEACTIVATED)
self.response.json.return_value = authzb.to_json()
authzr = self.client.deactivate_authorization(self.authzr)
self.assertEqual(authzb, authzr.body)
self.assertEqual(self.client.net.post.call_count, 1)
self.assertTrue(self.authzr.uri in self.net.post.call_args_list[0][0])
def test_check_cert(self):
self.response.headers['Location'] = self.certr.uri
self.response.content = CERT_DER

View File

@@ -18,20 +18,35 @@ OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
'accountDoesNotExist': 'The request specified an account that does not exist',
'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
' already been revoked',
'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
'badNonce': 'The client sent an unacceptable anti-replay nonce',
'badPublicKey': 'The JWS was signed by a public key the server does not support',
'badRevocationReason': 'The revocation reason provided is not allowed by the server',
'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
' a certificate',
'compound': 'Specific error conditions are indicated in the "subproblems" array',
'connection': ('The server could not connect to the client to verify the'
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
'malformed': 'The request message was malformed',
'rejectedIdentifier': 'The server will not issue certificates for the identifier',
'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
'rateLimited': 'There were too many requests of a given type',
'serverInternal': 'The server experienced an internal error',
'tls': 'The server experienced a TLS error during domain verification',
'unauthorized': 'The client lacks sufficient authorization',
'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
'unknownHost': 'The server could not resolve a domain name',
'unsupportedIdentifier': 'An identifier is of an unsupported type',
'externalAccountRequired': 'The server requires external account binding',
}
@@ -153,6 +168,7 @@ STATUS_VALID = Status('valid')
STATUS_INVALID = Status('invalid')
STATUS_REVOKED = Status('revoked')
STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')
class IdentifierType(_Constant):
@@ -456,7 +472,7 @@ class Authorization(ResourceBody):
:ivar datetime.datetime expires:
"""
identifier = jose.Field('identifier', decoder=Identifier.from_json)
identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
challenges = jose.Field('challenges', omitempty=True)
combinations = jose.Field('combinations', omitempty=True)
@@ -486,6 +502,12 @@ class NewAuthorization(Authorization):
resource = fields.Resource(resource_type)
class UpdateAuthorization(Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
class AuthorizationResource(ResourceWithURI):
"""Authorization Resource.

View File

@@ -3,7 +3,7 @@ from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [

View File

@@ -7,6 +7,9 @@ environment:
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/ # Version branches like X.X.X
- /^test-.*$/
@@ -27,7 +30,8 @@ install:
# Upgrade pip to avoid warnings
- "python -m pip install --upgrade pip"
# Ready to install tox and coverage
- "pip install tox codecov"
# tools/pip_install.py is used to pin packages to a known working version.
- "python tools\\pip_install.py tox codecov"
build: off

View File

@@ -5,3 +5,4 @@ recursive-include certbot_apache/tests/testdata *
include certbot_apache/centos-options-ssl-apache.conf
include certbot_apache/options-ssl-apache.conf
recursive-include certbot_apache/augeas_lens *.aug
recursive-include certbot_apache/tls_configs *.conf

View File

@@ -1,6 +1,8 @@
""" Utility functions for certbot-apache plugin """
import binascii
import pkg_resources
from certbot import util
from certbot.compat import os
@@ -105,3 +107,15 @@ def parse_define_file(filepath, varname):
def unique_id():
""" Returns an unique id to be used as a VirtualHost identifier"""
return binascii.hexlify(os.urandom(16)).decode("utf-8")
def find_ssl_apache_conf(prefix):
"""
Find a TLS Apache config file in the dedicated storage.
:param str prefix: prefix of the TLS Apache config file to find
:return: the path to the TLS Apache config file
:rtype: str
"""
return pkg_resources.resource_filename(
"certbot_apache",
os.path.join("tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))

View File

@@ -1,207 +0,0 @@
"""Class of Augeas Configurators."""
import logging
from certbot import errors
from certbot.plugins import common
from certbot_apache import constants
logger = logging.getLogger(__name__)
class AugeasConfigurator(common.Installer):
"""Base Augeas Configurator class.
:ivar config: Configuration.
:type config: :class:`~certbot.interfaces.IConfig`
:ivar aug: Augeas object
:type aug: :class:`augeas.Augeas`
:ivar str save_notes: Human-readable configuration change notes
:ivar reverter: saves and reverts checkpoints
:type reverter: :class:`certbot.reverter.Reverter`
"""
def __init__(self, *args, **kwargs):
super(AugeasConfigurator, self).__init__(*args, **kwargs)
# Placeholder for augeas
self.aug = None
self.save_notes = ""
def init_augeas(self):
""" Initialize the actual Augeas instance """
import augeas
self.aug = augeas.Augeas(
# specify a directory to load our preferred lens from
loadpath=constants.AUGEAS_LENS_DIR,
# Do not save backup (we do it ourselves), do not load
# anything by default
flags=(augeas.Augeas.NONE |
augeas.Augeas.NO_MODL_AUTOLOAD |
augeas.Augeas.ENABLE_SPAN))
# See if any temporary changes need to be recovered
# This needs to occur before VirtualHost objects are setup...
# because this will change the underlying configuration and potential
# vhosts
self.recovery_routine()
def check_parsing_errors(self, lens):
"""Verify Augeas can parse all of the lens files.
:param str lens: lens to check for errors
:raises .errors.PluginError: If there has been an error in parsing with
the specified lens.
"""
error_files = self.aug.match("/augeas//error")
for path in error_files:
# Check to see if it was an error resulting from the use of
# the httpd lens
lens_path = self.aug.get(path + "/lens")
# As aug.get may return null
if lens_path and lens in lens_path:
msg = (
"There has been an error in parsing the file {0} on line {1}: "
"{2}".format(
# Strip off /augeas/files and /error
path[13:len(path) - 6],
self.aug.get(path + "/line"),
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
def ensure_augeas_state(self):
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
"""
if self.unsaved_files():
self.save_notes += "(autosave)"
self.save()
def unsaved_files(self):
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
:raises .errors.PluginError: If there was an error in Augeas, in
an attempt to save the configuration, or an error creating a
checkpoint
:returns: `set` of unsaved files
"""
save_state = self.aug.get("/augeas/save")
self.aug.set("/augeas/save", "noop")
# Existing Errors
ex_errs = self.aug.match("/augeas//error")
try:
# This is a noop save
self.aug.save()
except (RuntimeError, IOError):
self._log_save_errors(ex_errs)
# Erase Save Notes
self.save_notes = ""
raise errors.PluginError(
"Error saving files, check logs for more info.")
# Return the original save method
self.aug.set("/augeas/save", save_state)
# Retrieve list of modified files
# Note: Noop saves can cause the file to be listed twice, I used a
# set to remove this possibility. This is a known augeas 0.10 error.
save_paths = self.aug.match("/augeas/events/saved")
save_files = set()
if save_paths:
for path in save_paths:
save_files.add(self.aug.get(path)[6:])
return save_files
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
This function first checks for save errors, if none are found,
all configuration changes made will be saved. According to the
function parameters. If an exception is raised, a new checkpoint
was not created.
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
"""
save_files = self.unsaved_files()
if save_files:
self.add_to_checkpoint(save_files,
self.save_notes, temporary=temporary)
self.save_notes = ""
self.aug.save()
# Force reload if files were modified
# This is needed to recalculate augeas directive span
if save_files:
for sf in save_files:
self.aug.remove("/files/"+sf)
self.aug.load()
if title and not temporary:
self.finalize_checkpoint(title)
def _log_save_errors(self, ex_errs):
"""Log errors due to bad Augeas save.
:param list ex_errs: Existing errors before save
"""
# Check for the root of save problems
new_errs = self.aug.match("/augeas//error")
# logger.error("During Save - %s", mod_conf)
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
", ".join(err[13:len(err) - 6] for err in new_errs
# Only new errors caused by recent save
if err not in ex_errs), self.save_notes)
# Wrapper functions for Reverter class
def recovery_routine(self):
"""Revert all previously modified files.
Reverts all modified files that have not been saved as a checkpoint
:raises .errors.PluginError: If unable to recover the configuration
"""
super(AugeasConfigurator, self).recovery_routine()
# Need to reload configuration after these changes take effect
self.aug.load()
def revert_challenge_config(self):
"""Used to cleanup challenge configurations.
:raises .errors.PluginError: If unable to revert the challenge config.
"""
self.revert_temporary_config()
self.aug.load()
def rollback_checkpoints(self, rollback=1):
"""Rollback saved checkpoints.
:param int rollback: Number of checkpoints to revert
:raises .errors.PluginError: If there is a problem with the input or
the function is unable to correctly revert the configuration
"""
super(AugeasConfigurator, self).rollback_checkpoints(rollback)
self.aug.load()

View File

@@ -0,0 +1,87 @@
""" Tests for ParserNode interface """
from certbot_apache import interfaces
class AugeasCommentNode(interfaces.CommentNode):
""" Augeas implementation of CommentNode interface """
ancestor = None
comment = ""
dirty = False
def __init__(self, comment, ancestor=None):
self.comment = comment
self.ancestor = ancestor
def save(self, msg): # pragma: no cover
pass
class AugeasDirectiveNode(interfaces.DirectiveNode):
""" Augeas implementation of DirectiveNode interface """
ancestor = None
parameters = tuple() # type: Tuple[str, ...]
dirty = False
enabled = True
name = ""
def __init__(self, name, parameters=tuple(), ancestor=None):
self.name = name
self.parameters = parameters
self.ancestor = ancestor
def save(self, msg): # pragma: no cover
pass
def set_parameters(self, parameters): # pragma: no cover
self.parameters = tuple("CERTBOT_PASS_ASSERT")
class AugeasBlockNode(interfaces.BlockNode):
""" Augeas implementation of BlockNode interface """
ancestor = None
parameters = tuple() # type: Tuple[str, ...]
children = tuple() # type: Tuple[interfaces.ParserNode, ...]
dirty = False
enabled = True
name = ""
def __init__(self, name, parameters=tuple(), ancestor=None):
self.name = name
self.parameters = parameters
self.ancestor = ancestor
def save(self, msg): # pragma: no cover
pass
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
new_block = AugeasBlockNode("CERTBOT_PASS_ASSERT", ancestor=self)
self.children += (new_block,)
return new_block
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
new_dir = AugeasDirectiveNode("CERTBOT_PASS_ASSERT", ancestor=self)
self.children += (new_dir,)
return new_dir
def add_child_comment(self, comment="", position=None): # pragma: no cover
new_comment = AugeasCommentNode("CERTBOT_PASS_ASSERT", ancestor=self)
self.children += (new_comment,)
return new_comment
def find_blocks(self, name, exclude=True): # pragma: no cover
return [AugeasBlockNode("CERTBOT_PASS_ASSERT", ancestor=self)]
def find_directives(self, name, exclude=True): # pragma: no cover
return [AugeasDirectiveNode("CERTBOT_PASS_ASSERT", ancestor=self)]
def find_comments(self, comment, exact=False): # pragma: no cover
return [AugeasCommentNode("CERTBOT_PASS_ASSERT", ancestor=self)]
def delete_child(self, child): # pragma: no cover
pass
def set_parameters(self, parameters): # pragma: no cover
self.parameters = tuple("CERTBOT_PASS_ASSERT")
def unsaved_files(self): # pragma: no cover
return ["CERTBOT_PASS_ASSERT"]

View File

@@ -1,4 +1,4 @@
"""Apache Configuration based off of Augeas Configurator."""
"""Apache Configurator."""
# pylint: disable=too-many-lines
import copy
import fnmatch
@@ -9,7 +9,6 @@ import time
from collections import defaultdict
import pkg_resources
import six
import zope.component
@@ -23,13 +22,14 @@ from certbot import interfaces
from certbot import util
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge # pylint: disable=unused-import
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot.plugins.util import path_surgery
from certbot.plugins.enhancements import AutoHSTSEnhancement
from certbot_apache import apache_util
from certbot_apache import augeas_configurator
from certbot_apache import augeasparser
from certbot_apache import constants
from certbot_apache import display_ops
from certbot_apache import http_01
@@ -70,13 +70,10 @@ logger = logging.getLogger(__name__)
@zope.interface.implementer(interfaces.IAuthenticator, interfaces.IInstaller)
@zope.interface.provider(interfaces.IPluginFactory)
class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
class ApacheConfigurator(common.Installer):
# pylint: disable=too-many-instance-attributes,too-many-public-methods
"""Apache configurator.
State of Configurator: This code has been tested and built for Ubuntu
14.04 Apache 2.4 and it works for Ubuntu 12.04 Apache 2.2
:ivar config: Configuration.
:type config: :class:`~certbot.interfaces.IConfig`
@@ -113,14 +110,24 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
def option(self, key):
"""Get a value from options"""
return self.options.get(key)
def pick_apache_config(self):
"""
Pick the appropriate TLS Apache configuration file for current version of Apache and OS.
:return: the path to the TLS Apache configuration file to use
:rtype: str
"""
# Disabling TLS session tickets is supported by Apache 2.4.11+.
# So for old versions of Apache we pick a configuration without this option.
if self.version < (2, 4, 11):
return apache_util.find_ssl_apache_conf("old")
return apache_util.find_ssl_apache_conf("current")
def _prepare_options(self):
"""
Set the values possibly changed by command line parameters to
@@ -201,6 +208,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
self._enhanced_vhosts = defaultdict(set) # type: DefaultDict[str, Set[obj.VirtualHost]]
# Temporary state for AutoHSTS enhancement
self._autohsts = {} # type: Dict[str, Dict[str, Union[int, float]]]
# Reverter save notes
self.save_notes = ""
# These will be set in the prepare function
self._prepared = False
@@ -231,12 +240,6 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
:raises .errors.PluginError: If there is any other error
"""
# Perform the actual Augeas initialization to be able to react
try:
self.init_augeas()
except ImportError:
raise errors.NoInstallationError("Problem in Augeas installation")
self._prepare_options()
# Verify Apache is installed
@@ -254,16 +257,14 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
raise errors.NotSupportedError(
"Apache Version {0} not supported.".format(str(self.version)))
if not self._check_aug_version():
raise errors.NotSupportedError(
"Apache plugin support requires libaugeas0 and augeas-lenses "
"version 1.2.0 or higher, please make sure you have you have "
"those installed.")
# Recover from previous crash before Augeas initialization to have the
# correct parse tree from the get go.
self.recovery_routine()
# Perform the actual Augeas initialization to be able to react
self.parser = self.get_parser()
# Check for errors in parsing files with Augeas
self.check_parsing_errors("httpd.aug")
self.parser.check_parsing_errors("httpd.aug")
# Get all of the available vhosts
self.vhosts = self.get_virtual_hosts()
@@ -276,9 +277,77 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
util.lock_dir_until_exit(self.option("server_root"))
except (OSError, errors.LockError):
logger.debug("Encountered error:", exc_info=True)
raise errors.PluginError("Unable to lock {0}".format(self.option("server_root")))
raise errors.PluginError(
"Unable to create a lock file in {0}. Are you running"
" Certbot with sufficient privileges to modify your"
" Apache configuration?".format(self.option("server_root")))
# TODO: apache-parser-v2
self.parser_root = augeasparser.AugeasBlockNode(None, ancestor=None)
self._prepared = True
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
This function first checks for save errors and, if none are found, saves
all configuration changes according to the function parameters. If an
exception is raised, a new checkpoint was not created.
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
"""
save_files = self.parser.unsaved_files()
if save_files:
self.add_to_checkpoint(save_files,
self.save_notes, temporary=temporary)
# Handle the parser specific tasks
self.parser.save(save_files)
if title and not temporary:
self.finalize_checkpoint(title)
def recovery_routine(self):
"""Revert all previously modified files.
Reverts all modified files that have not been saved as a checkpoint
:raises .errors.PluginError: If unable to recover the configuration
"""
super(ApacheConfigurator, self).recovery_routine()
# Reload configuration after these changes take effect if needed
# ie. ApacheParser has been initialized.
if self.parser:
# TODO: wrap into non-implementation specific parser interface
self.parser.aug.load()
def revert_challenge_config(self):
"""Used to cleanup challenge configurations.
:raises .errors.PluginError: If unable to revert the challenge config.
"""
self.revert_temporary_config()
self.parser.aug.load()
def rollback_checkpoints(self, rollback=1):
"""Rollback saved checkpoints.
:param int rollback: Number of checkpoints to revert
:raises .errors.PluginError: If there is a problem with the input or
the function is unable to correctly revert the configuration
"""
super(ApacheConfigurator, self).rollback_checkpoints(rollback)
self.parser.aug.load()
def _verify_exe_availability(self, exe):
"""Checks availability of Apache executable"""
if not util.exe_exists(exe):
@@ -286,26 +355,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
raise errors.NoInstallationError(
'Cannot find Apache executable {0}'.format(exe))
def _check_aug_version(self):
""" Checks that we have recent enough version of libaugeas.
If augeas version is recent enough, it will support case insensitive
regexp matching"""
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
matches = self.aug.match(
"/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
self.aug.remove("/test/path")
return False
self.aug.remove("/test/path")
return matches
def get_parser(self):
"""Initializes the ApacheParser"""
# If user provided vhost_root value in command line, use it
return parser.ApacheParser(
self.aug, self.option("server_root"), self.conf("vhost-root"),
self.option("server_root"), self.conf("vhost-root"),
self.version, configurator=self)
def _wildcard_domain(self, domain):
@@ -458,10 +512,25 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"cert_key": self.parser.find_dir("SSLCertificateKeyFile",
None, vhost.path)}
# TODO: apache-parser-v2
# We need to add a BlockNode reference to obj.VirtualHost
# v2_path = {"cert_path": vhost.node.find_directives(
# "SSLCertificateFile"),
# "cert_key": vhost.node.find_directives(
# "SSLCertificateKeyFile")}
# parserassert.legacy_assert_dir(path["cert_path"], v2_path["cert_path"])
# parserassert.legacy_assert_dir(path["cert_key"], v2_path["cert_key"])
# Only include if a certificate chain is specified
if chain_path is not None:
path["chain_path"] = self.parser.find_dir(
"SSLCertificateChainFile", None, vhost.path)
# TODO: apache-parser-v2
# v2_path["chain_path"] = vhost.node.find_directives(
# "SSLCertificateChainFile")
# parserassert.legacy_assert_dir(path["chain_path"],
# v2_path["chain_path])
# Handle errors when certificate/key directives cannot be found
if not path["cert_path"]:
@@ -484,11 +553,17 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# install SSLCertificateFile, SSLCertificateKeyFile,
# and SSLCertificateChainFile directives
set_cert_path = cert_path
self.aug.set(path["cert_path"][-1], cert_path)
self.aug.set(path["cert_key"][-1], key_path)
self.parser.aug.set(path["cert_path"][-1], cert_path)
self.parser.aug.set(path["cert_key"][-1], key_path)
# TODO: apache-parser-v2
# path["cert_path"][-1].set_parameters(cert_path,)
# path["cert_key"][-1].set_parameters(key_path,)
if chain_path is not None:
self.parser.add_dir(vhost.path,
"SSLCertificateChainFile", chain_path)
# TODO: apache-parser-v2
# vhost.node.add_child_directive("SSLCertificateChainFile",
# (chain_path,))
else:
raise errors.PluginError("--chain-path is required for your "
"version of Apache")
@@ -497,8 +572,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
raise errors.PluginError("Please provide the --fullchain-path "
"option pointing to your full chain file")
set_cert_path = fullchain_path
self.aug.set(path["cert_path"][-1], fullchain_path)
self.aug.set(path["cert_key"][-1], key_path)
self.parser.aug.set(path["cert_path"][-1], fullchain_path)
self.parser.aug.set(path["cert_key"][-1], key_path)
# TODO: apache-parser-v2
# path["cert_path"][-1].set_parameters(fullchain_path,)
# path["cert_key"][-1].set_parameters(key_path,)
# Enable the new vhost if needed
if not vhost.enabled:
@@ -744,29 +822,48 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
return ""
def _get_vhost_names(self, path):
def _get_vhost_names(self, vhost):
"""Helper method for getting the ServerName and
ServerAlias values from vhost in path
ServerAlias values from vhost
:param path: Path to read ServerName and ServerAliases from
:param vhost: VirtualHost object to read ServerName and ServerAliases from
:returns: Tuple including ServerName and `list` of ServerAlias strings
"""
servername_match = self.parser.find_dir(
"ServerName", None, start=path, exclude=False)
"ServerName", None, start=vhost.path, exclude=False)
serveralias_match = self.parser.find_dir(
"ServerAlias", None, start=path, exclude=False)
"ServerAlias", None, start=vhost.path, exclude=False)
# TODO: apache-parser-v2
# v2_servername_match = vhost.node.find_directives("ServerName",
# exclude=False)
# v2_serveralias_match = vhost.node.find_directives("ServerAlias",
# exclude=False)
# parserassert.legacy_assert_dir(servername_match, v2_servername_match)
# parserassert.legacy_assert_dir(serveralias_match, v2_serveralias_match)
serveraliases = []
for alias in serveralias_match:
serveralias = self.parser.get_arg(alias)
serveraliases.append(serveralias)
# TODO: apache-parser-v2
# v2_serveraliases = []
# for alias in v2_serveralias_match:
# v2_serveraliases += list(alias.parameters)
# parserassert.legacy_assert_list(serveraliases, v2_serveraliases)
servername = None
if servername_match:
# Get last ServerName as each overwrites the previous
servername = self.parser.get_arg(servername_match[-1])
# TODO: apache-parser-v2
# v2_servername = None
# if v2_servername_match:
# v2_servername = v2_servername_match[-1].parameters[-1]
# parserassert.simpleassert(servername, v2_servername)
return (servername, serveraliases)
@@ -778,7 +875,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"""
servername, serveraliases = self._get_vhost_names(host.path)
servername, serveraliases = self._get_vhost_names(host)
for alias in serveraliases:
if not host.modmacro:
@@ -798,7 +895,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"""
addrs = set()
try:
args = self.aug.match(path + "/arg")
args = self.parser.aug.match(path + "/arg")
except RuntimeError:
logger.warning("Encountered a problem while parsing file: %s, skipping", path)
return None
@@ -816,7 +913,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
is_ssl = True
filename = apache_util.get_file_path(
self.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
self.parser.aug.get("/augeas/files%s/path" % apache_util.get_file_path(path)))
if filename is None:
return None
@@ -846,7 +943,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# Make a list of parser paths because the parser_paths
# dictionary may be modified during the loop.
for vhost_path in list(self.parser.parser_paths):
paths = self.aug.match(
paths = self.parser.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
(vhost_path, parser.case_i("VirtualHost"))))
paths = [path for path in paths if
@@ -856,7 +953,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
if not new_vhost:
continue
internal_path = apache_util.get_internal_aug_path(new_vhost.path)
realpath = os.path.realpath(new_vhost.filep)
realpath = filesystem.realpath(new_vhost.filep)
if realpath not in file_paths:
file_paths[realpath] = new_vhost.filep
internal_paths[realpath].add(internal_path)
@@ -1100,16 +1197,16 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
avail_fp = nonssl_vhost.filep
ssl_fp = self._get_ssl_vhost_path(avail_fp)
orig_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
orig_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
self._copy_create_ssl_vhost_skeleton(nonssl_vhost, ssl_fp)
# Reload augeas to take into account the new vhost
self.aug.load()
self.parser.aug.load()
# Get Vhost augeas path for new vhost
new_matches = self.aug.match("/files%s//* [label()=~regexp('%s')]" %
new_matches = self.parser.aug.match("/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
@@ -1120,7 +1217,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# Make Augeas aware of the new vhost
self.parser.parse_file(ssl_fp)
# Try to search again
new_matches = self.aug.match(
new_matches = self.parser.aug.match(
"/files%s//* [label()=~regexp('%s')]" %
(self._escape(ssl_fp),
parser.case_i("VirtualHost")))
@@ -1182,11 +1279,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"""
if self.conf("vhost-root") and os.path.exists(self.conf("vhost-root")):
fp = os.path.join(os.path.realpath(self.option("vhost_root")),
fp = os.path.join(filesystem.realpath(self.option("vhost_root")),
os.path.basename(non_ssl_vh_fp))
else:
# Use non-ssl filepath
fp = os.path.realpath(non_ssl_vh_fp)
fp = filesystem.realpath(non_ssl_vh_fp)
if fp.endswith(".conf"):
return fp[:-(len(".conf"))] + self.option("le_vhost_ext")
@@ -1270,8 +1367,8 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"vhost for your HTTPS site located at {1} because they have "
"the potential to create redirection loops.".format(
vhost.filep, ssl_fp), reporter.MEDIUM_PRIORITY)
self.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
self.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(ssl_fp)), "0")
self.parser.aug.set("/augeas/files%s/mtime" % (self._escape(vhost.filep)), "0")
def _sift_rewrite_rules(self, contents):
""" Helper function for _copy_create_ssl_vhost_skeleton to prepare the
@@ -1346,7 +1443,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"""
try:
span_val = self.aug.span(vhost.path)
span_val = self.parser.aug.span(vhost.path)
except ValueError:
logger.critical("Error while reading the VirtualHost %s from "
"file %s", vhost.name, vhost.filep, exc_info=True)
@@ -1381,13 +1478,13 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
def _update_ssl_vhosts_addrs(self, vh_path):
ssl_addrs = set()
ssl_addr_p = self.aug.match(vh_path + "/arg")
ssl_addr_p = self.parser.aug.match(vh_path + "/arg")
for addr in ssl_addr_p:
old_addr = obj.Addr.fromstring(
str(self.parser.get_arg(addr)))
ssl_addr = old_addr.get_addr_obj("443")
self.aug.set(addr, str(ssl_addr))
self.parser.aug.set(addr, str(ssl_addr))
ssl_addrs.add(ssl_addr)
return ssl_addrs
@@ -1406,14 +1503,14 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
vh_path, False)) > 1:
directive_path = self.parser.find_dir(directive, None,
vh_path, False)
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
def _remove_directives(self, vh_path, directives):
for directive in directives:
while self.parser.find_dir(directive, None, vh_path, False):
directive_path = self.parser.find_dir(directive, None,
vh_path, False)
self.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
self.parser.aug.remove(re.sub(r"/\w*$", "", directive_path[0]))
def _add_dummy_ssl_directives(self, vh_path):
self.parser.add_dir(vh_path, "SSLCertificateFile",
@@ -1427,7 +1524,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
def _add_servername_alias(self, target_name, vhost):
vh_path = vhost.path
sname, saliases = self._get_vhost_names(vh_path)
sname, saliases = self._get_vhost_names(vhost)
if target_name == sname or target_name in saliases:
return
if self._has_matching_wildcard(vh_path, target_name):
@@ -1452,7 +1549,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
"""
matches = self.parser.find_dir(
"ServerAlias", start=vh_path, exclude=False)
aliases = (self.aug.get(match) for match in matches)
aliases = (self.parser.aug.get(match) for match in matches)
return self.domain_in_names(aliases, target_name)
def _add_name_vhost_if_necessary(self, vhost):
@@ -1635,7 +1732,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
if header_path:
pat = '(?:[ "]|^)(strict-transport-security)(?:[ "]|$)'
for match in header_path:
if re.search(pat, self.aug.get(match).lower()):
if re.search(pat, self.parser.aug.get(match).lower()):
hsts_dirpath = match
if not hsts_dirpath:
err_msg = ("Certbot was unable to find the existing HSTS header "
@@ -1649,7 +1746,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# Our match statement was for string strict-transport-security, but
# we need to update the value instead. The next index is for the value
hsts_dirpath = hsts_dirpath.replace("arg[3]", "arg[4]")
self.aug.set(hsts_dirpath, hsts_maxage)
self.parser.aug.set(hsts_dirpath, hsts_maxage)
note_msg = ("Increasing HSTS max-age value to {0} for VirtualHost "
"in {1}\n".format(nextstep_value, vhost.filep))
logger.debug(note_msg)
@@ -1731,7 +1828,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# We'll simply delete the directive, so that we'll have a
# consistent OCSP cache path.
if stapling_cache_aug_path:
self.aug.remove(
self.parser.aug.remove(
re.sub(r"/\w*$", "", stapling_cache_aug_path[0]))
self.parser.add_dir_to_ifmodssl(ssl_vhost_aug_path,
@@ -1808,7 +1905,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# "Existing Header directive for virtualhost"
pat = '(?:[ "]|^)(%s)(?:[ "]|$)' % (header_substring.lower())
for match in header_path:
if re.search(pat, self.aug.get(match).lower()):
if re.search(pat, self.parser.aug.get(match).lower()):
raise errors.PluginEnhancementAlreadyPresent(
"Existing %s header" % (header_substring))
@@ -1935,11 +2032,11 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
constants.REWRITE_HTTPS_ARGS_WITH_END]
for dir_path, args_paths in rewrite_args_dict.items():
arg_vals = [self.aug.get(x) for x in args_paths]
arg_vals = [self.parser.aug.get(x) for x in args_paths]
# Search for past redirection rule, delete it, set the new one
if arg_vals in constants.OLD_REWRITE_HTTPS_ARGS:
self.aug.remove(dir_path)
self.parser.aug.remove(dir_path)
self._set_https_redirection_rewrite_rule(vhost)
self.save()
raise errors.PluginEnhancementAlreadyPresent(
@@ -1995,7 +2092,7 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
redirect_filepath = self._write_out_redirect(ssl_vhost, text)
self.aug.load()
self.parser.aug.load()
# Make a new vhost data structure and add it to the lists
new_vhost = self._create_vhost(parser.get_aug_path(self._escape(redirect_filepath)))
self.vhosts.append(new_vhost)
@@ -2299,8 +2396,9 @@ class ApacheConfigurator(augeas_configurator.AugeasConfigurator):
# XXX if we ever try to enforce a local privilege boundary (eg, running
# certbot for unprivileged users via setuid), this function will need
# to be modified.
return common.install_version_controlled_file(options_ssl, options_ssl_digest,
self.option("MOD_SSL_CONF_SRC"), constants.ALL_SSL_OPTIONS_HASHES)
apache_config_path = self.pick_apache_config()
return common.install_version_controlled_file(
options_ssl, options_ssl_digest, apache_config_path, constants.ALL_SSL_OPTIONS_HASHES)
def enable_autohsts(self, _unused_lineage, domains):
"""

View File

@@ -9,6 +9,7 @@ MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""
# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
ALL_SSL_OPTIONS_HASHES = [
'2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
'4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
@@ -18,6 +19,10 @@ ALL_SSL_OPTIONS_HASHES = [
'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
'80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
'717b0a89f5e4c39b09a42813ac6e747cfbdeb93439499e73f4f70a1fe1473f20',
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""

View File

@@ -5,6 +5,7 @@ from acme.magic_typing import List, Set # pylint: disable=unused-import, no-nam
from certbot import errors
from certbot.compat import os
from certbot.compat import filesystem
from certbot.plugins import common
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
@@ -168,8 +169,7 @@ class ApacheHttp01(common.TLSSNI01):
def _set_up_challenges(self):
if not os.path.isdir(self.challenge_dir):
os.makedirs(self.challenge_dir)
os.chmod(self.challenge_dir, 0o755)
filesystem.makedirs(self.challenge_dir, 0o755)
responses = []
for achall in self.achalls:
@@ -185,7 +185,7 @@ class ApacheHttp01(common.TLSSNI01):
self.configurator.reverter.register_file_creation(True, name)
with open(name, 'wb') as f:
f.write(validation.encode())
os.chmod(name, 0o644)
filesystem.chmod(name, 0o644)
return response
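The switch from os.makedirs/os.chmod to certbot.compat.filesystem keeps the permission handling working on Windows as well; a minimal sketch of the same pattern outside this plugin (the function name and paths are made up):

from certbot.compat import filesystem
from certbot.compat import os

def set_up_challenge_file(challenge_dir, name, content):
    # Same pattern as ApacheHttp01._set_up_challenges above: create the directory
    # with the desired mode via the cross-platform helpers, then write the
    # validation file and chmod it so the webserver can read it.
    if not os.path.isdir(challenge_dir):
        filesystem.makedirs(challenge_dir, 0o755)
    path = os.path.join(challenge_dir, name)
    with open(path, "wb") as f:
        f.write(content)
    filesystem.chmod(path, 0o644)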

View File

@@ -0,0 +1,375 @@
"""Parser interfaces."""
import abc
import six
@six.add_metaclass(abc.ABCMeta)
class ParserNode(object):
"""
ParserNode is the basic building block of the tree of such nodes,
representing the structure of the configuration. It is largely meant to keep
the structure information intact and idiomatically accessible.
The root node as well as the child nodes of it should be instances of ParserNode.
Nodes keep track of their differences to on-disk representation of configuration
by marking modified ParserNodes as dirty to enable partial write-to-disk for
different files in the configuration structure.
While for the most part the usage and the child types are obvious, "include"-
and similar directives are an exception to this rule. This is because of the
nature of include directives - which unroll the contents of another file or
configuration block to their place. While we could unroll the included nodes
to the parent tree, it remains important to keep the context of include nodes
separate in order to write back the original configuration as it was.
For parsers that require the implementation to keep track of whitespace, it
is the responsibility of each ParserNode object itself to store its preceding
whitespace in order to be able to reconstruct the complete configuration file
as it was when originally read from disk.
"""
@property
@abc.abstractmethod
def ancestor(self): # pragma: no cover
"""
This property contains a reference to ancestor node, or None if the node
is the root node of the configuration tree.
:returns: The ancestor BlockNode object, or None for root node.
:rtype: ParserNode
"""
raise NotImplementedError
@property
@abc.abstractmethod
def dirty(self): # pragma: no cover
"""
This property holds a boolean indicating whether this node has been
modified since the last save (or since the initial parse).
:returns: True if this node has had changes that have not yet been written
to disk.
:rtype: bool
"""
raise NotImplementedError
@abc.abstractmethod
def save(self, msg):
"""
Save traverses the children, and attempts to write the AST to disk for
all the objects that are marked dirty. The actual operation of course
depends on the underlying implementation. save() shouldn't be called
from the Configurator outside of its designated save() method in order
to ensure that the Reverter checkpoints are created properly.
Note: this approach of keeping internal structure of the configuration
within the ParserNode tree does not represent the file inclusion structure
of actual configuration files that reside in the filesystem. To handle
file writes properly, the file specific temporary trees should be extracted
from the full ParserNode tree where necessary when writing to disk.
"""
@six.add_metaclass(abc.ABCMeta)
class CommentNode(ParserNode):
"""
CommentNode class is used for representation of comments within the parsed
configuration structure. Because of the nature of comments, it is not able
to have child nodes and hence it is always treated as a leaf node.
CommentNode stores its contents in class variable 'comment' and does not
have a specific name.
"""
@property
@abc.abstractmethod
def comment(self): # pragma: no cover
"""
Comment property contains the contents of the comment without the comment
directives (typically # or /* ... */).
:returns: A string containing the comment
:rtype: str
"""
raise NotImplementedError
@six.add_metaclass(abc.ABCMeta)
class DirectiveNode(ParserNode):
"""
DirectiveNode class represents a configuration directive within the configuration.
It can have zero or more arguments attached to it. Because of the nature of
single directives, it is not able to have child nodes and hence it is always
treated as a leaf node.
"""
@property
@abc.abstractmethod
def enabled(self): # pragma: no cover
"""
Configuration blocks may have conditional statements enabling or disabling
their contents. This property returns the state of this DirectiveNode.
:returns: True if the DirectiveNode is parsed and enabled in the configuration.
:rtype: bool
"""
raise NotImplementedError
@property
@abc.abstractmethod
def name(self): # pragma: no cover
"""
Name property contains the name of the directive.
:returns: Name of this node
:rtype: str
"""
raise NotImplementedError
@property
@abc.abstractmethod
def parameters(self): # pragma: no cover
"""
This property contains a tuple of parameters of this ParserNode object
excluding whitespaces.
:returns: A tuple of parameters for this node
:rtype: tuple
"""
raise NotImplementedError
@abc.abstractmethod
def set_parameters(self, parameters):
"""
Sets the sequence of parameters for this ParserNode object without
whitespaces, and marks this object dirty.
:param list parameters: sequence of parameters
"""
@six.add_metaclass(abc.ABCMeta)
class BlockNode(ParserNode):
"""
BlockNode class represents a block of nested configuration directives, comments
and other blocks as its children. A BlockNode can have zero or more arguments
attached to it.
Configuration blocks typically consist of one or more child nodes of all possible
types. Because of this, the BlockNode class has various discovery and structure
management methods.
Lists of arguments used as an optional parameter for some of the methods should
be lists of strings containing arguments applicable to the specific BlockNode
or DirectiveNode type. As an example, for the following configuration:
<VirtualHost *:80>
...
</VirtualHost>
The node type would be BlockNode, name would be 'VirtualHost' and arguments
would be: ['*:80'].
While for the following example:
LoadModule alias_module /usr/lib/apache2/modules/mod_alias.so
The node type would be DirectiveNode, name would be 'LoadModule' and arguments
would be: ['alias_module', '/usr/lib/apache2/modules/mod_alias.so']
The applicable arguments are dependent on the underlying configuration language
and its grammar.
"""
@abc.abstractmethod
def add_child_block(self, name, arguments=None, position=None):
"""
Adds a new BlockNode child node with provided values and marks the callee
BlockNode dirty. This is used to add new children to the AST.
:param str name: The name of the child node to add
:param list arguments: list of arguments for the node
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list, similarly to list.insert().
:returns: BlockNode instance of the created child block
"""
@abc.abstractmethod
def add_child_directive(self, name, arguments=None, position=None):
"""
Adds a new DirectiveNode child node with provided values and marks the
callee BlockNode dirty. This is used to add new children to the AST.
:param str name: The name of the child node to add
:param list arguments: list of arguments for the node
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list, similarly to list.insert().
:returns: DirectiveNode instance of the created child directive
"""
@abc.abstractmethod
def add_child_comment(self, comment="", position=None):
"""
Adds a new CommentNode child node with provided value and marks the
callee BlockNode dirty. This is used to add new children to the AST.
:param str comment: Comment contents
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list, similarly to list.insert().
:returns: CommentNode instance of the created child comment
"""
@property
@abc.abstractmethod
def children(self): # pragma: no cover
"""
This property contains a tuple of ParserNode objects that are the children
of this node. The order of the children is the same as in the parsed
configuration block.
:returns: A tuple of this block's children
:rtype: tuple
"""
raise NotImplementedError
@property
@abc.abstractmethod
def enabled(self):
"""
Configuration blocks may have conditional statements enabling or disabling
their contents. This property returns the state of this configuration block.
In case of an unmatched conditional statement in a block, the block itself
should be marked enabled while its children should be marked disabled.
:returns: True if the BlockNode is parsed and enabled in the configuration.
:rtype: bool
"""
@abc.abstractmethod
def find_blocks(self, name, exclude=True):
"""
Find configuration blocks by name. This method walks the child tree of
ParserNodes under the instance it was called from. This way it is possible
to search the whole configuration tree when starting from the root node, or
to do a partial search when starting from a specified branch.
:param str name: The name of the block to search for
:param bool exclude: Whether the search results should exclude the contents of
ParserNode objects that reside within conditional blocks and, because
of the current state, are not enabled.
:returns: A list of found BlockNode objects.
"""
@abc.abstractmethod
def find_directives(self, name, exclude=True):
"""
Find directives by name. This method walks the child tree of ParserNodes
under the instance it was called from. This way it is possible to search
the whole configuration tree when starting from the root node, or to do
a partial search when starting from a specified branch.
:param str name: The name of the directive to search for
:param bool exclude: Whether the search results should exclude the contents of
ParserNode objects that reside within conditional blocks and, because
of the current state, are not enabled.
:returns: A list of found DirectiveNode objects.
"""
@abc.abstractmethod
def find_comments(self, comment, exact=False):
"""
Find comments whose value contains, or exactly matches, the search term.
This method walks the child tree of ParserNodes under the instance it was
called from. This way it is possible to search the whole configuration
tree when starting from the root node, or to do a partial search when starting
from a specified branch.
:param str comment: The content of the comment to search for
:param bool exact: Whether the comment needs to exactly match the search term
:returns: A list of found CommentNode objects.
"""
@abc.abstractmethod
def delete_child(self, child):
"""
Remove a specified child node from the list of children of the called
BlockNode object.
:param ParserNode child: Child ParserNode object to remove from the list
of children of the callee.
"""
@property
@abc.abstractmethod
def name(self): # pragma: no cover
"""
Name property contains the name of the block. As an example for config:
<VirtualHost *:80> ... </VirtualHost>
the name would be "VirtualHost".
:returns: Name of this node
:rtype: str
"""
raise NotImplementedError
@property
@abc.abstractmethod
def parameters(self): # pragma: no cover
"""
This property contains a tuple of the parameters of this ParserNode object,
excluding whitespace.
:returns: A tuple of parameters for this node
:rtype: tuple
"""
raise NotImplementedError
@abc.abstractmethod
def set_parameters(self, parameters):
"""
Sets the sequence of parameters for this ParserNode object without
whitespace, and marks this object dirty.
:param list parameters: sequence of parameters
"""
@abc.abstractmethod
def unsaved_files(self):
"""
Returns a list of file paths that have been changed since the last save
(or the initial configuration parse). The intended use for this method
is to tell the Reverter which files need to be included in a checkpoint.
This is typically called for the root of the ParserNode tree.
:returns: list of file paths of files that have been changed but not yet
saved to disk.
"""

View File

@@ -1,6 +1,4 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
@@ -26,6 +24,4 @@ class ArchConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)

View File

@@ -1,7 +1,6 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import logging
import pkg_resources
import zope.interface
from certbot import errors
@@ -39,8 +38,6 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "centos-options-ssl-apache.conf")
)
def config_test(self):
@@ -75,6 +72,18 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
# Finish with actual config check to see if systemctl restart helped
super(CentOSConfigurator, self).config_test()
def pick_apache_config(self):
"""
Pick the appropriate TLS Apache configuration file for current version of Apache and OS.
:return: the path to the TLS Apache configuration file to use
:rtype: str
"""
# Disabling TLS session tickets is supported by Apache 2.4.11+.
# So for old versions of Apache we pick a configuration without this option.
if self.version < (2, 4, 11):
return apache_util.find_ssl_apache_conf("centos-old")
return apache_util.find_ssl_apache_conf("centos-current")
def _prepare_options(self):
"""
Override the options dictionary initialization in order to support
@@ -86,7 +95,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return CentOSParser(
self.aug, self.option("server_root"), self.option("vhost_root"),
self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
@@ -155,7 +164,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
for loadmod_path in loadmod_paths:
nodir_path = loadmod_path.split("/directive")[0]
# Remove the old LoadModule directive
self.aug.remove(loadmod_path)
self.parser.aug.remove(loadmod_path)
# Create a new IfModule !mod_ssl.c if not already found on path
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",

View File

@@ -1,6 +1,4 @@
""" Distribution specific override class for macOS """
import pkg_resources
import zope.interface
from certbot import interfaces
@@ -26,6 +24,4 @@ class DarwinConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/other",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)

View File

@@ -1,12 +1,12 @@
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
import logging
import pkg_resources
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache import apache_util
@@ -34,8 +34,6 @@ class DebianConfigurator(configurator.ApacheConfigurator):
handle_modules=True,
handle_sites=True,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
def enable_site(self, vhost):
@@ -65,7 +63,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
try:
os.symlink(vhost.filep, enabled_path)
except OSError as err:
if os.path.islink(enabled_path) and os.path.realpath(
if os.path.islink(enabled_path) and filesystem.realpath(
enabled_path) == vhost.filep:
# Already in shape
vhost.enabled = True

View File

@@ -1,5 +1,4 @@
""" Distribution specific override class for Fedora 29+ """
import pkg_resources
import zope.interface
from certbot import errors
@@ -31,9 +30,6 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
# TODO: eventually newest version of Fedora will need their own config
"certbot_apache", "centos-options-ssl-apache.conf")
)
def config_test(self):
@@ -51,7 +47,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return FedoraParser(
self.aug, self.option("server_root"), self.option("vhost_root"),
self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)
def _try_restart_fedora(self):

View File

@@ -1,6 +1,4 @@
""" Distribution specific override class for Gentoo Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
@@ -29,8 +27,6 @@ class GentooConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
def _prepare_options(self):
@@ -44,7 +40,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
def get_parser(self):
"""Initializes the ApacheParser"""
return GentooParser(
self.aug, self.option("server_root"), self.option("vhost_root"),
self.option("server_root"), self.option("vhost_root"),
self.version, configurator=self)

View File

@@ -1,6 +1,4 @@
""" Distribution specific override class for OpenSUSE """
import pkg_resources
import zope.interface
from certbot import interfaces
@@ -26,6 +24,4 @@ class OpenSUSEConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)

View File

@@ -13,6 +13,8 @@ from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import,
from certbot import errors
from certbot.compat import os
from certbot_apache import constants
logger = logging.getLogger(__name__)
@@ -32,7 +34,7 @@ class ApacheParser(object):
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
def __init__(self, root, vhostroot=None, version=(2, 4),
configurator=None):
# Note: Order is important here.
@@ -41,15 +43,25 @@ class ApacheParser(object):
# issues with aug.load() after adding new files / defines to parse tree
self.configurator = configurator
# Initialize augeas
self.aug = None
self.init_augeas()
if not self.check_aug_version():
raise errors.NotSupportedError(
"Apache plugin support requires libaugeas0 and augeas-lenses "
"version 1.2.0 or higher, please make sure you have you have "
"those installed.")
self.modules = set() # type: Set[str]
self.parser_paths = {} # type: Dict[str, List[str]]
self.variables = {} # type: Dict[str, str]
self.aug = aug
# Find configuration root and make sure augeas can parse it.
self.root = os.path.abspath(root)
self.loc = {"root": self._find_config_root()}
self.parse_file(self.loc["root"])
self.handle_includes()
if version >= (2, 4):
# Look up variables from httpd and add to DOM if not already parsed
@@ -77,6 +89,146 @@ class ApacheParser(object):
if self.find_dir("Define", exclude=False):
raise errors.PluginError("Error parsing runtime variables")
def init_augeas(self):
""" Initialize the actual Augeas instance """
try:
import augeas
except ImportError: # pragma: no cover
raise errors.NoInstallationError("Problem in Augeas installation")
self.aug = augeas.Augeas(
# specify a directory to load our preferred lens from
loadpath=constants.AUGEAS_LENS_DIR,
# Do not save backup (we do it ourselves), do not load
# anything by default
flags=(augeas.Augeas.NONE |
augeas.Augeas.NO_MODL_AUTOLOAD |
augeas.Augeas.ENABLE_SPAN))
def check_parsing_errors(self, lens):
"""Verify Augeas can parse all of the lens files.
:param str lens: lens to check for errors
:raises .errors.PluginError: If there has been an error in parsing with
the specified lens.
"""
error_files = self.aug.match("/augeas//error")
for path in error_files:
# Check to see if it was an error resulting from the use of
# the httpd lens
lens_path = self.aug.get(path + "/lens")
# As aug.get may return null
if lens_path and lens in lens_path:
msg = (
"There has been an error in parsing the file {0} on line {1}: "
"{2}".format(
# Strip off /augeas/files and /error
path[13:len(path) - 6],
self.aug.get(path + "/line"),
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
def check_aug_version(self):
""" Checks that we have recent enough version of libaugeas.
If augeas version is recent enough, it will support case insensitive
regexp matching"""
self.aug.set("/test/path/testing/arg", "aRgUMeNT")
try:
matches = self.aug.match(
"/test//*[self::arg=~regexp('argument', 'i')]")
except RuntimeError:
self.aug.remove("/test/path")
return False
self.aug.remove("/test/path")
return matches
def unsaved_files(self):
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
:raises .errors.PluginError: If there was an error in Augeas, in
an attempt to save the configuration, or an error creating a
checkpoint
:returns: `set` of unsaved files
"""
save_state = self.aug.get("/augeas/save")
self.aug.set("/augeas/save", "noop")
# Existing Errors
ex_errs = self.aug.match("/augeas//error")
try:
# This is a noop save
self.aug.save()
except (RuntimeError, IOError):
self._log_save_errors(ex_errs)
# Erase Save Notes
self.configurator.save_notes = ""
raise errors.PluginError(
"Error saving files, check logs for more info.")
# Return the original save method
self.aug.set("/augeas/save", save_state)
# Retrieve list of modified files
# Note: Noop saves can cause the file to be listed twice, I used a
# set to remove this possibility. This is a known augeas 0.10 error.
save_paths = self.aug.match("/augeas/events/saved")
save_files = set()
if save_paths:
for path in save_paths:
save_files.add(self.aug.get(path)[6:])
return save_files
def ensure_augeas_state(self):
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
"""
if self.unsaved_files():
self.configurator.save_notes += "(autosave)"
self.configurator.save()
def save(self, save_files):
"""Saves all changes to the configuration files.
save() is called from ApacheConfigurator to handle the parser specific
tasks of saving.
:param list save_files: list of strings of file paths that we need to save.
"""
self.configurator.save_notes = ""
self.aug.save()
# Force reload if files were modified
# This is needed to recalculate augeas directive span
if save_files:
for sf in save_files:
self.aug.remove("/files/"+sf)
self.aug.load()
def _log_save_errors(self, ex_errs):
"""Log errors due to bad Augeas save.
:param list ex_errs: Existing errors before save
"""
# Check for the root of save problems
new_errs = self.aug.match("/augeas//error")
# logger.error("During Save - %s", mod_conf)
logger.error("Unable to save files: %s. Attempted Save Notes: %s",
", ".join(err[13:len(err) - 6] for err in new_errs
# Only new errors caused by recent save
if err not in ex_errs), self.configurator.save_notes)
def add_include(self, main_config, inc_path):
"""Add Include for a new configuration file if one does not exist
@@ -118,6 +270,7 @@ class ApacheParser(object):
"""
mods = set() # type: Set[str]
self.handle_includes()
matches = self.find_dir("LoadModule")
iterator = iter(matches)
# Make sure prev_size != cur_size for do: while: iteration
@@ -170,11 +323,7 @@ class ApacheParser(object):
def update_includes(self):
"""Get includes from httpd process, and add them to DOM if needed"""
# Find_dir iterates over configuration for Include and IncludeOptional
# directives to make sure we see the full include tree present in the
# configuration files
_ = self.find_dir("Include")
self.handle_includes()
inc_cmd = [self.configurator.option("ctl"), "-t", "-D",
"DUMP_INCLUDES"]
matches = self.parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
@@ -455,6 +604,8 @@ class ApacheParser(object):
# includes = self.aug.match(start +
# "//* [self::directive='Include']/* [label()='arg']")
regex = "%s" % (case_i(directive))
regex = "(%s)|(%s)|(%s)" % (case_i(directive),
case_i("Include"),
case_i("IncludeOptional"))
@@ -486,6 +637,49 @@ class ApacheParser(object):
return ordered_matches
def handle_includes(self, known_matches=None):
"""
This method searches through the configuration tree for Include and
IncludeOptional directives and includes them in the Augeas DOM tree
recursively.
:param list known_matches: List of already known Augeas DOM paths of found
include directives.
"""
start = get_aug_path(self.loc["root"])
regex = "(%s)|(%s)" % (case_i("Include"), case_i("IncludeOptional"))
all_matches = self.aug.match(
"%s//*[self::directive=~regexp('%s')]" % (start, regex))
if known_matches and all(m in known_matches for m in all_matches):
# We already found everything previously
return
# Always use the exclusion as we want to have the correct state
matches = self._exclude_dirs(all_matches)
for match in matches:
incpath = self.get_arg(match + "/arg")
# Attempts to add a transform to the file if one does not already exist
# Standardize the include argument based on server root
if not incpath.startswith("/"):
# Normpath will condense ../
incpath = os.path.normpath(os.path.join(self.root, incpath))
else:
incpath = os.path.normpath(incpath)
if os.path.isdir(incpath):
self.parse_file(os.path.join(incpath, "*"))
else:
self.parse_file(incpath)
# Iterate. The results will be checked and the recursion broken when
# everything has been found.
self.handle_includes(all_matches)
def get_all_args(self, match):
"""
Tries to fetch all arguments for a directive. See get_arg.
@@ -604,10 +798,10 @@ class ApacheParser(object):
arg = os.path.normpath(arg)
# Attempts to add a transform to the file if one does not already exist
if os.path.isdir(arg):
self.parse_file(os.path.join(arg, "*"))
else:
self.parse_file(arg)
#if os.path.isdir(arg):
# self.parse_file(os.path.join(arg, "*"))
#else:
# self.parse_file(arg)
# Argument represents an fnmatch regular expression, convert it
# Split up the path and convert each into an Augeas accepted regex
@@ -658,8 +852,7 @@ class ApacheParser(object):
use_new, remove_old = self._check_path_actions(filepath)
# Ensure that we have the latest Augeas DOM state on disk before
# calling aug.load() which reloads the state from disk
if self.configurator:
self.configurator.ensure_augeas_state()
self.ensure_augeas_state()
# Test if augeas included file for Httpd.lens
# Note: This works for augeas globs, ie. *.conf
if use_new:

View File

@@ -0,0 +1,84 @@
"""Runtime assertions for two different parser implementations"""
def simpleassert(old_value, new_value):
"""
Simple assertion
"""
assert old_value == new_value
def legacy_assert_list(old_list, new_list):
"""
Used to assert that both of the lists contain the same values.
"""
assert len(old_list) == len(new_list)
assert all(ol in new_list for ol in old_list)
def legacy_assert_dir(old_result, new_result):
"""
Used to test and ensure that the new implementation's search results match
the old implementation's results. This test is intended to be used only to test
the old Augeas implementation results against the ParserNode Augeas implementation.
This test expects the DirectiveNode to have the Augeas path in its "_metadata"
attribute dict under the key "augpath".
"""
if new_result == "CERTBOT_PASS_ASSERT":
return
assert len(old_result) == len(new_result)
matching = []
for oldres in old_result:
match = [res for res in new_result if res._metadata["augpath"] == oldres]
assert len(match) == 1
def legacy_assert_args(old_result, new_result, parser):
"""
Used to test and ensure that returned parameter values are the same for
the old and the new implementation.
Uses ApacheParser to actually fetch the parameter for the old result.
This assertion is structured this way because of how parser.get_arg() is
currently used in the ApacheConfigurator, making it easier to test the
results.
"""
if isinstance(old_result, list):
for old in old_result:
oldarg = parser.get_arg(old)
assert oldarg in new_result.parameters
else:
oldarg = parser.get_arg(old_result)
assert oldarg in new_result.parameters
def assert_dir(first, second):
"""
Used to test that DirectiveNode results match for both implementations.
"""
if "CERTBOT_PASS_ASSERT" in [first, second]:
return
assert first.name == second.name
assert first.parameters == second.parameters
assert first.dirty == second.dirty
def assert_block(first, second):
"""
Used to test that BlockNode results match for both implementations.
"""
if "CERTBOT_PASS_ASSERT" in [first, second]:
return
assert first.name == second.name
assert first.parameters == second.parameters
assert len(first.children) == len(second.children)
assert first.dirty == second.dirty

View File

@@ -0,0 +1,27 @@
#!/usr/bin/env python
"""
This executable script wraps the apache-conf-test bash script, in order to set up a pebble instance
before its execution. The directory URL is passed through the SERVER environment variable.
"""
import os
import subprocess
import sys
from certbot_integration_tests.utils import acme_server
SCRIPT_DIRNAME = os.path.dirname(__file__)
def main(args=None):
if not args:
args = sys.argv[1:]
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
environ = os.environ.copy()
environ['SERVER'] = acme_xdist['directory_url']
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
command.extend(args)
return subprocess.call(command, env=environ)
if __name__ == '__main__':
sys.exit(main())

View File

@@ -165,10 +165,6 @@ class CentOS6Tests(util.ApacheTest):
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
self.assertEqual(len(post_loadmods), 1)
def test_loadmod_non_duplicate(self):
# the modules/mod_ssl.so exists in ssl.conf
sslmod_args = ["ssl_module", "modules/mod_somethingelse.so"]
@@ -197,7 +193,7 @@ class CentOS6Tests(util.ApacheTest):
exclude=False)
for mod in orig_loadmods:
noarg_path = mod.rpartition("/")[0]
self.config.aug.remove(noarg_path)
self.config.parser.aug.remove(noarg_path)
self.config.save()
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",

View File

@@ -4,6 +4,7 @@ import unittest
import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache import obj
@@ -160,7 +161,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
"""Make sure we read the sysconfig OPTIONS variable correctly"""
# Return nothing for the process calls
mock_cfg.return_value = ""
self.config.parser.sysconfig_filep = os.path.realpath(
self.config.parser.sysconfig_filep = filesystem.realpath(
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
self.config.parser.variables = {}
@@ -189,6 +190,13 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
errors.SubprocessError]
self.assertRaises(errors.MisconfigurationError, self.config.restart)
def test_pick_correct_tls_config(self):
self.config.version = (2, 4, 10)
self.assertTrue('centos-old' in self.config.pick_apache_config())
self.config.version = (2, 4, 11)
self.assertTrue('centos-current' in self.config.pick_apache_config())
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -91,6 +91,7 @@ class ComplexParserTest(util.ParserTest):
from certbot_apache import parser
self.parser.add_dir(parser.get_aug_path(self.parser.loc["default"]),
"Include", [arg])
self.parser.handle_includes()
if hit:
self.assertTrue(self.parser.find_dir("FNMATCH_DIRECTIVE"))
else:

View File

@@ -1,21 +1,20 @@
"""Test for certbot_apache.augeas_configurator."""
"""Test for certbot_apache.configurator implementations of reverter"""
import shutil
import unittest
import mock
from certbot import errors
from certbot.compat import os
from certbot_apache.tests import util
class AugeasConfiguratorTest(util.ApacheTest):
"""Test for Augeas Configurator base class."""
class ConfiguratorReverterTest(util.ApacheTest):
"""Test for ApacheConfigurator reverter methods"""
def setUp(self): # pylint: disable=arguments-differ
super(AugeasConfiguratorTest, self).setUp()
super(ConfiguratorReverterTest, self).setUp()
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
@@ -28,20 +27,6 @@ class AugeasConfiguratorTest(util.ApacheTest):
shutil.rmtree(self.work_dir)
shutil.rmtree(self.temp_dir)
def test_bad_parse(self):
# pylint: disable=protected-access
self.config.parser.parse_file(os.path.join(
self.config.parser.root, "conf-available", "bad_conf_file.conf"))
self.assertRaises(
errors.PluginError, self.config.check_parsing_errors, "httpd.aug")
def test_bad_save(self):
mock_save = mock.Mock()
mock_save.side_effect = IOError
self.config.aug.save = mock_save
self.assertRaises(errors.PluginError, self.config.save)
def test_bad_save_checkpoint(self):
self.config.reverter.add_to_checkpoint = mock.Mock(
side_effect=errors.ReverterError)
@@ -63,23 +48,9 @@ class AugeasConfiguratorTest(util.ApacheTest):
self.assertTrue(mock_finalize.is_called)
def test_recovery_routine(self):
mock_load = mock.Mock()
self.config.aug.load = mock_load
self.config.recovery_routine()
self.assertEqual(mock_load.call_count, 1)
def test_recovery_routine_error(self):
self.config.reverter.recovery_routine = mock.Mock(
side_effect=errors.ReverterError)
self.assertRaises(
errors.PluginError, self.config.recovery_routine)
def test_revert_challenge_config(self):
mock_load = mock.Mock()
self.config.aug.load = mock_load
self.config.parser.aug.load = mock_load
self.config.revert_challenge_config()
self.assertEqual(mock_load.call_count, 1)
@@ -93,7 +64,7 @@ class AugeasConfiguratorTest(util.ApacheTest):
def test_rollback_checkpoints(self):
mock_load = mock.Mock()
self.config.aug.load = mock_load
self.config.parser.aug.load = mock_load
self.config.rollback_checkpoints()
self.assertEqual(mock_load.call_count, 1)
@@ -103,13 +74,11 @@ class AugeasConfiguratorTest(util.ApacheTest):
side_effect=errors.ReverterError)
self.assertRaises(errors.PluginError, self.config.rollback_checkpoints)
def test_view_config_changes(self):
self.config.view_config_changes()
def test_view_config_changes_error(self):
self.config.reverter.view_config_changes = mock.Mock(
side_effect=errors.ReverterError)
self.assertRaises(errors.PluginError, self.config.view_config_changes)
def test_recovery_routine_reload(self):
mock_load = mock.Mock()
self.config.parser.aug.load = mock_load
self.config.recovery_routine()
self.assertEqual(mock_load.call_count, 1)
if __name__ == "__main__":

View File

@@ -16,6 +16,7 @@ from certbot import achallenges
from certbot import crypto_util
from certbot import errors
from certbot.compat import os
from certbot.compat import filesystem
from certbot.tests import acme_util
from certbot.tests import util as certbot_util
@@ -50,25 +51,14 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.deploy_cert = mocked_deploy_cert
return self.config
@mock.patch("certbot_apache.configurator.ApacheConfigurator.init_augeas")
@mock.patch("certbot_apache.configurator.path_surgery")
def test_prepare_no_install(self, mock_surgery, _init_augeas):
def test_prepare_no_install(self, mock_surgery):
silly_path = {"PATH": "/tmp/nothingness2342"}
mock_surgery.return_value = False
with mock.patch.dict('os.environ', silly_path):
self.assertRaises(errors.NoInstallationError, self.config.prepare)
self.assertEqual(mock_surgery.call_count, 1)
@mock.patch("certbot_apache.augeas_configurator.AugeasConfigurator.init_augeas")
def test_prepare_no_augeas(self, mock_init_augeas):
""" Test augeas initialization ImportError """
def side_effect_error():
""" Side effect error for the test """
raise ImportError
mock_init_augeas.side_effect = side_effect_error
self.assertRaises(
errors.NoInstallationError, self.config.prepare)
@mock.patch("certbot_apache.parser.ApacheParser")
@mock.patch("certbot_apache.configurator.util.exe_exists")
def test_prepare_version(self, mock_exe_exists, _):
@@ -80,16 +70,6 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertRaises(
errors.NotSupportedError, self.config.prepare)
@mock.patch("certbot_apache.parser.ApacheParser")
@mock.patch("certbot_apache.configurator.util.exe_exists")
def test_prepare_old_aug(self, mock_exe_exists, _):
mock_exe_exists.return_value = True
self.config.config_test = mock.Mock()
# pylint: disable=protected-access
self.config._check_aug_version = mock.Mock(return_value=False)
self.assertRaises(
errors.NotSupportedError, self.config.prepare)
def test_prepare_locked(self):
server_root = self.config.conf("server-root")
self.config.config_test = mock.Mock()
@@ -674,7 +654,7 @@ class MultipleVhostsTest(util.ApacheTest):
# span excludes the closing </VirtualHost> tag in older versions
# of Augeas
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1142]
with mock.patch.object(self.config.aug, 'span') as mock_span:
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
mock_span.return_value = return_value
self.test_make_vhost_ssl()
@@ -682,7 +662,7 @@ class MultipleVhostsTest(util.ApacheTest):
# span includes the closing </VirtualHost> tag in newer versions
# of Augeas
return_value = [self.vh_truth[0].filep, 1, 12, 0, 0, 0, 1157]
with mock.patch.object(self.config.aug, 'span') as mock_span:
with mock.patch.object(self.config.parser.aug, 'span') as mock_span:
mock_span.return_value = return_value
self.test_make_vhost_ssl()
@@ -695,8 +675,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_make_vhost_ssl_nonexistent_vhost_path(self):
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[1])
self.assertEqual(os.path.dirname(ssl_vhost.filep),
os.path.dirname(os.path.realpath(
self.vh_truth[1].filep)))
os.path.dirname(filesystem.realpath(self.vh_truth[1].filep)))
def test_make_vhost_ssl(self):
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
@@ -1231,7 +1210,7 @@ class MultipleVhostsTest(util.ApacheTest):
except errors.PluginEnhancementAlreadyPresent:
args_paths = self.config.parser.find_dir(
"RewriteRule", None, http_vhost.path, False)
arg_vals = [self.config.aug.get(x) for x in args_paths]
arg_vals = [self.config.parser.aug.get(x) for x in args_paths]
self.assertEqual(arg_vals, constants.REWRITE_HTTPS_ARGS)
@@ -1334,15 +1313,6 @@ class MultipleVhostsTest(util.ApacheTest):
return account_key, (achall1, achall2, achall3)
def test_aug_version(self):
mock_match = mock.Mock(return_value=["something"])
self.config.aug.match = mock_match
# pylint: disable=protected-access
self.assertEqual(self.config._check_aug_version(),
["something"])
self.config.aug.match.side_effect = RuntimeError
self.assertFalse(self.config._check_aug_version())
def test_enable_site_nondebian(self):
inc_path = "/path/to/wherever"
vhost = self.vh_truth[0]
@@ -1365,8 +1335,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
tmp_path = os.path.realpath(tempfile.mkdtemp("vhostroot"))
os.chmod(tmp_path, 0o755)
tmp_path = filesystem.realpath(tempfile.mkdtemp("vhostroot"))
filesystem.chmod(tmp_path, 0o755)
mock_p = "certbot_apache.configurator.ApacheConfigurator._get_ssl_vhost_path"
mock_a = "certbot_apache.parser.ApacheParser.add_include"
@@ -1511,7 +1481,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(first_id, second_id)
def test_realpath_replaces_symlink(self):
orig_match = self.config.aug.match
orig_match = self.config.parser.aug.match
mock_vhost = copy.deepcopy(self.vh_truth[0])
mock_vhost.filep = mock_vhost.filep.replace('sites-enabled', u'sites-available')
mock_vhost.path = mock_vhost.path.replace('sites-enabled', 'sites-available')
@@ -1525,7 +1495,7 @@ class MultipleVhostsTest(util.ApacheTest):
return orig_match(aug_expr)
self.config.parser.parser_paths = ["/mocked/path"]
self.config.aug.match = mock_match
self.config.parser.aug.match = mock_match
vhs = self.config.get_virtual_hosts()
self.assertEqual(len(vhs), 2)
self.assertTrue(vhs[0] == self.vh_truth[1])
@@ -1551,8 +1521,8 @@ class AugeasVhostsTest(util.ApacheTest):
self.work_dir)
def test_choosevhost_with_illegal_name(self):
self.config.aug = mock.MagicMock()
self.config.aug.match.side_effect = RuntimeError
self.config.parser.aug = mock.MagicMock()
self.config.parser.aug.match.side_effect = RuntimeError
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
chosen_vhost = self.config._create_vhost(path)
self.assertEqual(None, chosen_vhost)
@@ -1736,7 +1706,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.config.updated_mod_ssl_conf_digest)
def _current_ssl_options_hash(self):
return crypto_util.sha256sum(self.config.option("MOD_SSL_CONF_SRC"))
return crypto_util.sha256sum(self.config.pick_apache_config())
def _assert_current_file(self):
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
@@ -1772,7 +1742,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.assertFalse(mock_logger.warning.called)
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
self.assertEqual(crypto_util.sha256sum(
self.config.option("MOD_SSL_CONF_SRC")),
self.config.pick_apache_config()),
self._current_ssl_options_hash())
self.assertNotEqual(crypto_util.sha256sum(self.config.mod_ssl_conf),
self._current_ssl_options_hash())
@@ -1788,18 +1758,31 @@ class InstallSslOptionsConfTest(util.ApacheTest):
"%s has been manually modified; updated file "
"saved to %s. We recommend updating %s for security purposes.")
self.assertEqual(crypto_util.sha256sum(
self.config.option("MOD_SSL_CONF_SRC")),
self.config.pick_apache_config()),
self._current_ssl_options_hash())
# only print warning once
with mock.patch("certbot.plugins.common.logger") as mock_logger:
self._call()
self.assertFalse(mock_logger.warning.called)
def test_current_file_hash_in_all_hashes(self):
def test_ssl_config_files_hash_in_all_hashes(self):
"""
It is really critical that all TLS Apache config files have their SHA256 hash registered in
constants.ALL_SSL_OPTIONS_HASHES. Otherwise Certbot will mistakenly assume that the config
file has been manually edited by the user, and will refuse to update it.
This test ensures that all necessary hashes are present.
"""
from certbot_apache.constants import ALL_SSL_OPTIONS_HASHES
self.assertTrue(self._current_ssl_options_hash() in ALL_SSL_OPTIONS_HASHES,
"Constants.ALL_SSL_OPTIONS_HASHES must be appended"
" with the sha256 hash of self.config.mod_ssl_conf when it is updated.")
import pkg_resources
tls_configs_dir = pkg_resources.resource_filename("certbot_apache", "tls_configs")
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
if name.endswith('options-ssl-apache.conf')]
self.assertTrue(all_files)
for one_file in all_files:
file_hash = crypto_util.sha256sum(one_file)
self.assertTrue(file_hash in ALL_SSL_OPTIONS_HASHES,
"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 "
"hash of {0} when it is updated.".format(one_file))
if __name__ == "__main__":

View File

@@ -79,9 +79,9 @@ class MultipleVhostsTestDebian(util.ApacheTest):
def test_enable_site_failure(self):
self.config.parser.root = "/tmp/nonexistent"
with mock.patch("os.path.isdir") as mock_dir:
with mock.patch("certbot.compat.os.path.isdir") as mock_dir:
mock_dir.return_value = True
with mock.patch("os.path.islink") as mock_link:
with mock.patch("certbot.compat.os.path.islink") as mock_link:
mock_link.return_value = False
self.assertRaises(
errors.NotSupportedError,

View File

@@ -4,6 +4,7 @@ import unittest
import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache import obj
@@ -160,7 +161,7 @@ class MultipleVhostsTestFedora(util.ApacheTest):
"""Make sure we read the sysconfig OPTIONS variable correctly"""
# Return nothing for the process calls
mock_cfg.return_value = ""
self.config.parser.sysconfig_filep = os.path.realpath(
self.config.parser.sysconfig_filep = filesystem.realpath(
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
self.config.parser.variables = {}

View File

@@ -4,6 +4,7 @@ import unittest
import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache import obj
@@ -81,7 +82,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
"""Make sure we read the Gentoo APACHE2_OPTS variable correctly"""
defines = ['DEFAULT_VHOST', 'INFO',
'SSL', 'SSL_DEFAULT_VHOST', 'LANGUAGE']
self.config.parser.apacheconfig_filep = os.path.realpath(
self.config.parser.apacheconfig_filep = filesystem.realpath(
os.path.join(self.config.parser.root, "../conf.d/apache2"))
self.config.parser.variables = {}
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_modules"):

View File

@@ -1,8 +1,8 @@
# pylint: disable=too-many-public-methods
"""Tests for certbot_apache.parser."""
import shutil
import unittest
import augeas
import mock
from certbot import errors
@@ -22,6 +22,27 @@ class BasicParserTest(util.ParserTest):
shutil.rmtree(self.config_dir)
shutil.rmtree(self.work_dir)
def test_bad_parse(self):
self.parser.parse_file(os.path.join(self.parser.root,
"conf-available", "bad_conf_file.conf"))
self.assertRaises(
errors.PluginError, self.parser.check_parsing_errors, "httpd.aug")
def test_bad_save(self):
mock_save = mock.Mock()
mock_save.side_effect = IOError
self.parser.aug.save = mock_save
self.assertRaises(errors.PluginError, self.parser.unsaved_files)
def test_aug_version(self):
mock_match = mock.Mock(return_value=["something"])
self.parser.aug.match = mock_match
# pylint: disable=protected-access
self.assertEqual(self.parser.check_aug_version(),
["something"])
self.parser.aug.match.side_effect = RuntimeError
self.assertFalse(self.parser.check_aug_version())
def test_find_config_root_no_root(self):
# pylint: disable=protected-access
os.remove(self.parser.loc["root"])
@@ -311,21 +332,38 @@ class BasicParserTest(util.ParserTest):
class ParserInitTest(util.ApacheTest):
def setUp(self): # pylint: disable=arguments-differ
super(ParserInitTest, self).setUp()
self.aug = augeas.Augeas(
flags=augeas.Augeas.NONE | augeas.Augeas.NO_MODL_AUTOLOAD)
def tearDown(self):
shutil.rmtree(self.temp_dir)
shutil.rmtree(self.config_dir)
shutil.rmtree(self.work_dir)
@mock.patch("certbot_apache.parser.ApacheParser.init_augeas")
def test_prepare_no_augeas(self, mock_init_augeas):
from certbot_apache.parser import ApacheParser
mock_init_augeas.side_effect = errors.NoInstallationError
self.config.config_test = mock.Mock()
self.assertRaises(
errors.NoInstallationError, ApacheParser,
os.path.relpath(self.config_path), "/dummy/vhostpath",
version=(2, 4, 22), configurator=self.config)
def test_init_old_aug(self):
from certbot_apache.parser import ApacheParser
with mock.patch("certbot_apache.parser.ApacheParser.check_aug_version") as mock_c:
mock_c.return_value = False
self.assertRaises(
errors.NotSupportedError,
ApacheParser, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 4, 22), configurator=self.config)
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_unparseable(self, mock_cfg):
from certbot_apache.parser import ApacheParser
mock_cfg.return_value = ('Define: TEST')
self.assertRaises(
errors.PluginError,
ApacheParser, self.aug, os.path.relpath(self.config_path),
ApacheParser, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 2, 22), configurator=self.config)
def test_root_normalized(self):
@@ -337,8 +375,7 @@ class ParserInitTest(util.ApacheTest):
self.temp_dir,
"debian_apache_2_4/////multiple_vhosts/../multiple_vhosts/apache2")
parser = ApacheParser(self.aug, path,
"/dummy/vhostpath", configurator=self.config)
parser = ApacheParser(path, "/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)
@@ -347,7 +384,7 @@ class ParserInitTest(util.ApacheTest):
with mock.patch("certbot_apache.parser.ApacheParser."
"update_runtime_variables"):
parser = ApacheParser(
self.aug, os.path.relpath(self.config_path),
os.path.relpath(self.config_path),
"/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)
@@ -357,7 +394,7 @@ class ParserInitTest(util.ApacheTest):
with mock.patch("certbot_apache.parser.ApacheParser."
"update_runtime_variables"):
parser = ApacheParser(
self.aug, self.config_path + os.path.sep,
self.config_path + os.path.sep,
"/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)

View File

@@ -0,0 +1,86 @@
""" Tests for ParserNode interface """
import unittest
from acme.magic_typing import Optional, Tuple # pylint: disable=unused-import, no-name-in-module
from certbot_apache import interfaces
class DummyCommentNode(interfaces.CommentNode):
""" A dummy class implementing CommentNode interface """
ancestor = None
comment = ""
dirty = False
def save(self, msg): # pragma: no cover
pass
class DummyDirectiveNode(interfaces.DirectiveNode):
""" A dummy class implementing DirectiveNode interface """
ancestor = None
parameters = tuple() # type: Tuple[str, ...]
dirty = False
enabled = True
name = ""
def save(self, msg): # pragma: no cover
pass
def set_parameters(self, parameters): # pragma: no cover
pass
class DummyBlockNode(interfaces.BlockNode):
""" A dummy class implementing BlockNode interface """
ancestor = None
parameters = tuple() # type: Tuple[str, ...]
children = tuple() # type: Tuple[interfaces.ParserNode, ...]
dirty = False
enabled = True
name = ""
def save(self, msg): # pragma: no cover
pass
def add_child_block(self, name, arguments=None, position=None): # pragma: no cover
pass
def add_child_directive(self, name, arguments=None, position=None): # pragma: no cover
pass
def add_child_comment(self, comment="", position=None): # pragma: no cover
pass
def find_blocks(self, name, exclude=True): # pragma: no cover
pass
def find_directives(self, name, exclude=True): # pragma: no cover
pass
def find_comments(self, comment, exact=False): # pragma: no cover
pass
def delete_child(self, child): # pragma: no cover
pass
def set_parameters(self, parameters): # pragma: no cover
pass
def unsaved_files(self): # pragma: no cover
pass
class ParserNodeTest(unittest.TestCase):
"""Dummy placeholder test case for ParserNode interfaces"""
def test_dummy(self):
dummyblock = DummyBlockNode()
dummydirective = DummyDirectiveNode()
dummycomment = DummyCommentNode()
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -78,8 +78,7 @@ class ParserTest(ApacheTest):
with mock.patch("certbot_apache.parser.ApacheParser."
"update_runtime_variables"):
self.parser = ApacheParser(
self.aug, self.config_path, self.vhost_path,
configurator=self.config)
self.config_path, self.vhost_path, configurator=self.config)
def get_apache_configurator( # pylint: disable=too-many-arguments, too-many-locals

View File

@@ -10,16 +10,10 @@ SSLEngine on
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLSessionTickets off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common
#CustomLog /var/log/apache2/access.log vhost_combined
#LogLevel warn
#ErrorLog /var/log/apache2/error.log
# Always ensure Cookies have "Secure" set (JAH 2012/1)
#Header edit Set-Cookie (?i)^(.*)(;\s*secure)??((\s*;)?(.*)) "$1; Secure$3$4"

View File

@@ -0,0 +1,18 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common

View File

@@ -11,16 +11,10 @@ SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLCompression off
SSLSessionTickets off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common
#CustomLog /var/log/apache2/access.log vhost_combined
#LogLevel warn
#ErrorLog /var/log/apache2/error.log
# Always ensure Cookies have "Secure" set (JAH 2012/1)
#Header edit Set-Cookie (?i)^(.*)(;\s*secure)??((\s*;)?(.*)) "$1; Secure$3$4"

View File

@@ -0,0 +1,19 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLCompression off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common

View File

@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
certbot[dev]==0.34.0
-e .[dev]

View File

@@ -4,13 +4,13 @@ from setuptools.command.test import test as TestCommand
import sys
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.29.0',
'certbot>=0.34.0',
'certbot>=0.37.0.dev0',
'mock',
'python-augeas',
'setuptools',

View File

@@ -31,7 +31,7 @@ if [ -z "$VENV_PATH" ]; then
fi
VENV_BIN="$VENV_PATH/bin"
BOOTSTRAP_VERSION_PATH="$VENV_PATH/certbot-auto-bootstrap-version.txt"
LE_AUTO_VERSION="0.34.2"
LE_AUTO_VERSION="0.36.0"
BASENAME=$(basename $0)
USAGE="Usage: $BASENAME [OPTIONS]
A self-updating wrapper script for the Certbot ACME client. When run, updates
@@ -1314,18 +1314,18 @@ letsencrypt==0.7.0 \
--hash=sha256:105a5fb107e45bcd0722eb89696986dcf5f08a86a321d6aef25a0c7c63375ade \
--hash=sha256:c36e532c486a7e92155ee09da54b436a3c420813ec1c590b98f635d924720de9
certbot==0.34.2 \
--hash=sha256:238bb1c100d0d17f0bda147387435c307e128b2f1a8339eb85cef7fb99909cb9 \
--hash=sha256:30732ddcb10ccd8b8410c515a76ae0429ad907130b8bf8caa58b73826d0ec9bb
acme==0.34.2 \
--hash=sha256:f2b3cec09270499211fa54e588571bac67a015d375a4806c6c23431c91fdf7e3 \
--hash=sha256:bd5b0dfcbca82a2be6fe12e7c7939721d6b3dacb7d8529ba519b56274060dc2a
certbot-apache==0.34.2 \
--hash=sha256:c9cbbc2499084361a741f865a6f9af717296d5b0fec5fdd45819df2a56014a63 \
--hash=sha256:74c302b2099c9906dd4783cd57f546393235902dcc179302a2da280d83e72b96
certbot-nginx==0.34.2 \
--hash=sha256:4883f638e703b8fbab0ec15df6d9f0ebbb3cd81e221521b65ca27cdc9e9d070d \
--hash=sha256:13d58e40097f6b36e323752c146dc90d06120dc69a313e141476e0bc1a74ee17
certbot==0.36.0 \
--hash=sha256:486cee6c4861762fe4a94b4f44f7d227034d026d1a8d7ba2911ef4e86a737613 \
--hash=sha256:bf6745b823644cdca8461150455aeb67d417f87f80b9ec774c716e9587ef20a2
acme==0.36.0 \
--hash=sha256:5570c8e87383fbc733224fd0f7d164313b67dd9c21deafe9ddc8e769441f0c86 \
--hash=sha256:0461ee3c882d865e98e624561843dc135fa1a1412b15603d7ebfbb392de6a668
certbot-apache==0.36.0 \
--hash=sha256:2537f7fb67a38b6d1ed5ee79f6a799090ca609695ac3799bb840b2fb677ac98d \
--hash=sha256:458d20a3e9e8a88563d3deb0bbe38752bd2b80100f0e5854e4069390c1b4e5cd
certbot-nginx==0.36.0 \
--hash=sha256:4303b54adf2030671c54bb3964c1f43aec0f677045e0cdb6d4fb931268d08310 \
--hash=sha256:4c34e6114dd8204b6667f101579dd9ab2b38fef0dd5a15702585edcb2aefb322
UNLIKELY_EOF
# -------------------------------------------------------------------------

View File

@@ -1,12 +1,10 @@
"""Module to handle the context of integration tests."""
import os
import shutil
import subprocess
import sys
import tempfile
from distutils.version import LooseVersion
from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils import misc, certbot_call
class IntegrationTestsContext(object):
@@ -30,11 +28,6 @@ class IntegrationTestsContext(object):
# is listening on challtestsrv_port.
self.challtestsrv_port = acme_xdist['challtestsrv_port']
# Certbot version does not depend on the test context. But getting its value requires
# calling certbot from a subprocess. Since it will be called a lot of times through
# _common_test_no_force_renew, we cache its value as a member of the fixture context.
self.certbot_version = misc.get_certbot_version()
self.workspace = tempfile.mkdtemp()
self.config_dir = os.path.join(self.workspace, 'conf')
self.hook_probe = tempfile.mkstemp(dir=self.workspace)[1]
@@ -60,71 +53,18 @@ class IntegrationTestsContext(object):
"""Cleanup the integration test context."""
shutil.rmtree(self.workspace)
def _common_test_no_force_renew(self, args):
"""
Base command to execute certbot in a distributed integration test context,
not renewing certificates by default.
"""
new_environ = os.environ.copy()
new_environ['TMPDIR'] = self.workspace
additional_args = []
if self.certbot_version >= LooseVersion('0.30.0'):
additional_args.append('--no-random-sleep-on-renew')
command = [
'certbot',
'--server', self.directory_url,
'--no-verify-ssl',
'--http-01-port', str(self.http_01_port),
'--https-port', str(self.tls_alpn_01_port),
'--manual-public-ip-logging-ok',
'--config-dir', self.config_dir,
'--work-dir', os.path.join(self.workspace, 'work'),
'--logs-dir', os.path.join(self.workspace, 'logs'),
'--non-interactive',
'--no-redirect',
'--agree-tos',
'--register-unsafely-without-email',
'--debug',
'-vv'
]
command.extend(args)
command.extend(additional_args)
print('Invoke command:\n{0}'.format(subprocess.list2cmdline(command)))
return subprocess.check_output(command, universal_newlines=True,
cwd=self.workspace, env=new_environ)
def _common_test(self, args):
"""
Base command to execute certbot in a distributed integration test context,
renewing certificates by default.
"""
command = ['--renew-by-default']
command.extend(args)
return self._common_test_no_force_renew(command)
def certbot_no_force_renew(self, args):
def certbot(self, args, force_renew=True):
"""
Execute certbot with given args, not renewing certificates by default.
:param args: args to pass to certbot
:param force_renew: set to False to not renew by default
:return: output of certbot execution
"""
command = ['--authenticator', 'standalone', '--installer', 'null']
command.extend(args)
return self._common_test_no_force_renew(command)
def certbot(self, args):
"""
Execute certbot with given args, renewing certificates by default.
:param args: args to pass to certbot
:return: output of certbot execution
"""
command = ['--renew-by-default']
command.extend(args)
return self.certbot_no_force_renew(command)
return certbot_call.certbot_test(
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
self.config_dir, self.workspace, force_renew=force_renew)
def get_domain(self, subdomain='le'):
"""

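With the refactored context, a test drives renewal behaviour through the single certbot() method. A minimal sketch, assuming `context` is the pytest fixture built on IntegrationTestsContext shown above (the domain and subcommands are illustrative only):

def test_issue_then_soft_renew(context):
    # Issue a certificate; force_renew defaults to True.
    domain = context.get_domain()
    context.certbot(['certonly', '-d', domain])
    # Ask for a renewal without forcing it, honouring the configured renewal window.
    context.certbot(['renew'], force_renew=False)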
View File

@@ -229,8 +229,8 @@ def test_graceful_renew_it_is_not_time(context):
assert_cert_count_for_lineage(context.config_dir, certname, 1)
context.certbot_no_force_renew([
'renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)])
context.certbot(['renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)],
force_renew=False)
assert_cert_count_for_lineage(context.config_dir, certname, 1)
with pytest.raises(AssertionError):
@@ -250,8 +250,8 @@ def test_graceful_renew_it_is_time(context):
with open(join(context.config_dir, 'renewal', '{0}.conf'.format(certname)), 'w') as file:
file.writelines(lines)
context.certbot_no_force_renew([
'renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)])
context.certbot(['renew', '--deploy-hook', 'echo deploy >> "{0}"'.format(context.hook_probe)],
force_renew=False)
assert_cert_count_for_lineage(context.config_dir, certname, 2)
assert_hook_execution(context.hook_probe, 'deploy')

View File

@@ -68,17 +68,18 @@ def _setup_primary_node(config):
:param config: Configuration of the pytest primary node
"""
# Check for runtime compatibility: some tools are required to be available in PATH
try:
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError('Error: docker is required in PATH to launch the integration tests, '
'but is not installed or not available for current user.')
if 'boulder' in config.option.acme_server:
try:
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError('Error: docker is required in PATH to launch the integration tests on '
'boulder, but is not installed or not available for current user.')
try:
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError('Error: docker-compose is required in PATH to launch the integration tests, '
'but is not installed or not available for current user.')
try:
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError('Error: docker-compose is required in PATH to launch the integration tests, '
'but is not installed or not available for current user.')
# Parameter numprocesses is added to option by pytest-xdist
workers = ['primary'] if not config.option.numprocesses\
@@ -86,7 +87,9 @@ def _setup_primary_node(config):
# By calling setup_acme_server we ensure that all necessary acme server instances will be
# fully started. This runtime is reflected by the acme_xdist returned.
acme_xdist = acme_lib.setup_acme_server(config.option.acme_server, workers)
print('ACME xdist config:\n{0}'.format(acme_xdist))
acme_server = acme_lib.ACMEServer(config.option.acme_server, workers)
config.add_cleanup(acme_server.stop)
print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
acme_server.start()
return acme_xdist
return acme_server.acme_xdist

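Reassembled from the hunks above, the primary-node setup now amounts to roughly the following sketch; `acme_lib` is assumed to be the import alias for certbot_integration_tests.utils.acme_server used in this conftest, and `config` is pytest's primary-node config object:

from certbot_integration_tests.utils import acme_server as acme_lib

def _setup_primary_node_sketch(config, workers):
    # Build the ACME server stack and register its teardown with pytest before starting,
    # so resources are cleaned up even if startup fails part way through.
    acme_server = acme_lib.ACMEServer(config.option.acme_server, workers)
    config.add_cleanup(acme_server.stop)
    print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
    acme_server.start()
    return acme_server.acme_xdist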
View File

@@ -2,7 +2,7 @@ import os
import subprocess
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils import misc, certbot_call
from certbot_integration_tests.nginx_tests import nginx_config as config
@@ -33,11 +33,14 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
"""
Main command to execute certbot using the nginx plugin.
:param list args: list of arguments to pass to certbot
:param bool force_renew: set to False to not renew by default
"""
command = ['--authenticator', 'nginx', '--installer', 'nginx',
'--nginx-server-root', self.nginx_root]
command.extend(args)
return self._common_test(command)
return certbot_call.certbot_test(
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
self.config_dir, self.workspace, force_renew=True)
def _start_nginx(self, default_server):
self.nginx_config = config.construct_nginx_config(

View File

@@ -21,9 +21,9 @@ def construct_nginx_config(nginx_root, nginx_webroot, http_port, https_port, oth
:rtype: str
"""
key_path = key_path if key_path \
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/nginx_key.pem')
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/key.pem')
cert_path = cert_path if cert_path \
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/nginx_cert.pem')
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/cert.pem')
return '''\
# This error log will be written regardless of server scope error_log
# definitions, so we have to set this here in the main scope.

View File

@@ -30,6 +30,7 @@ def context(request):
('nginx6.{0}.wtf,nginx7.{0}.wtf', ['--preferred-challenges', 'http'], {'default_server': False}),
], indirect=['context'])
def test_certificate_deployment(certname_pattern, params, context):
# type: (str, list, nginx_context.IntegrationTestsContext) -> None
"""
Test various scenarios to deploy a certificate to nginx using certbot.
"""
@@ -45,10 +46,7 @@ def test_certificate_deployment(certname_pattern, params, context):
assert server_cert == certbot_cert
command = ['--authenticator', 'nginx', '--installer', 'nginx',
'--nginx-server-root', context.nginx_root,
'rollback', '--checkpoints', '1']
context._common_test_no_force_renew(command)
context.certbot_test_nginx(['rollback', '--checkpoints', '1'])
with open(context.nginx_config_path, 'r') as file_h:
current_nginx_config = file_h.read()

certbot-ci/certbot_integration_tests/utils/acme_server.py Normal file → Executable file
View File

@@ -1,197 +1,212 @@
#!/usr/bin/env python
"""Module to setup an ACME CA server environment able to run multiple tests in parallel"""
from __future__ import print_function
import errno
import json
import tempfile
import atexit
import time
import os
import subprocess
import shutil
import stat
import sys
from os.path import join
import requests
import json
import yaml
from certbot_integration_tests.utils import misc
# These ports are set implicitly in the docker-compose.yml files of Boulder/Pebble.
CHALLTESTSRV_PORT = 8055
HTTP_01_PORT = 5002
from certbot_integration_tests.utils import misc, proxy, pebble_artifacts
from certbot_integration_tests.utils.constants import *
def setup_acme_server(acme_server, nodes):
class ACMEServer(object):
"""
This method will setup an ACME CA server and an HTTP reverse proxy instance, to allow parallel
execution of integration tests against the unique http-01 port expected by the ACME CA server.
Instances are properly closed and cleaned when the Python process exits using atexit.
ACMEServer configures and handles the lifecycle of an ACME CA server and an HTTP reverse proxy
instance, to allow parallel execution of integration tests against the unique http-01 port
expected by the ACME CA server.
Typically all pytest integration tests will be executed in this context.
This method returns an object describing ports and directory url to use for each pytest node
with the relevant pytest xdist node.
:param str acme_server: the type of acme server used (boulder-v1, boulder-v2 or pebble)
:param str[] nodes: list of node names that will be setup by pytest xdist
:return: a dict describing the challenge ports that have been setup for the nodes
:rtype: dict
ACMEServer gives access to the acme_xdist parameter, listing the ports and directory URL to use
for each pytest node. It also exposes start and stop methods to start the stack and stop it
with proper resource cleanup.
ACMEServer is also a context manager, and so can be used to ensure the ACME server is
started/stopped upon context enter/exit.
"""
acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
acme_xdist = _construct_acme_xdist(acme_server, nodes)
workspace = _construct_workspace(acme_type)
def __init__(self, acme_server, nodes, http_proxy=True, stdout=False):
"""
Create an ACMEServer instance.
:param str acme_server: the type of acme server used (boulder-v1, boulder-v2 or pebble)
:param list nodes: list of node names that will be set up by pytest-xdist
:param bool http_proxy: if False, do not start the HTTP proxy
:param bool stdout: if True, stream the subprocesses' stdout to standard output
"""
self._construct_acme_xdist(acme_server, nodes)
_prepare_traefik_proxy(workspace, acme_xdist)
_prepare_acme_server(workspace, acme_type, acme_xdist)
self._acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
self._proxy = http_proxy
self._workspace = tempfile.mkdtemp()
self._processes = []
self._stdout = sys.stdout if stdout else open(os.devnull, 'w')
return acme_xdist
def start(self):
"""Start the test stack"""
try:
if self._proxy:
self._prepare_http_proxy()
if self._acme_type == 'pebble':
self._prepare_pebble_server()
if self._acme_type == 'boulder':
self._prepare_boulder_server()
except BaseException as e:
self.stop()
raise e
def stop(self):
"""Stop the test stack, and clean its resources"""
print('=> Tear down the test infrastructure...')
try:
for process in self._processes:
try:
process.terminate()
except OSError as e:
# The process may not have started yet, so there is no PID and terminate fails.
# In that case the process never ran, and the situation is acceptable.
if e.errno != errno.ESRCH:
raise
for process in self._processes:
process.wait()
def _construct_acme_xdist(acme_server, nodes):
"""Generate and return the acme_xdist dict"""
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
if os.path.exists(os.path.join(self._workspace, 'boulder')):
# Boulder docker generates build artifacts owned by root with 0o744 permissions.
# If the ACME server was started by a normal user with access to the Docker daemon,
# that user will not be able to delete these artifacts from the host.
# We need to do it through a Docker container.
process = self._launch_process(['docker', 'run', '--rm', '-v',
'{0}:/workspace'.format(self._workspace),
'alpine', 'rm', '-rf', '/workspace/boulder'])
process.wait()
finally:
shutil.rmtree(self._workspace)
if self._stdout != sys.stdout:
self._stdout.close()
print('=> Test infrastructure stopped and cleaned up.')
# Directory and ACME port are set implicitly in the docker-compose.yml files of Boulder/Pebble.
if acme_server == 'pebble':
acme_xdist['directory_url'] = 'https://localhost:14000/dir'
else: # boulder
port = 4001 if acme_server == 'boulder-v2' else 4000
acme_xdist['directory_url'] = 'http://localhost:{0}/directory'.format(port)
def __enter__(self):
self.start()
return self.acme_xdist
acme_xdist['http_port'] = {node: port for (node, port)
in zip(nodes, range(5200, 5200 + len(nodes)))}
acme_xdist['https_port'] = {node: port for (node, port)
in zip(nodes, range(5100, 5100 + len(nodes)))}
acme_xdist['other_port'] = {node: port for (node, port)
in zip(nodes, range(5300, 5300 + len(nodes)))}
def __exit__(self, exc_type, exc_val, exc_tb):
self.stop()
return acme_xdist
def _construct_acme_xdist(self, acme_server, nodes):
"""Generate and return the acme_xdist dict"""
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
# Directory and ACME port are set implicitly in the docker-compose.yml files of Boulder/Pebble.
if acme_server == 'pebble':
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
else: # boulder
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL \
if acme_server == 'boulder-v2' else BOULDER_V1_DIRECTORY_URL
def _construct_workspace(acme_type):
"""Create a temporary workspace for integration tests stack"""
workspace = tempfile.mkdtemp()
acme_xdist['http_port'] = {node: port for (node, port)
in zip(nodes, range(5200, 5200 + len(nodes)))}
acme_xdist['https_port'] = {node: port for (node, port)
in zip(nodes, range(5100, 5100 + len(nodes)))}
acme_xdist['other_port'] = {node: port for (node, port)
in zip(nodes, range(5300, 5300 + len(nodes)))}
def cleanup():
"""Cleanup function to call that will teardown relevant dockers and their configuration."""
for instance in [acme_type, 'traefik']:
print('=> Tear down the {0} instance...'.format(instance))
instance_path = join(workspace, instance)
try:
if os.path.isfile(join(instance_path, 'docker-compose.yml')):
_launch_command(['docker-compose', 'down'], cwd=instance_path)
except subprocess.CalledProcessError:
pass
print('=> Finished tear down of {0} instance.'.format(acme_type))
self.acme_xdist = acme_xdist
shutil.rmtree(workspace)
def _prepare_pebble_server(self):
"""Configure and launch the Pebble server"""
print('=> Starting pebble instance deployment...')
pebble_path, challtestsrv_path, pebble_config_path = pebble_artifacts.fetch(self._workspace)
# Here with atexit we ensure that clean function is called no matter what.
atexit.register(cleanup)
# Configure Pebble at full speed (PEBBLE_VA_NOSLEEP=1) and without randomly refusing
# valid nonces (PEBBLE_WFE_NONCEREJECT=0) to have a stable test environment.
environ = os.environ.copy()
environ['PEBBLE_VA_NOSLEEP'] = '1'
environ['PEBBLE_WFE_NONCEREJECT'] = '0'
return workspace
self._launch_process(
[pebble_path, '-config', pebble_config_path, '-dnsserver', '127.0.0.1:8053'],
env=environ)
def _prepare_acme_server(workspace, acme_type, acme_xdist):
"""Configure and launch the ACME server, Boulder or Pebble"""
print('=> Starting {0} instance deployment...'.format(acme_type))
instance_path = join(workspace, acme_type)
try:
# Load Boulder/Pebble from git, that includes a docker-compose.yml ready for production.
_launch_command(['git', 'clone', 'https://github.com/letsencrypt/{0}'.format(acme_type),
'--single-branch', '--depth=1', instance_path])
if acme_type == 'boulder':
# Allow Boulder to ignore usual limit rate policies, useful for tests.
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
join(instance_path, 'test/rate-limit-policies.yml'))
if acme_type == 'pebble':
# Configure Pebble at full speed (PEBBLE_VA_NOSLEEP=1) and not randomly refusing valid
# nonce (PEBBLE_WFE_NONCEREJECT=0) to have a stable test environment.
with open(os.path.join(instance_path, 'docker-compose.yml'), 'r') as file_handler:
config = yaml.load(file_handler.read())
config['services']['pebble'].setdefault('environment', [])\
.extend(['PEBBLE_VA_NOSLEEP=1', 'PEBBLE_WFE_NONCEREJECT=0'])
with open(os.path.join(instance_path, 'docker-compose.yml'), 'w') as file_handler:
file_handler.write(yaml.dump(config))
# Launch the ACME CA server.
_launch_command(['docker-compose', 'up', '--force-recreate', '-d'], cwd=instance_path)
self._launch_process(
[challtestsrv_path, '-management', ':{0}'.format(CHALLTESTSRV_PORT), '-defaultIPv6', '""',
'-defaultIPv4', '127.0.0.1', '-http01', '""', '-tlsalpn01', '""', '-https01', '""'])
# Wait for the ACME CA server to be up.
print('=> Waiting for {0} instance to respond...'.format(acme_type))
misc.check_until_timeout(acme_xdist['directory_url'])
print('=> Waiting for pebble instance to respond...')
misc.check_until_timeout(self.acme_xdist['directory_url'])
print('=> Finished pebble instance deployment.')
def _prepare_boulder_server(self):
"""Configure and launch the Boulder server"""
print('=> Starting boulder instance deployment...')
instance_path = join(self._workspace, 'boulder')
# Load Boulder from git; the clone includes a docker-compose.yml ready for production.
process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
'--single-branch', '--depth=1', instance_path])
process.wait()
# Allow Boulder to ignore the usual rate limit policies, useful for tests.
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
join(instance_path, 'test/rate-limit-policies.yml'))
# Launch the Boulder server
self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
# Wait for the ACME CA server to be up.
print('=> Waiting for boulder instance to respond...')
misc.check_until_timeout(self.acme_xdist['directory_url'], attempts=240)
# Configure challtestsrv to answer any A record request with the IP of the Docker host.
acme_subnet = '10.77.77' if acme_type == 'boulder' else '10.30.50'
response = requests.post('http://localhost:{0}/set-default-ipv4'
.format(acme_xdist['challtestsrv_port']),
json={'ip': '{0}.1'.format(acme_subnet)})
response = requests.post('http://localhost:{0}/set-default-ipv4'.format(CHALLTESTSRV_PORT),
json={'ip': '10.77.77.1'})
response.raise_for_status()
print('=> Finished {0} instance deployment.'.format(acme_type))
except BaseException:
print('Error while setting up {0} instance.'.format(acme_type))
raise
print('=> Finished boulder instance deployment.')
def _prepare_http_proxy(self):
"""Configure and launch an HTTP proxy"""
print('=> Configuring the HTTP proxy...')
mapping = {r'.+\.{0}\.wtf'.format(node): 'http://127.0.0.1:{0}'.format(port)
for node, port in self.acme_xdist['http_port'].items()}
command = [sys.executable, proxy.__file__, str(HTTP_01_PORT), json.dumps(mapping)]
self._launch_process(command)
print('=> Finished configuring the HTTP proxy.')
def _launch_process(self, command, cwd=os.getcwd(), env=None):
"""Launch silently an subprocess OS command"""
if not env:
env = os.environ
process = subprocess.Popen(command, stdout=self._stdout, stderr=subprocess.STDOUT, cwd=cwd, env=env)
self._processes.append(process)
return process
def _prepare_traefik_proxy(workspace, acme_xdist):
"""Configure and launch Traefik, the HTTP reverse proxy"""
print('=> Starting traefik instance deployment...')
instance_path = join(workspace, 'traefik')
traefik_subnet = '10.33.33'
traefik_api_port = 8056
def main():
args = sys.argv[1:]
server_type = args[0] if args else 'pebble'
possible_values = ('pebble', 'boulder-v1', 'boulder-v2')
if server_type not in possible_values:
raise ValueError('Invalid server value {0}, should be one of {1}'
.format(server_type, possible_values))
acme_server = ACMEServer(server_type, [], http_proxy=False, stdout=True)
try:
os.mkdir(instance_path)
with acme_server as acme_xdist:
print('--> Instance of {0} is running, directory URL is {1}'
.format(server_type, acme_xdist['directory_url']))
print('--> Press CTRL+C to stop the ACME server.')
with open(join(instance_path, 'docker-compose.yml'), 'w') as file_h:
file_h.write('''\
version: '3'
services:
traefik:
image: traefik
command: --api --rest
ports:
- {http_01_port}:80
- {traefik_api_port}:8080
networks:
traefiknet:
ipv4_address: {traefik_subnet}.2
networks:
traefiknet:
ipam:
config:
- subnet: {traefik_subnet}.0/24
'''.format(traefik_subnet=traefik_subnet,
traefik_api_port=traefik_api_port,
http_01_port=HTTP_01_PORT))
_launch_command(['docker-compose', 'up', '--force-recreate', '-d'], cwd=instance_path)
misc.check_until_timeout('http://localhost:{0}/api'.format(traefik_api_port))
config = {
'backends': {
node: {
'servers': {node: {'url': 'http://{0}.1:{1}'.format(traefik_subnet, port)}}
} for node, port in acme_xdist['http_port'].items()
},
'frontends': {
node: {
'backend': node, 'passHostHeader': True,
'routes': {node: {'rule': 'HostRegexp: {{subdomain:.+}}.{0}.wtf'.format(node)}}
} for node in acme_xdist['http_port'].keys()
}
}
response = requests.put('http://localhost:{0}/api/providers/rest'.format(traefik_api_port),
data=json.dumps(config))
response.raise_for_status()
print('=> Finished traefik instance deployment.')
except BaseException:
print('Error while setting up traefik instance.')
raise
while True:
time.sleep(3600)
except KeyboardInterrupt:
pass
def _launch_command(command, cwd=os.getcwd()):
"""Launch silently an OS command, output will be displayed in case of failure"""
try:
subprocess.check_output(command, stderr=subprocess.STDOUT, cwd=cwd, universal_newlines=True)
except subprocess.CalledProcessError as e:
sys.stderr.write(e.output)
raise
if __name__ == '__main__':
main()

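Because ACMEServer is now a context manager, a throwaway Pebble instance can also be driven from a script. A minimal sketch, assuming the module path above and no HTTP proxy:

from certbot_integration_tests.utils.acme_server import ACMEServer

# Start Pebble with no reverse proxy, streaming subprocess output to stdout.
with ACMEServer('pebble', [], http_proxy=False, stdout=True) as acme_xdist:
    # acme_xdist carries the directory URL and per-node ports; stop() runs on exit.
    print(acme_xdist['directory_url'])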
View File

@@ -0,0 +1,98 @@
#!/usr/bin/env python
"""Module to call certbot in test mode"""
from __future__ import absolute_import
from distutils.version import LooseVersion
import subprocess
import sys
import os
from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils.constants import *
def certbot_test(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, force_renew=True):
"""
Invoke the certbot executable available in PATH in a test context for the given args.
The test context consists of running certbot in debug mode, with various flags suitable
for tests (e.g. no SSL verification, customizable ACME challenge ports and config directory).
This command captures stdout and returns it to the caller.
:param list certbot_args: the arguments to pass to the certbot executable
:param str directory_url: URL of the ACME directory server to use
:param int http_01_port: port for the HTTP-01 challenges
:param int tls_alpn_01_port: port for the TLS-ALPN-01 challenges
:param str config_dir: certbot configuration directory to use
:param str workspace: certbot current directory to use
:param bool force_renew: set to False to not force renewal of existing certificates (default: True)
:return: stdout as string
:rtype: str
"""
command, env = _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, force_renew)
return subprocess.check_output(command, universal_newlines=True, cwd=workspace, env=env)
def _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, force_renew):
new_environ = os.environ.copy()
new_environ['TMPDIR'] = workspace
additional_args = []
if misc.get_certbot_version() >= LooseVersion('0.30.0'):
additional_args.append('--no-random-sleep-on-renew')
if force_renew:
additional_args.append('--renew-by-default')
command = [
'certbot',
'--server', directory_url,
'--no-verify-ssl',
'--http-01-port', str(http_01_port),
'--https-port', str(tls_alpn_01_port),
'--manual-public-ip-logging-ok',
'--config-dir', config_dir,
'--work-dir', os.path.join(workspace, 'work'),
'--logs-dir', os.path.join(workspace, 'logs'),
'--non-interactive',
'--no-redirect',
'--agree-tos',
'--register-unsafely-without-email',
'--debug',
'-vv'
]
command.extend(certbot_args)
command.extend(additional_args)
print('--> Invoke command:\n=====\n{0}\n====='.format(subprocess.list2cmdline(command)))
return command, new_environ
def main():
args = sys.argv[1:]
# Default config is pebble
directory_url = os.environ.get('SERVER', PEBBLE_DIRECTORY_URL)
http_01_port = int(os.environ.get('HTTP_01_PORT', HTTP_01_PORT))
tls_alpn_01_port = int(os.environ.get('TLS_ALPN_01_PORT', TLS_ALPN_01_PORT))
# Execution of certbot in a self-contained workspace
workspace = os.environ.get('WORKSPACE', os.path.join(os.getcwd(), '.certbot_test_workspace'))
if not os.path.exists(workspace):
print('--> Creating a workspace for certbot_test: {0}'.format(workspace))
os.mkdir(workspace)
else:
print('--> Using an existing workspace for certbot_test: {0}'.format(workspace))
config_dir = os.path.join(workspace, 'conf')
# Invoke certbot in test mode, without capturing output, so users see the outcome directly.
command, env = _prepare_args_env(args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, True)
subprocess.check_call(command, universal_newlines=True, cwd=workspace, env=env)
if __name__ == '__main__':
main()

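For a quick smoke test outside pytest, the helper can be called directly. A sketch assuming the Pebble defaults from the constants module and a throwaway workspace ('register' is just an illustrative subcommand):

import os
import tempfile

from certbot_integration_tests.utils import certbot_call
from certbot_integration_tests.utils.constants import (
    PEBBLE_DIRECTORY_URL, HTTP_01_PORT, TLS_ALPN_01_PORT)

workspace = tempfile.mkdtemp()
output = certbot_call.certbot_test(
    ['register'], PEBBLE_DIRECTORY_URL, HTTP_01_PORT, TLS_ALPN_01_PORT,
    os.path.join(workspace, 'conf'), workspace, force_renew=False)
print(output)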
View File

@@ -0,0 +1,7 @@
"""Some useful constants to use throughout certbot-ci integration tests"""
HTTP_01_PORT = 5002
TLS_ALPN_01_PORT = 5001
CHALLTESTSRV_PORT = 8055
BOULDER_V1_DIRECTORY_URL = 'http://localhost:4000/directory'
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'

View File

@@ -23,19 +23,18 @@ from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import Encoding, PrivateFormat, NoEncryption
from six.moves import socketserver, SimpleHTTPServer
from acme import crypto_util
RSA_KEY_TYPE = 'rsa'
ECDSA_KEY_TYPE = 'ecdsa'
def check_until_timeout(url):
def check_until_timeout(url, attempts=30):
"""
Wait and block until given url responds with status 200, or raise an exception
after 150 attempts.
after the specified number of attempts.
:param str url: the URL to test
:raise ValueError: exception raised after 150 unsuccessful attempts to reach the URL
:param int attempts: the number of times to try to connect to the URL
:raise ValueError: exception raised if unable to reach the URL
"""
try:
import urllib3
@@ -45,7 +44,7 @@ def check_until_timeout(url):
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
for _ in range(0, 150):
for _ in range(attempts):
time.sleep(1)
try:
if requests.get(url, verify=False).status_code == 200:
@@ -53,7 +52,7 @@ def check_until_timeout(url):
except requests.exceptions.ConnectionError:
pass
raise ValueError('Error, url did not respond after 150 attempts: {0}'.format(url))
raise ValueError('Error, url did not respond after {0} attempts: {1}'.format(attempts, url))
class GracefulTCPServer(socketserver.TCPServer):
@@ -250,13 +249,20 @@ def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
else:
raise ValueError('Invalid key type: {0}'.format(key_type))
key_bytes = crypto.dump_privatekey(crypto.FILETYPE_PEM, key)
with open(key_path, 'wb') as file:
file.write(key_bytes)
with open(key_path, 'wb') as file_h:
file_h.write(crypto.dump_privatekey(crypto.FILETYPE_PEM, key))
csr_bytes = crypto_util.make_csr(key_bytes, domains)
with open(csr_path, 'wb') as file:
file.write(csr_bytes)
req = crypto.X509Req()
san = ', '.join(['DNS:{0}'.format(item) for item in domains])
san_constraint = crypto.X509Extension(b'subjectAltName', False, san.encode('utf-8'))
req.add_extensions([san_constraint])
req.set_pubkey(key)
req.set_version(2)
req.sign(key, 'sha256')
with open(csr_path, 'wb') as file_h:
file_h.write(crypto.dump_certificate_request(crypto.FILETYPE_ASN1, req))
def read_certificate(cert_path):

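The new attempts parameter lets callers wait longer for slow servers (the Boulder setup above uses 240). A short sketch against the Pebble directory URL from the constants module:

from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils.constants import PEBBLE_DIRECTORY_URL

# Poll once per second and give up after 60 attempts instead of the default 30.
misc.check_until_timeout(PEBBLE_DIRECTORY_URL, attempts=60)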
View File

@@ -0,0 +1,52 @@
import json
import platform
import os
import stat
import pkg_resources
import requests
PEBBLE_VERSION = 'v2.1.0'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')
def fetch(workspace):
suffix = '{0}-{1}{2}'.format(platform.system().lower(),
platform.machine().lower().replace('x86_64', 'amd64'),
'.exe' if platform.system() == 'Windows' else '')
pebble_path = _fetch_asset('pebble', suffix)
challtestsrv_path = _fetch_asset('pebble-challtestsrv', suffix)
pebble_config_path = _build_pebble_config(workspace)
return pebble_path, challtestsrv_path, pebble_config_path
def _fetch_asset(asset, suffix):
asset_path = os.path.join(ASSETS_PATH, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
if not os.path.exists(asset_path):
asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
.format(PEBBLE_VERSION, asset, suffix))
response = requests.get(asset_url)
response.raise_for_status()
with open(asset_path, 'wb') as file_h:
file_h.write(response.content)
os.chmod(asset_path, os.stat(asset_path).st_mode | stat.S_IEXEC)
return asset_path
def _build_pebble_config(workspace):
config_path = os.path.join(workspace, 'pebble-config.json')
with open(config_path, 'w') as file_h:
file_h.write(json.dumps({
'pebble': {
'listenAddress': '0.0.0.0:14000',
'certificate': os.path.join(ASSETS_PATH, 'cert.pem'),
'privateKey': os.path.join(ASSETS_PATH, 'key.pem'),
'httpPort': 5002,
'tlsPort': 5001,
},
}))
return config_path

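fetch() downloads the platform-specific Pebble binaries on first use, caches them under the package assets directory, and writes a pebble-config.json into the given workspace. A usage sketch:

import tempfile

from certbot_integration_tests.utils import pebble_artifacts

workspace = tempfile.mkdtemp()
pebble_path, challtestsrv_path, config_path = pebble_artifacts.fetch(workspace)
print(pebble_path)   # executable pebble binary for this platform
print(config_path)   # pebble-config.json written into the workspace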
View File

@@ -0,0 +1,36 @@
#!/usr/bin/env python
import json
import sys
import re
import requests
from six.moves import BaseHTTPServer
from certbot_integration_tests.utils.misc import GracefulTCPServer
def _create_proxy(mapping):
class ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
def do_GET(self):
headers = {key.lower(): value for key, value in self.headers.items()}
backend = [backend for pattern, backend in mapping.items()
if re.match(pattern, headers['host'])][0]
response = requests.get(backend + self.path, headers=headers)
self.send_response(response.status_code)
for key, value in response.headers.items():
self.send_header(key, value)
self.end_headers()
self.wfile.write(response.content)
return ProxyHandler
if __name__ == '__main__':
http_port = int(sys.argv[1])
port_mapping = json.loads(sys.argv[2])
httpd = GracefulTCPServer(('', http_port), _create_proxy(port_mapping))
try:
httpd.serve_forever()
except KeyboardInterrupt:
pass

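The proxy script is meant to be launched as a subprocess with a listening port and a JSON mapping of host regexes to backends, as _prepare_http_proxy does above. A sketch with a single illustrative node name and port:

import json
import subprocess
import sys

from certbot_integration_tests.utils import proxy

# Route any host matching *.gw0.wtf to the HTTP-01 port of node gw0 (values illustrative).
mapping = {r'.+\.gw0\.wtf': 'http://127.0.0.1:5200'}
subprocess.Popen([sys.executable, proxy.__file__, '5002', json.dumps(mapping)])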
View File

@@ -5,7 +5,6 @@ from setuptools import find_packages
version = '0.32.0.dev0'
install_requires = [
'acme',
'coverage',
'cryptography',
'pyopenssl',
@@ -45,4 +44,11 @@ setup(
packages=find_packages(),
include_package_data=True,
install_requires=install_requires,
entry_points={
'console_scripts': [
'certbot_test=certbot_integration_tests.utils.certbot_call:main',
'run_acme_server=certbot_integration_tests.utils.acme_server:main',
],
}
)

View File

@@ -1,4 +1,4 @@
FROM debian:jessie
FROM debian:stretch
MAINTAINER Brad Warren <bmw@eff.org>
# no need to mkdir anything:

View File

@@ -4,7 +4,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
install_requires = [
'certbot',

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-cloudflare
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-cloudflare

View File

@@ -22,7 +22,9 @@ Credentials
Use of this plugin requires a configuration file containing Cloudflare API
credentials, obtained from your Cloudflare
`account page <https://www.cloudflare.com/a/account/my-account>`_.
`account page <https://www.cloudflare.com/a/account/my-account>`_. This plugin
does not currently support Cloudflare's "API Tokens", so please ensure you use
the "Global API Key" for authentication.
.. code-block:: ini
:name: credentials.ini

View File

@@ -10,7 +10,7 @@ from certbot.plugins import dns_common
logger = logging.getLogger(__name__)
ACCOUNT_URL = 'https://www.cloudflare.com/a/account/my-account'
ACCOUNT_URL = 'https://dash.cloudflare.com/profile/api-tokens'
@zope.interface.implementer(interfaces.IAuthenticator)

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-cloudxns
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-cloudxns

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-digitalocean
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-digitalocean

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-dnsimple
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-dnsimple

View File

@@ -3,7 +3,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-dnsmadeeasy
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-dnsmadeeasy

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-gehirn
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-gehirn

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-google
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-google

View File

@@ -274,10 +274,11 @@ class _GoogleClient(object):
raise errors.PluginError('Encountered error finding managed zone: {0}'
.format(e))
if zones:
zone_id = zones[0]['id']
logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
return zone_id
for zone in zones:
zone_id = zone['id']
if 'privateVisibilityConfig' not in zone:
logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
return zone_id
raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'
.format(domain, zone_dns_name_guesses))

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-linode
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-linode

View File

@@ -1,7 +1,7 @@
from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-luadns
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-luadns

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-nsone
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-nsone

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-ovh
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-ovh

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-rfc2136
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-rfc2136

View File

@@ -21,8 +21,8 @@ Credentials
-----------
Use of this plugin requires a configuration file containing the target DNS
server, optional authorative domain and optional port that supports RFC 2136 Dynamic Updates,
the name of the TSIG key, the TSIG key secret itself and the algorithm used if it's
server and optional port that supports RFC 2136 Dynamic Updates, the name
of the TSIG key, the TSIG key secret itself and the algorithm used if it's
different to HMAC-MD5.
.. code-block:: ini
@@ -33,8 +33,6 @@ different to HMAC-MD5.
dns_rfc2136_server = 192.0.2.1
# Target DNS port
dns_rfc2136_port = 53
# Authorative domain (optional, will try to auto-detect if missing)
dns_rfc2136_base_domain = example.com
# TSIG key name
dns_rfc2136_name = keyname.
# TSIG key secret

View File

@@ -79,33 +79,25 @@ class Authenticator(dns_common.DNSAuthenticator):
self._get_rfc2136_client().del_txt_record(validation_name, validation)
def _get_rfc2136_client(self):
key = _RFC2136Key(self.credentials.conf('name'),
self.credentials.conf('secret'),
self.ALGORITHMS.get(self.credentials.conf('algorithm'),
dns.tsig.HMAC_MD5))
return _RFC2136Client(self.credentials.conf('server'),
int(self.credentials.conf('port') or self.PORT),
key,
self.credentials.conf('base-domain'))
self.credentials.conf('name'),
self.credentials.conf('secret'),
self.ALGORITHMS.get(self.credentials.conf('algorithm'),
dns.tsig.HMAC_MD5))
class _RFC2136Key(object):
def __init__(self, name, secret, algorithm):
self.name = name
self.secret = secret
self.algorithm = algorithm
class _RFC2136Client(object):
"""
Encapsulates all communication with the target DNS server.
"""
def __init__(self, server, port, base_domain, key):
def __init__(self, server, port, key_name, key_secret, key_algorithm):
self.server = server
self.port = port
self.keyring = dns.tsigkeyring.from_text({
key.name: key.secret
key_name: key_secret
})
self.algorithm = key.algorithm
self.base_domain = base_domain
self.algorithm = key_algorithm
def add_txt_record(self, record_name, record_content, record_ttl):
"""
@@ -179,33 +171,23 @@ class _RFC2136Client(object):
def _find_domain(self, record_name):
"""
If 'base_domain' option is specified check if the requested domain matches this base domain
and return it. If not explicitly specified find the closest domain with an SOA record for
the given domain name.
Find the closest domain with an SOA record for a given domain name.
:param str record_name: The record name for which to find the base domain.
:param str record_name: The record name for which to find the closest SOA record.
:returns: The domain, if found.
:rtype: str
:raises certbot.errors.PluginError: if no SOA record can be found.
"""
if self.base_domain:
if not record_name.endswith(self.base_domain):
raise errors.PluginError('Requested domain {0} does not match specified base '
'domain {1}.'
.format(record_name, self.base_domain))
else:
return self.base_domain
else:
domain_name_guesses = dns_common.base_domain_name_guesses(record_name)
domain_name_guesses = dns_common.base_domain_name_guesses(record_name)
# Loop through until we find an authoritative SOA record
for guess in domain_name_guesses:
if self._query_soa(guess):
return guess
# Loop through until we find an authoritative SOA record
for guess in domain_name_guesses:
if self._query_soa(guess):
return guess
raise errors.PluginError('Unable to determine base domain for {0} using names: {1}.'
.format(record_name, domain_name_guesses))
raise errors.PluginError('Unable to determine base domain for {0} using names: {1}.'
.format(record_name, domain_name_guesses))
def _query_soa(self, domain_name):
"""

View File
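With the flattened constructor, a client can be wired up by hand roughly as follows (server, key name and secret are placeholders; dns.tsig.HMAC_MD5 matches the plugin default):

import dns.tsig

from certbot_dns_rfc2136.dns_rfc2136 import _RFC2136Client

client = _RFC2136Client('192.0.2.1', 53, 'keyname.', 'dGhlIHNlY3JldA==', dns.tsig.HMAC_MD5)
# Publish and then remove an ACME validation record (values illustrative).
client.add_txt_record('_acme-challenge.example.com', 'validation-token', 60)
client.del_txt_record('_acme-challenge.example.com', 'validation-token')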

@@ -73,12 +73,9 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic
class RFC2136ClientTest(unittest.TestCase):
def setUp(self):
from certbot_dns_rfc2136.dns_rfc2136 import _RFC2136Client, _RFC2136Key
from certbot_dns_rfc2136.dns_rfc2136 import _RFC2136Client
self.rfc2136_client = _RFC2136Client(SERVER,
PORT,
None,
_RFC2136Key(NAME, SECRET, dns.tsig.HMAC_MD5))
self.rfc2136_client = _RFC2136Client(SERVER, PORT, NAME, SECRET, dns.tsig.HMAC_MD5)
@mock.patch("dns.query.tcp")
def test_add_txt_record(self, query_mock):
@@ -165,28 +162,6 @@ class RFC2136ClientTest(unittest.TestCase):
self.rfc2136_client._find_domain,
'foo.bar.'+DOMAIN)
def test_find_domain_with_base(self):
# _query_soa | pylint: disable=protected-access
self.rfc2136_client._query_soa = mock.MagicMock(side_effect=[False, False, True])
self.rfc2136_client.base_domain = 'bar.' + DOMAIN
# _find_domain | pylint: disable=protected-access
domain = self.rfc2136_client._find_domain('foo.bar.' + DOMAIN)
self.assertTrue(domain == 'bar.' + DOMAIN)
def test_find_domain_with_wrong_base(self):
# _query_soa | pylint: disable=protected-access
self.rfc2136_client._query_soa = mock.MagicMock(side_effect=[False, False, True])
self.rfc2136_client.base_domain = 'wrong.' + DOMAIN
self.assertRaises(
errors.PluginError,
# _find_domain | pylint: disable=protected-access
self.rfc2136_client._find_domain,
'foo.bar.' + DOMAIN)
@mock.patch("dns.query.udp")
def test_query_soa_found(self, query_mock):
query_mock.return_value = mock.MagicMock(answer=[mock.MagicMock()], flags=dns.flags.AA)

View File

@@ -2,7 +2,7 @@ from setuptools import setup
from setuptools import find_packages
version = '0.35.0.dev0'
version = '0.37.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.

View File

@@ -1,5 +0,0 @@
FROM certbot/certbot
COPY . src/certbot-dns-route53
RUN pip install --constraint docker_constraints.txt --no-cache-dir --editable src/certbot-dns-route53

View File

@@ -1,3 +1,3 @@
include LICENSE
include LICENSE.txt
include README
recursive-include docs *

Some files were not shown because too many files have changed in this diff.