Compare commits

...

151 Commits

Author SHA1 Message Date
Adrien Ferrand
57a8401369 Realign coverage 2022-01-19 12:16:31 +01:00
Adrien Ferrand
f0f0b4db08 Final typing fixes 2022-01-19 11:47:34 +01:00
Adrien Ferrand
8840618c2d Merge remote-tracking branch 'upstream/master' into apache_typing 2022-01-19 11:17:26 +01:00
Adrien Ferrand
d1097d476f Merge branch 'master' into pr/9071 2022-01-19 11:12:47 +01:00
Adrien Ferrand
2ef0a702c9 Fixing remaining typing issues 2022-01-19 10:59:56 +01:00
kevgrig
afc5be5abe Add wildcard example (#9164)
* Add wildcard example

* Update wildcard example
2022-01-18 23:20:25 +01:00
Adrien Ferrand
16aad35d31 Fully type certbot-nginx module (#9124)
* Work in progress

* Fix type

* Work in progress

* Work in progress

* Work in progress

* Work in progress

* Work in progress

* Oops.

* Fix typing in UnspacedList

* Fix logic

* Finish typing

* List certbot-nginx as fully typed in tox

* Fix lint

* Fix checks

* Organize imports

* Fix typing for Python 3.6

* Fix checks

* Fix lint

* Update certbot-nginx/certbot_nginx/_internal/configurator.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot-nginx/certbot_nginx/_internal/configurator.py

Co-authored-by: alexzorin <alex@zor.io>

* Fix signature of deploy_cert regarding the installer interface

* Update certbot-nginx/certbot_nginx/_internal/obj.py

Co-authored-by: alexzorin <alex@zor.io>

* Fix types

* Update certbot-nginx/certbot_nginx/_internal/parser.py

Co-authored-by: alexzorin <alex@zor.io>

* Precise type

* Precise _coerce possible inputs/outputs

* Fix type

* Update certbot-nginx/certbot_nginx/_internal/http_01.py

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* Fix type

* Remove an undesirable implementation.

* Fix type

Co-authored-by: alexzorin <alex@zor.io>
Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-01-12 16:36:51 -08:00
Mads Jensen
30b066f082 Remove outdated pylint comments (#9167)
* Remove outdated pylint: disable=unused-import annotations.

* remove # pylint: disable=ungrouped-imports annotations.

* Remove single pylint: disable = unused-argument in DeleteIfAppropriateTest.test_opt_in_deletion.
2022-01-09 22:51:06 +01:00
Mads Jensen
7e5e51aeff Use super().__init__ instead of explicitly calling named super-class. (#9166)
* Use super().__init__ instead of explicitly calling named super-class.

* Fix unittest (typo fix).
2022-01-09 22:50:44 +01:00
Mads Jensen
f62eabc94e Fix a bit more 2022-01-05 17:49:49 +01:00
Mads Jensen
b1c5362929 Fixing some errors. 2022-01-05 17:49:49 +01:00
Mads Jensen
2c496138bc Fix some things 2022-01-05 17:49:17 +01:00
Mads Jensen
f962a3c613 Add typing to certbot.apache 2022-01-05 17:49:17 +01:00
Mads Jensen
ed7964b424 Improve assertions in nginx and DNS plugin tests. (#9157)
* Improve assertions in nginx and DNS plugin tests.

* Use assertIs for asserting is True/False.
2022-01-04 23:59:58 +01:00
Adrien Ferrand
97a09dee19 Revert "Remove win2016 from Azure devops pipelines. (#9145)" (#9158)
This reverts commit dc66c87928.
2022-01-04 20:24:24 +11:00
Mads Jensen
a0dbe1e850 Improve assertions in certbot-apache tests. (#9131)
* Improve assertions in certbot-apache tests.

Replacements inspired by flake8-assertive.

* Fix test failures

* assertEqual is not for None :D

* Pass all tests :)
2022-01-03 22:05:21 +01:00
Mads Jensen
eeca208c8f Various clean-ups in certbot-apache. Use f-strings. (#9132)
* Various clean-ups in certbot-apache. Use f-strings.

* Smaller tweaks
2022-01-02 00:27:47 +01:00
alexzorin
00f98fa911 letstest: bump ubuntu groovy to impish (#9155) 2022-01-02 00:23:24 +01:00
Mads Jensen
dc66c87928 Remove win2016 from Azure devops pipelines. (#9145)
From March 2022, support will be removed.
https://github.com/actions/virtual-environments/issues/4312
2022-01-02 00:22:52 +01:00
osirisinferi
93c2852fdb Add show_account subcommand to retrieve account info from ACME server (#9127)
* Fetch and print account contacts from ACME server

* Add tests

* Add changelog entry

* Add account URI and thumbprint output

Only show these items when verbosity > 0

* Add test case for account URI and thumbprint

* Move changelog entry to new placeholder

* Add test for `cb_client.acme` (coverage)

* Address comments

* Update changelog

* Few small word changes

* Add server to error messages

* Remove phone contact parts
2021-12-27 19:12:52 +11:00
osirisinferi
a391a34631 Add appending behaviour of max-log-backups = 0 (#9146) 2021-12-22 08:20:01 +11:00
Brad Warren
1577cd8663 write docs on how to test release script (#9142)
Alexis (rightfully) wasn't sure how to test this when working on https://github.com/certbot/certbot/pull/9076. This PR documents it in the script based on what I wrote at https://github.com/certbot/certbot/pull/8351#issue-715227127 which I reverified.
2021-12-21 09:28:31 -07:00
Adrien Ferrand
89ccbccff0 Fully type all DNS plugins (#9125)
* Add types in all DNS plugins

* Order imports

* Fix type

* Update certbot-dns-route53/certbot_dns_route53/_internal/dns_route53.py

Co-authored-by: alexzorin <alex@zor.io>

* Clean up imports

Co-authored-by: alexzorin <alex@zor.io>
2021-12-14 12:38:14 +11:00
Adrien Ferrand
cb3e1403cd Fully type certbot-compatibility-test (#9133)
* Finish typing the module

* Use cast

* Precise type
2021-12-14 12:14:11 +11:00
alexzorin
3353c0df43 tests: remove Boulder v1 endpoint from certbot-ci and azure (#9140) 2021-12-13 10:42:15 -08:00
Adrien Ferrand
97d9e2c97d Fully type lock_test.py (#9126)
* Type lock_test.py

* Reconfigure tox

* Fix imports
2021-12-13 14:01:31 +11:00
Adrien Ferrand
89cefc177a Fix --help output (#9130) 2021-12-11 12:58:33 +11:00
Brad Warren
8799b108c2 fix macos coverage (#9137) 2021-12-11 12:43:17 +11:00
Brad Warren
dab7864809 Add macOS instructions (#9136)
* add macOS instructions

* add integration test warning
2021-12-11 12:28:18 +11:00
Brad Warren
693c674a7e Merge pull request #9128 from certbot/candidate-1.22.0
Release 1.22.0
2021-12-08 09:42:08 -07:00
Erica Portnoy
c02ead0f11 Bump version to 1.23.0 2021-12-07 14:03:51 -08:00
Erica Portnoy
d5ea9072af Add contents to certbot/CHANGELOG.md for next version 2021-12-07 14:03:51 -08:00
Erica Portnoy
6463a2e22d Release 1.22.0 2021-12-07 14:03:50 -08:00
Erica Portnoy
d6adc4a2d0 Update changelog for 1.22.0 release 2021-12-07 14:02:45 -08:00
Mads Jensen
402f18e039 Apache augeas clean up (#9114)
The `# self.comment = comment` line caught my eye while working on #9071, as did the intermediate variables, which aren't really needed. As a result, I reformatted the code slightly in those places.

* Remove comment in AugeasCommentNode.__init__

* Replace some intermediate variables with return statements in the apache augeas parser.

* more clean-up
2021-12-02 08:45:16 -07:00
Adrien Ferrand
aeb7beb1b1 Fully type certbot-ci module (#9120)
* Fully type certbot-ci module

* Fix lint, focus lint

* Add trailing comma

* Remove unused private function

* Type properly for future usages

* Update certbot-ci/certbot_integration_tests/utils/acme_server.py

Co-authored-by: alexzorin <alex@zor.io>

* Cleanup files

* Fix import

* Fix mypy and lint

Co-authored-by: alexzorin <alex@zor.io>
2021-11-30 08:24:39 +11:00
moratori
0d10a44f4b Added --issuance-timeout command line option (#9056)
* Added --issuance-timeout command line option

* clarify the command line option name and docstring, and add tests

* fix test case for python36

* improved the command line options
2021-11-30 08:17:06 +11:00
Adrien Ferrand
86406ab63a Add type annotations to the certbot package (part 4) (#9087)
* Extract from #9084

* Cast/ignore types during the transition

* Remove useless casts and type ignore directives

* Fix lint

* Fix a cast

* Mandatory typing for certbot packages

* Update certbot/certbot/_internal/plugins/disco.py

Co-authored-by: alexzorin <alex@zor.io>

* Remove unused type import

* Fix iterator type

* Fix type

* Fix types in selection

Co-authored-by: alexzorin <alex@zor.io>
2021-11-26 09:00:03 +11:00
Aaron Gable
7d3a344d43 Update py cryptography to >=2.5.0 (#9110)
* Update py cryptography to >=2.5.0

* Review feedback
2021-11-24 14:46:11 -08:00
Adrien Ferrand
250d7b1542 Add type annotations to the certbot package (part 3) (#9086)
* Extract from #9084

* Cast/ignore types during the transition

* Fix after review

* Fix lint

* Update certbot/certbot/_internal/storage.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/storage.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/main.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/main.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/client.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/client.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/auth_handler.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/auth_handler.py

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/certbot/_internal/auth_handler.py

Co-authored-by: alexzorin <alex@zor.io>

* Remove a cast usage

* Fix import

* Remove now useless cast

* Update certbot/certbot/_internal/client.py

Co-authored-by: alexzorin <alex@zor.io>

Co-authored-by: alexzorin <alex@zor.io>
2021-11-25 07:47:36 +11:00
Adrien Ferrand
19147e1b8c Add type annotations to the certbot package (part 2) (#9085)
* Extract from #9084

* Cast/ignore types during the transition

* Clean up

* Fix assertion

* Update certbot/certbot/display/ops.py

Co-authored-by: alexzorin <alex@zor.io>

* Use sequence

* Improve documentation of "default" in display

* Fix contract

* Fix types

* Fix type

* Fix type

* Update certbot/certbot/display/ops.py

Co-authored-by: alexzorin <alex@zor.io>

Co-authored-by: alexzorin <alex@zor.io>
2021-11-24 18:33:09 +11:00
Brad Warren
d1821b3ad7 Pin back setuptools-rust (#9112)
* pin back setuptools-rust

* make pylint happy

This was taken from https://github.com/certbot/certbot/pull/9073.

* pin back josepy

* Apply lint's code style suggestions

* fix lint again

Co-authored-by: Erica Portnoy <ebportnoy@gmail.com>
2021-11-23 12:35:49 -08:00
Paul Kehrer
267fb94478 Remove use of deprecated verifier with cryptography (#9105)
This was deprecated in version 2.1, and cryptography will be removing it soon. The replacement function is available in all versions of cryptography that certbot supports (2.1+).
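For illustration, a minimal sketch of the API change being referred to, with assumed key and message data (not code from this PR): the deprecated context-style `verifier()` is replaced by a single `verify()` call.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
message = b"example data"
signature = private_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

# Deprecated since cryptography 2.1 (the API being removed):
#   ctx = private_key.public_key().verifier(signature, padding.PKCS1v15(), hashes.SHA256())
#   ctx.update(message)
#   ctx.verify()

# Replacement, available in every cryptography version certbot supports (2.1+);
# raises cryptography.exceptions.InvalidSignature if verification fails.
private_key.public_key().verify(signature, message, padding.PKCS1v15(), hashes.SHA256())
```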
2021-11-23 10:18:22 -08:00
alexzorin
6766786049 Fix Windows webroot crash when multiple domains have the same webroot (#9108)
* Certificate issuing on Windows while having web.config and more than one domain in the request

* add a test

* update changelog

Co-authored-by: Serghei Trufkin <Serghei.Trufkin@Technosoft.md>
2021-11-22 19:00:55 +01:00
alexzorin
d2578e05e7 docs: describe how to modify renewal config (#9014)
* docs: describe how to modify renewal config

* Apply suggestions from code review

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* reword warning about manual modifications

* explain the flags in the --force-renewal command

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2021-11-18 12:14:10 -08:00
alexzorin
2746fc572f webroot: unset existing mime type in web.config (#9092) 2021-11-15 14:35:18 +11:00
Adrien Ferrand
d20e42562c Add type annotations to the certbot package (part 1) (#9084)
* Extract from #9084

* Cast/ignore types during the transition

* Fix after review

* Fix lint
2021-11-12 14:27:46 +11:00
alexzorin
4756b66089 docs: update intersphinx url for certbot project (#9096) 2021-11-11 09:44:59 +01:00
Brad Warren
e8265dbf9c Add Python 3.10 support and tests (#9077)
Fixes https://github.com/certbot/certbot/issues/9058.

The changes to the CI config are equivalent to the ones made in https://github.com/certbot/certbot/pull/8460.

Other than ignoring some warnings raised by botocore, the main additional work that had to be done here was switching away from using `distutils.version.LooseVersion` since the entire `distutils` module was deprecated in Python 3.10. To do that, I took a few different approaches:

* If the version strings being parsed are from Python packages such as Certbot or setuptools, I switched to using [pkg_resources.parse_version](https://setuptools.pypa.io/en/latest/pkg_resources.html#parsing-utilities) from `setuptools`. This functionality has been available since [setuptools 8.0 from 2014](https://setuptools.pypa.io/en/latest/history.html#id865).
* If the version strings being parsed are not from Python packages, I added code equivalent to `distutils.version.LooseVersion` in `certbot.util.parse_loose_version`.
* The code for `CERTBOT_PIP_NO_BINARY` can be completely removed since that variable isn't used or referenced anywhere in this repo.
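As a rough illustration of the version-comparison switch described in the first two bullets above (illustrative version strings; `certbot.util.parse_loose_version` appears only in a comment since its exact behavior isn't shown here):

```python
from pkg_resources import parse_version  # provided by setuptools >= 8.0

# PEP 440 comparison for Python package versions, replacing distutils.version.LooseVersion
# (the entire distutils module is deprecated in Python 3.10).
assert parse_version("1.22.0") > parse_version("1.9.0")        # numeric, not lexicographic
assert parse_version("1.22.0.dev0") < parse_version("1.22.0")  # pre-releases sort first

# For non-package version strings, the PR describes a LooseVersion-equivalent helper,
# certbot.util.parse_loose_version, used along the lines of:
#   parse_loose_version("2.4.29-1ubuntu4") < parse_loose_version("2.4.41")
```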

* add python 3.10 support

* make some version changes

* don't use looseversion in setup.py

* switch to pkg_resources

* deprecate get_strict_version

* fix route53 tests

* remove unused CERTBOT_PIP_NO_BINARY code

* stop using distutils in letstest

* add unit tests

* more changelog entries
2021-11-08 15:55:32 -08:00
orangepizza
b1edda8a65 fix a typo in gen_ss_cert type hint (#9089) 2021-11-07 14:18:15 +11:00
Brad Warren
81d5d2b421 Pin readthedocs deps (#9083)
* pin readthedocs deps

* fix reqs path
2021-11-04 20:35:44 +11:00
alexzorin
8f8dd2824e Merge pull request #9082 from certbot/candidate-1.21.0
Update files from 1.21.0 release
2021-11-04 12:41:05 +11:00
Brad Warren
9740f5428e Bump version to 1.22.0 2021-11-02 14:28:34 -07:00
Brad Warren
91c079ab41 Add contents to certbot/CHANGELOG.md for next version 2021-11-02 14:28:34 -07:00
Brad Warren
200e1f1709 Release 1.21.0 2021-11-02 14:28:33 -07:00
Brad Warren
e501e277b3 Update changelog for 1.21.0 release 2021-11-02 14:27:18 -07:00
Chris Swan
cdbc264bb6 Fix copyright date s/2015-2015/2015/ (#9070) 2021-10-25 12:20:57 -07:00
Adrien Ferrand
a0f22d21ce Add type annotations to the acme project (#9036)
* Start more types

* Second run

* Work in progress

* Types in all acme module

* Various fixes

* Various fixes

* Final fixes

* Disallow untyped defs for acme project

* Fix coverage

* Remove unnecessary type ignore

* Use Mapping instead of Dict as input whenever it is possible

* Update acme/acme/client.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update acme/acme/client.py

Co-authored-by: alexzorin <alex@zor.io>

* Various fixes

* Fix code

* Fix code

* Update acme/acme/client.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update acme/acme/challenges.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update acme/acme/client.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Fix deactivate_registration and simplify signature of update_registration

* Do not leak personal data during account deactivation

* Clean more Dicts

* New fix to not leak contact field in the account deactivation payload.

* Add ignore for python 3.6 type check

* Revert "Add ignore for python 3.6 type check"

This reverts commit da7338137b798e3ace34de15ed12f76ec3cf3888.

* Let's find a smarter way than "type: ignore"

* Update certbot/certbot/_internal/account.py

Co-authored-by: alexzorin <alex@zor.io>

* Fix an annotation

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
Co-authored-by: alexzorin <alex@zor.io>
2021-10-25 09:43:21 +11:00
Adrien Ferrand
94af235713 Generate a web.config file for IIS to properly serve the challenge files in the webroot plugin (#9054)
* Generate a web.config file to properly serve challenge files with IIS

* Fix cleanup, add test

* Fix lint

* Do not overwrite an existing web.config. Delete web.config only when it was created by Certbot and is unmodified.

* Fix lint

* Update certbot/certbot/_internal/plugins/webroot.py

Co-authored-by: alexzorin <alex@zor.io>

* Add log

* Check for POSIX_MODE before web.config deletion attempt.

* Add documentation

* Update certbot/CHANGELOG.md

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/docs/using.rst

Co-authored-by: alexzorin <alex@zor.io>
2021-10-24 08:37:40 +11:00
alexzorin
2375d87831 delete: add a warning about safe deletion (#8949) 2021-10-20 13:57:48 +11:00
Brad Warren
1a698fa235 update packaging docs to mention dl.eff.org (#9068) 2021-10-19 12:10:22 +11:00
Brad Warren
d250d34193 Change PGP keys (#9046)
* automate determining the key

* update packaging docs

* switch to new keys

* add changelog entry

* put keys in changelog
2021-10-14 14:27:15 -07:00
Piotr Kasprzyk
777935c8ed Remove trailing spaces from docs (#9064) 2021-10-12 22:56:24 +02:00
alexzorin
15c2792036 Merge pull request #9057 from certbot/candidate-1.20.0
Update files from 1.20.0 release
2021-10-06 18:54:52 +11:00
Brad Warren
46beb8af84 Bump version to 1.21.0 2021-10-05 06:53:59 -07:00
Brad Warren
aa63688450 Add contents to certbot/CHANGELOG.md for next version 2021-10-05 06:53:59 -07:00
Brad Warren
93f61887be Release 1.20.0 2021-10-05 06:53:57 -07:00
Brad Warren
54475964bd Update changelog for 1.20.0 release 2021-10-05 06:52:55 -07:00
Adrien Ferrand
065df4c9a7 Support Python 3.9 on Windows and package installer on it (#9053)
It seems that all required pre-compiled wheels to install Certbot on Python 3.9 on Windows are present.

This PR upgrades the Windows tests to Python 3.9 and repackages the installer on this version of Python.
2021-10-04 14:20:49 -07:00
Brad Warren
cde3e1fa97 fix typo in error message (#9047) 2021-09-29 10:29:49 -07:00
alexzorin
bb2db252a7 stop using deprecated jose abstractclassmethod (#9045)
The josepy 1.10.0 release deprecated this decorator and [caused the nightly `nopin` test to break](https://dev.azure.com/certbot/certbot/_build/results?buildId=4548&view=logs&j=ce03f7c1-1e3f-5d55-28be-f084e7c62a50&t=597fea95-d44e-53a2-5b71-76ed20bd4dde).
2021-09-28 10:48:50 -07:00
alexzorin
abe23c0e60 missing trailing '?' in non-interactive checklist (#9043) 2021-09-22 11:07:30 -07:00
alexzorin
b0aa064640 dns-rfc2136: use certbot's own is_ipaddress func (#9035)
* dns-rfc2136: use certbot's own is_ipaddress func

* oldest: pin dnspython==1.15.0 (epel8 version)

* inhibit deprecationwarning for dnspython==1.15.0

* dns-rfc2136: declare minimum version of dnspython

* add changelog entry
2021-09-14 07:48:15 +10:00
Adrien Ferrand
bd5f4f2d8a Increase the minimum josepy version and update the oldest constraints. (#9032)
As a follow-up to #9027, this PR increases the minimum version of `josepy` and updates the oldest constraints accordingly.
2021-09-10 16:08:13 -07:00
alexzorin
aea3c7e363 add --no-reuse-key (#9029)
Fixes #9002.
2021-09-10 12:27:53 -07:00
Adrien Ferrand
fc02b10560 Upgrade pinned versions of certbot dependencies (josepy in particular) (#9027)
This PR upgrades the pinned version of the dependencies. Version `1.9.0` of josepy is used so errors related to JWK serialization with EC keys (see https://github.com/certbot/josepy/issues/109) are fixed for Certbot.
2021-09-10 12:26:07 -07:00
Brad Warren
ee190db235 Update oldest pyproject.toml comments (#8999)
* update oldest pyproject.toml comments

* Apply suggestions from code review

Co-authored-by: ohemorange <erica@eff.org>

* improve wording

Co-authored-by: ohemorange <erica@eff.org>
2021-09-09 14:57:55 -07:00
Brad Warren
077d28828a Add documentation about legacy cb-auto files (#9011)
* Add documentation about legacy cb-auto files

* Apply suggestions from code review

Co-authored-by: ohemorange <erica@eff.org>

Co-authored-by: ohemorange <erica@eff.org>
2021-09-09 13:21:47 -07:00
alexzorin
0b63d81f95 cli: minor copy changes to renew help text (#9025)
Fixes #9009.
2021-09-09 12:13:09 -07:00
alexzorin
d139e26a1c fix 'NEXT STEPS' being printed to stdout during -q (#9023)
@osirisinferi noticed [in chat](https://opensource.eff.org/eff-open-source/pl/sa85u4n71tywfpc15c1wu59wae) that "NEXT STEPS:" was ignoring `--quiet` and was being printed unconditionally.

I think it ended up being written this way in #8860 because I was trying to avoid dumping ANSI escapes and newlines into the log file and confused myself in the process.

This change makes things a bit more explicit in separating presentation/message.

* fix 'NEXT STEPS' being printed to stdout during -q

* fix tests
2021-09-09 12:10:27 -07:00
alexzorin
dedd0b84a8 Merge pull request #9024 from certbot/candidate-1.19.0
Update files from 1.19.0 release
2021-09-09 08:42:39 +10:00
Brad Warren
b9e4763de3 Bump version to 1.20.0 2021-09-07 10:15:07 -07:00
Brad Warren
8897a81f7d Add contents to certbot/CHANGELOG.md for next version 2021-09-07 10:15:07 -07:00
Brad Warren
5d6abc3234 Release 1.19.0 2021-09-07 10:15:05 -07:00
Brad Warren
dc7524d1d6 Update changelog for 1.19.0 release 2021-09-07 10:13:51 -07:00
alexzorin
70a18a9486 disable donation prompt during --quiet (#9022)
Issuing a certificate with --quiet was crashing during the donation
atexit call because it was trying to use the /dev/null fd after the
displayer context manager had already closed it.
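A tiny repro of that failure mode, for illustration only (not Certbot's code): an atexit hook that writes to a file object a context manager has already closed.

```python
import atexit
import os

devnull = open(os.devnull, "w")
# The hook runs at interpreter exit, after devnull has been closed below, so it raises
# "ValueError: I/O operation on closed file" -- the class of crash described above.
atexit.register(lambda: print("donation prompt", file=devnull))

with devnull:  # stand-in for the displayer context manager closing its stream
    pass
```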
2021-09-07 08:38:27 -07:00
alexzorin
b7bde05aee docs: redirect macOS users to different cron guide (#9013)
Due to complications with Certbot from Homebrew being in the PATH on macOS, the instructions we have in the Automated Renewal section do not work for those users. Instead, send them to the instruction generator.
2021-09-03 07:49:25 -07:00
alexzorin
8ff7153019 snap: revert to checking snapctl file existence (#9018)
While the previous approach of testing the functionality of snapctl
worked, the snapd developers told us they could not guarantee its
reliability.

---

As with #8955, I tested this on Debian 9, 10 and CentOS 7, 8, Stream.
2021-09-03 07:47:12 -07:00
Stefan Weil
0d4f92fa81 Fix some typos (found by codespell) (#9017)
* Fix some typos (found by codespell)

Signed-off-by: Stefan Weil <sw@weilnetz.de>

* Remove typo fixes for some files which should not be modified

Signed-off-by: Stefan Weil <sw@weilnetz.de>
2021-09-03 06:43:13 +10:00
Brad Warren
1a2d74decc Add comment about security alerts. (#9016) 2021-09-03 06:40:18 +10:00
Brad Warren
f6d5c8ffbe Make ACMEv1 deprecation warnings scarier (#9015)
Fixes https://github.com/certbot/certbot/issues/6844.

This PR does two things:

1. Changes ACMEv1 deprecation warnings from `PendingDeprecationWarning` to `DeprecationWarning`.
2. Changes the ACMEv1 deprecation warnings to be on references to the class themselves. This is the approach taken in https://github.com/certbot/certbot/pull/8989, the PRs linked there, and the `cryptography` code in the code comment. I think this approach warns in more cases and I updated our unit tests to avoid hitting these warnings.
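One way to warn on references to a class, rather than on instantiation, is a PEP 562 module-level `__getattr__`; this is a sketch of the general technique with hypothetical names, not necessarily the exact mechanism used in the PR.

```python
# hypothetical module layout: the real class is kept under a private name
import warnings

class _ClientV1:
    """Stand-in for an ACMEv1 client implementation."""

_DEPRECATED_ATTRS = {"Client": _ClientV1}

def __getattr__(name):  # PEP 562: called for module attributes not found normally
    if name in _DEPRECATED_ATTRS:
        warnings.warn(f"{__name__}.{name} (ACMEv1) is deprecated and will be removed soon.",
                      DeprecationWarning, stacklevel=2)
        return _DEPRECATED_ATTRS[name]
    raise AttributeError(f"module {__name__!r} has no attribute {name!r}")
```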
2021-08-30 15:38:12 -07:00
orangepizza
52e207a404 add ip address support to certbot/acme lib (2021 ver) (#8904)
* add ip address support to acme saving

* remove client-site check for ip address

* using right prefix for san parsing

* remove type hint for backward compatibility

* remove bare ip blocking check from main_test

* uppercase

* lint fix

* add additional tests for new IP support

* support for ipv6 bare address

* make apache and nginx plugin raise error for certs with ip address

* linting

* add pem file's last newline char

* gen_ss_cert ip support and comment fixup

* fix test coverage

* indent fix and assertTrue to assertIn

* indent mistake; made a note where the class ends

* acme lib now receives IPs as a separate list

* fix typos

* type 2

* fix tests

* Deny IP addresses on the certbot/certbot side as LE doesn't support them

* remove excess empty line to rerun tox

* comment indent and typo fix

Apply suggestions from code review

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* trim unused functions

* trim unused import

* make raw san list extraction as separate function

* Apply suggestions from code review

mostly comment suggestions here

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* apply patches suggested on review.

* remove excessive empty lines

* update CHANGELOG.md

* added acme lib update about ipaddress support in CHANGELOG.md

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
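As an illustration of keeping DNS names and IP addresses as separate SAN lists (a hypothetical helper using `cryptography`; not one of the acme module's own functions):

```python
from cryptography import x509
from cryptography.x509.oid import ExtensionOID

def split_sans(cert: x509.Certificate):
    """Return (dns_names, ip_addresses) from a certificate's SAN extension."""
    san = cert.extensions.get_extension_for_oid(
        ExtensionOID.SUBJECT_ALTERNATIVE_NAME).value
    dns_names = san.get_values_for_type(x509.DNSName)    # list of str
    ip_addrs = san.get_values_for_type(x509.IPAddress)   # list of IPv4Address/IPv6Address
    return dns_names, ip_addrs
```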
2021-08-27 06:47:01 -07:00
alexzorin
694c03bd6a lower coverage threshold for rfc2136 integration (#9006) 2021-08-25 07:40:26 -07:00
Brad Warren
058faeadac Propagate requirement that ACME responses are UTF-8 (#9001)
I think this fixes https://github.com/certbot/certbot/issues/8968.

The only other calls with `requests` we make in our code outside of our tests that I could find are:

1. [Here](a8a8a39ff1/certbot/certbot/_internal/eff.py (L91)) where we assume the response is JSON and I think [requests behavior](db575eeedc/requests/models.py (L891-L896)) is sane.
2. [Here](a8a8a39ff1/certbot/certbot/ocsp.py (L190)) where we know the response contains binary data.

I think this is a pretty minor change because we were already assuming the response was UTF-8 in the code here when logging it, which I think is a valid assumption because the spec says that [all content should be UTF-8 encoded](https://datatracker.ietf.org/doc/html/rfc8555#section-5).

I added the check for the `Accept` header due to the text [here](https://datatracker.ietf.org/doc/html/rfc8555#section-7.4.2) saying that it can be used to request the certificate in an alternate format such as DER. We currently set the Accept header in our own ACMEv1 client code before downloading the DER certificate, but this isn't required according to [the closest thing I think we have to an ACMEv1 spec](f1894f8d1d/docs/acme-divergences-v1.md (section-742)) so I left the content type check with a comment that it can be removed in the future.
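In `requests` terms, the change amounts to pinning the encoding rather than letting the library guess it. A minimal sketch (the directory URL is just an example, and the real code also handles the certificate-download case mentioned above):

```python
import requests

response = requests.get("https://acme-v02.api.letsencrypt.org/directory")
# RFC 8555 section 5 requires ACME message bodies to be UTF-8, so decode them as such
# instead of relying on requests' encoding detection.
response.encoding = "utf-8"
print(response.text[:120])
```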

* Revert "add chardet dep (#8965)"

This reverts commit 1129d850d3.

* set response.encoding in acme

* more docs
2021-08-23 10:57:34 -07:00
osirisinferi
295dc5a2a9 certbot-dns-rfc2136: catch error when a hostname is being used for dns_rfc2136_server (#8990)
* Raise separate error when a hostname is being used for `dns_rfc2136_server`

* Explicitly say IP address instead of hostname in docs

* Don't catch ValueError, but actually check the server value

* Add tests

* Add CHANGELOG entry
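A sketch of the kind of explicit check described in the bullets above (hypothetical helper and error message; the plugin's actual wording differs):

```python
import ipaddress

def validate_rfc2136_server(value: str) -> str:
    """Reject hostnames: dns_rfc2136_server must be an IP address."""
    try:
        ipaddress.ip_address(value)
    except ValueError:
        raise ValueError(
            f"dns_rfc2136_server must be an IPv4 or IPv6 address, not a hostname: {value!r}")
    return value

validate_rfc2136_server("192.0.2.1")       # ok
# validate_rfc2136_server("ns1.example.")  # would raise ValueError
```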
2021-08-23 09:38:14 +10:00
Brad Warren
a8a8a39ff1 upgrade pip (#9000)
This is just an oldest tests version of https://github.com/certbot/certbot/pull/8993.
2021-08-19 15:15:31 -07:00
Brad Warren
435ae075a5 remove zope from plugin example (#8998) 2021-08-18 09:43:40 -07:00
Adrien Ferrand
06c8113863 Cleanup zope dependencies in plugins and upgrade sphinx (#8997)
This PR removes all zope dependencies from plugins configuration.

It also lets Sphinx upgrade to the next major version by removing the plugin dedicated to zope interfaces documentation. As a consequence, the deprecated zope interfaces are not documented anymore.

* Cleanup zope dependencies in plugins and upgrade sphinx

* Update pinnings
2021-08-18 08:12:55 -07:00
Adrien Ferrand
143ea15253 Remove all non essential references to the old Zope interfaces (#8988)
As a follow-up to #8971, this PR removes all references to the old Zope interfaces, except the ones used to deprecate them and prepare for their removal.

In the process, some documentation and tests about the `Display` objects are simply removed, since they are no longer relevant now that those objects have been removed from the public API.

* Cleanup some interfaces.IInstaller

* Cleanup IConfig doc

* Almost complete removal

* Remove useless tests

* Fixes

* More cleanup

* More cleanup

* More cleanup

* Remove a non-existent reference

* Better type

* Fix lint
2021-08-17 14:51:26 -07:00
Adrien Ferrand
acf48df979 Use latest version of mypy (#8992)
Fixes #8899

This PR removes the pinning upper limit of mypy currently set to <0.900 and adds the required types-* stub packages to make recent versions of mypy work.

* Unpin mypy

* Improve type in TempHandler

* Add types
2021-08-17 10:52:57 -07:00
Adrien Ferrand
6a9e0ec59d Add deprecation warnings for deprecated elements in certbot.display.util (#8989)
Fix #8982.

This PR takes essentially the same approach as in #8970 and https://github.com/certbot/certbot/pull/6859/files#diff-e5eaf744409c293203b898ba9896da75689fd04ff5f1566c035940a5b195c257

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-08-17 09:13:54 -07:00
Adrien Ferrand
5b96cc9c37 Release pip pinning (#8993)
The bug that required pip to stay on 20.2.4 has been fixed in version 21.2.x. Let's release the pip pinning with this PR.
2021-08-16 15:14:22 -07:00
Adrien Ferrand
525c427c60 Cleanup some useless type ignore directives (#8987)
* Cleanup some useless type ignore directives

* Cleanup one more type ignore directive

Co-authored-by: Adrien Ferrand <aferrand@ecomundo.eu>
2021-08-17 07:43:56 +10:00
Adrien Ferrand
23e1e07139 Emit deprecation warnings for Zope interfaces (#8970)
* Monkeypatch certbot.interfaces module to warn about deprecations

* Ignore our own warning

* Fix type

* Add a changelog entry
2021-08-15 07:06:29 +10:00
alexzorin
241a7c32a2 docs: add basic intro to certbot in user guide (#8979)
* docs: add basic intro to certbot in user guide

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-08-13 17:39:12 -07:00
alexzorin
10dc41e83d docs: add "Deleting Certificates" to user guide (#8910)
I want this for #8949.

I think this is quite verbose, but purposefully so, as an intervention to try to prevent users from hitting this problem. It's more of a "How-To Guide" than a "Reference Guide" (in the lingo of https://documentation.divio.com).

* docs: add "Deleting Certificates" to user guide

* try a less convoluted explanation

about what the installer did in the first place

* add a warning early on: read the full thing

* erica's copy changes

* rewrite as a how-to guide

* rewrite self-signed step 2 for mental model++

* rewrite intro to "safely deleting certificates"
2021-08-13 14:04:47 -07:00
Adrien Ferrand
6943cea6b7 Reimplement zope interfaces into abc in compatibility tests (#8971)
* Reimplement zope interfaces into abc in compatibility tests

* Refactor to fix lint and mypy warnings

* Fix inheritance
2021-08-13 11:00:33 +10:00
Brad Warren
b4c49cf781 Improve snapcraft remote build (#8985)
[Snapcraft 5.0](https://forum.snapcraft.io/t/release-notes-snapcraft-5-0/25751) implemented creating build IDs based on the project's contents instead of the directory path in https://github.com/snapcore/snapcraft/pull/3554. This is a feature we initially wanted, but it broke our workaround added in https://github.com/certbot/certbot/pull/8719. Our workaround is broken because now that the build ID is based on the project's contents, copying the project to a temporary directory has no effect.

This PR removes the workaround from https://github.com/certbot/certbot/pull/8719 and instead constructs a random build ID that it provides to snapcraft. This provides us with even more randomness to avoid build ID conflicts while avoiding having to copy the project to a temporary directory before every build.
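For illustration, generating such a random build ID might look like this (the prefix and length are assumptions, not values taken from the PR):

```python
import random
import string

def random_build_id(length: int = 10) -> str:
    # Lowercase letters only, per the "use lowercase letters" follow-up commit.
    return "snapcraft-" + "".join(random.choices(string.ascii_lowercase, k=length))

print(random_build_id())  # e.g. snapcraft-kqzhwartbe
```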

* improve-remote-build

* use lowercase letters
2021-08-12 15:34:40 -07:00
Yaroslav Halchenko
5e87aee968 BF: apache cfg parsing - relax assumption that value cannot contain = (#8930)
* BF: apache cfg parsing - relax assumption that value cannot contain =

* Remove failing test_update_runtime_vars_bad_output

* Add test Define statements: with = in value, and an empty value

* update CHANGELOG
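A small sketch of the relaxed parsing (assumed line format; the real fix lives in the Apache configurator's handling of `apachectl` output):

```python
def parse_define(line: str):
    """Split 'NAME=value' on the first '=' only, so values may contain '=' or be empty."""
    name, sep, value = line.partition("=")
    return name, (value if sep else None)

assert parse_define("FOO=bar=baz") == ("FOO", "bar=baz")   # '=' inside the value
assert parse_define("EMPTY=") == ("EMPTY", "")             # empty value
assert parse_define("FLAG") == ("FLAG", None)              # no value at all
```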

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2021-08-13 07:57:24 +10:00
Brad Warren
693a2a7904 remove outdated example code (#8984)
There are a couple problems with these files.

1. `python -m acme.standalone` from the README hasn't worked since https://github.com/certbot/certbot/pull/7483.
2. The symlinks for the PEM files have been broken since https://github.com/certbot/certbot/pull/7600.

Because of this and the fact that [these example files are causing snap build failures](https://dev.azure.com/certbot/certbot/_build/results?buildId=4395&view=logs&j=f44d40a4-7318-5ffe-762c-ae4557889284&t=07786725-57f8-5198-4d13-ea77f640bd5c&l=78), let's delete them.
2021-08-12 14:04:22 -07:00
Brad Warren
3058b6e748 Fix circular import (#8967)
* add internal display util

* Move display constants internal.

* move other utilities internal

* fix OK and CANCEL documentation
2021-08-05 08:49:20 +02:00
Brad Warren
7b78770010 fix egg-info cleanup (#8966) 2021-08-05 07:04:05 +10:00
Brad Warren
cd2dff2db1 Merge pull request #8969 from certbot/candidate-1.18.0
Release 1.18.0
2021-08-03 16:05:39 -07:00
Erica Portnoy
8194e8faef Bump version to 1.19.0 2021-08-03 13:23:45 -07:00
Erica Portnoy
06698ad95f Add contents to certbot/CHANGELOG.md for next version 2021-08-03 13:23:45 -07:00
Erica Portnoy
0d76d1f219 Release 1.18.0 2021-08-03 13:23:13 -07:00
Erica Portnoy
5c3c682b6e Update changelog for 1.18.0 release 2021-08-03 13:12:59 -07:00
Brad Warren
1129d850d3 add chardet dep (#8965) 2021-08-03 10:35:00 +10:00
alexzorin
bdc48e6a32 snap: workaround for snapctl crash in plugin hook (#8955)
* snap: workaround for snapctl crash in plugin hook

* test functionality, not existence
2021-08-02 16:15:46 -07:00
alexzorin
523f8f5e65 stop using deprecated distro.linux_distribution (#8961)
`distro.linux_distribution` was deprecated (https://github.com/python-distro/distro/pull/296) in the release of `distro` at the end of last week. The deprecation is causing the `nopin` nightly tests to fail.

This change migrates Certbot off that function.

As far as I can tell, the Arch Linux edge case described in the code comments no longer happens, but better to be safe than sorry I think.
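For illustration, the migration roughly swaps the removed tuple-returning call for the individual accessors (an assumed mapping; Certbot's helper also handles the edge cases discussed):

```python
import distro

# Deprecated: name, version, codename = distro.linux_distribution(full_distribution_name=False)
os_id = distro.id()            # e.g. "ubuntu", "centos", "arch"
os_version = distro.version()  # e.g. "18.04"; may be "" on rolling releases such as Arch
os_name = distro.name()        # e.g. "Ubuntu"
print(os_id, os_version, os_name)
```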

* stop using deprecated distro.linux_distribution

* update comment

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-08-02 15:39:07 -07:00
Brad Warren
1dabddeb85 make display classes implement IDisplay (#8963) 2021-08-02 11:59:07 -07:00
Brad Warren
f9ef894141 update snapcraft.cfg comment (#8959) 2021-08-01 12:15:01 +10:00
Adrien Ferrand
979e21dcbf Reimplement Certbot zope.interfaces into abstract base classes (#8950)
* Implement certbot services

* Various fixes

* Local oldest requirements

* Clean imports

* Add unit tests for certbot.services

* Clean code

* Protect against nullity of global services

* Fix CLI

* Fix tests

* Consistent test behavior

* Define new ABC classes

* Reimplement services with new ABC classes

* Adapt plugins discovery and selection

* Remove zope interfaces from plugins

* Re-enable delegation for simplicity

* Fix interfaces declaration

* Remove interface implementer

* Interfaces ordering

* Extract zope logic from discovery

* Cleanup imports

* Fixing tests

* Fix main_test

* Finish certbot unit tests

* Fix lint

* Various fixes thanks to mypy

* Fix lint

* Order imports

* Various fixes

* Clean code

* Remove reporter service, migrate display service in certbot.display.util.

* Fix test

* Fix apache compatibility test

* Fix oldest test

* Setup certbot.display.service module

* Reintegrate in util

* Fix imports

* Fix tests and documentation

* Refactor

* Cleanup

* Cleanup

* Clean imports

* Add unit tests

* Borrow sphinx build fix from #8863

* Align zope interfaces on ABC

* Various fixes

* Fix type

* Fix type

* Some cleanup

* Fix lint

* Update certbot/certbot/_internal/configuration.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/_internal/configuration.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Fix imports

* Fix Config contract (accounts_dir property)

* Remove unnecessary interface

* Set NamespaceConfig public, remove Config interface

* Remove Display ABC and implementation of IDisplay

* Clean lint

* Cleanup old decorators

* Contract on plugin constructor only

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/configuration.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/interfaces.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Some corrections

* Add changelog

* Fix --authenticators and --installers flags on plugins subcommand

* Fix multiple inheritance on the interface Plugin

* Update certbot/certbot/_internal/plugins/manual.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/_internal/plugins/disco.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Add warnings in logger also

* Add deprecation warnings also when plugins are verified.

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-07-29 13:45:29 -07:00
Adrien Ferrand
8133d3e70a Fix python-augeas failure on Windows (v2) (#8951)
This PR is a new approach for fixing #8732, based on the discussions that occurred in the first PR, #8877.

This PR upgrades python-augeas to the latest version and avoids test failures on Windows caused by this upgrade. To do so, it leverages the [tox multi-platform feature](https://tox.readthedocs.io/en/latest/example/platform.html) and modifications to `tools/venv.py` so that `certbot-apache` is neither installed nor tested on Windows.

* Unpin python-augeas and upgrade current pinnings

* Do not install certbot-apache in Windows dev environments

* Introduce tox specific win packages + remove certbot compatibility on windows

* Add libaugeas to sphinx build

* Redefine lint and mypy targets

* Keep the lint and mypy environments
2021-07-29 11:25:25 -07:00
Brad Warren
08839758bd Finish pinning system rewrite (#8934)
* add oldest pyproject.toml file that works

* make single oldest_constraints.txt file

* remove unused merge_requirements.py

* remove unused import

* make conditional right

* simplify pip_install.py

* fix typo

* bump min dns-lexicon dependency

* fix zope import warning

* pin back wheel

* refactor pinning script

* Add oldest script.

* add pip comment

* add pipstrap extra

* simplify pinning scripts

* remove pipstrap extra

* update contributing

* Add design doc

* Update tools/pinning/DESIGN.md

Co-authored-by: ohemorange <erica@eff.org>

* Update tools/pinning/DESIGN.md

Co-authored-by: ohemorange <erica@eff.org>

* Update tools/pinning/DESIGN.md

Co-authored-by: ohemorange <erica@eff.org>

* Update tools/pinning/DESIGN.md

Co-authored-by: ohemorange <erica@eff.org>

* rename normal to current

* no dummies

* script improvements

* mention need to update setup.py

* try and clarify poetry behavior

* tweak section title

Co-authored-by: ohemorange <erica@eff.org>
2021-07-22 12:00:30 -07:00
Adrien Ferrand
10eecf9c97 Deprecate zope.component in favor of direct calls to functions from the certbot.display.util module (#8835)
* Implement certbot services

* Various fixes

* Local oldest requirements

* Clean imports

* Add unit tests for certbot.services

* Clean code

* Protect against nullity of global services

* Fix CLI

* Fix tests

* Consistent test behavior

* Various fixes

* Clean code

* Remove reporter service, migrate display service in certbot.display.util.

* Fix test

* Fix apache compatibility test

* Fix oldest test

* Setup certbot.display.service module

* Reintegrate in util

* Fix imports

* Fix tests and documentation

* Refactor

* Cleanup

* Cleanup

* Clean imports

* Add unit tests

* Borrow sphinx build fix from #8863

* Fix type

* Add comment

* Do not reuse an existing display service, which never exists at that time

* Make get_display() private

* Fix lint

* Make display internal

* Fix circular dependencies

* Fixing circular dependencies

* Rename patch methods and update docstring

* Update deprecation messages

* Update certbot/certbot/_internal/display/obj.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Add links

* Avoid relying on internal certbot packages from certbot-apache

* Keep same behavior for patch_get_utility*

* Better diff

* Add changelog

* Update certbot/certbot/tests/util.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-07-19 17:09:06 -07:00
alexzorin
bebd399488 acme: deprecate ACMEv1 client classes (#8931)
* acme: deprecate ACMEv1 client classes

Adds pending deprecations to:
- acme.client.Client
- acme.client.BackwardsCompatibleClientV2

Adds a warning to Certbot when a v1 server is detected.

* move this change from 1.17 to 1.18

* revert some whitespace changes
2021-07-16 08:50:16 +10:00
alexzorin
a105b587ac apache: fix crash when authenticating empty vhosts (#8941)
Fixes #8940.
2021-07-15 11:12:14 -07:00
alexzorin
8e29063ba7 pylint: upgrade pinned version and fix new lints (#8936)
While bumping pinned packages in #8928, we came across a new version of pylint (2.9.3). Upgrading to this version requires some changes to Certbot's code, which is what this change is about.

* pylint: upgrade pinned version and fix new lints

* maxsplit should be 1, not -1, for rsplit
2021-07-15 11:03:39 -07:00
Brad Warren
117791b582 Remove unneeded certbot-auto files (#8938) 2021-07-14 14:34:54 -07:00
Brad Warren
2ab7857fa5 Do not guess HTTP-01 response encoding (#8942)
* fix http-01 encoding

* improve comment
2021-07-14 14:11:50 -07:00
ohemorange
7ede5c3487 Merge pull request #8933 from certbot/candidate-1.17.0
Update files from 1.17.0 release
2021-07-06 12:38:04 -07:00
Brad Warren
915459258b Bump version to 1.18.0 2021-07-06 08:42:52 -07:00
Brad Warren
d94cf0e1d6 Add contents to certbot/CHANGELOG.md for next version 2021-07-06 08:42:51 -07:00
Brad Warren
952a296e20 Release 1.17.0 2021-07-06 08:42:49 -07:00
Brad Warren
d9a1850eaa Update changelog for 1.17.0 release 2021-07-06 08:41:16 -07:00
alexzorin
667750f3ff docs: explain the situation with --manual renewal (#8911)
* docs: explain the situation with --manual renewal

* note that the non-hook command can't be cronned

* add xref to #renewing-certificates

* update manual description in the plugins table

* redirect manual users towards other plugins

* refer to authentication hook scripts in table
2021-06-28 16:40:24 -07:00
Rene Luria
8b610239bf Adds Infomaniak 3rd party plugin (#8923) 2021-06-25 14:46:37 -04:00
ohemorange
62426caa5a Merge pull request #8919 from alexzorin/standalone-error-ux
Improve standalone errors
2021-06-21 16:54:36 -07:00
Alex Zorin
f137d8424e acme.standalone: expose original socket.error 2021-06-22 09:24:53 +10:00
Alex Zorin
e5c41e76c5 standalone: add an auth_hint 2021-06-22 09:24:44 +10:00
alexzorin
1e114b4ef8 apache: configure nameless vhosts during auth (#8898)
In the apache2 package on Debian-based distros, the default
000-default.conf virtual host does not include a ServerName.

Depending on the FQDN hostname of the machine and DNS setup, Apache
assigns a name to this unnamed vhost at runtime. As a result, the
Apache config ends up with vhosts that have duplicative names.

Previously, Certbot did not identify that the nameless vhost could be
a match for the requested identifier, which would, depending on
configuration load order, cause the authenticator to fail.

This change causes Certbot to include all unnamed vhosts on top of
matched vhosts, during authentication. If no vhosts matched, the
existing behavior remains the same.
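A rough sketch of that selection rule (simplified data model; the real configurator works on parsed Apache vhost objects):

```python
from typing import List, Optional

class VHost:
    def __init__(self, servername: Optional[str], aliases: Optional[List[str]] = None):
        self.servername = servername
        self.aliases = aliases or []

def vhosts_for_auth(all_vhosts: List[VHost], domain: str) -> List[VHost]:
    matched = [vh for vh in all_vhosts
               if vh.servername == domain or domain in vh.aliases]
    if not matched:
        return matched  # no match: existing behavior is unchanged
    # A vhost with no ServerName may be answering for the requested name at runtime,
    # so include all unnamed vhosts on top of the matches.
    unnamed = [vh for vh in all_vhosts if vh.servername is None and vh not in matched]
    return matched + unnamed
```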

* apache: configure nameless vhosts during auth

* vhost is only unnamed if ServerName is not set

* also fix test to only match ServerName

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2021-06-21 07:18:29 -04:00
alexzorin
bc7c953bcc cli: vary renewal advice for hookless manual certs (#8914)
* cli: vary renewal advice for hookless manual certs

1. Don't print that the certificate will be automatically renewed,
because it won't be.
2. Add a "NEXT STEP" telling the user that they will need to manually
re-issue the certificate in order to renew it.

* kill superfluous comma

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* clarify wording of the next step

* fix the test

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2021-06-17 16:36:54 -07:00
alexzorin
60a91eb688 certonly: hide "NEXT STEPS" for dry-runs (#8901)
* certonly: hide "NEXT STEPS" for dry-runs

* add a test
2021-06-14 14:25:43 -07:00
chaptergy
1b025e84e8 Adds njalla, DuckDNS and Porkbun 3rd party plugins (#8907) 2021-06-14 13:23:35 -07:00
kartikynwa
d3555623ba certbot-apache: Add Void Linux overrides (#8891)
* certbot-apache: Add Void Linux overrides

* certbot-apache: Correct distro name to Void Linux
2021-06-12 17:02:16 +10:00
Brad Warren
18ea72faf1 Split out testing extras (#8893)
* split out test extras

* update extras and regenerate pinnings

* pin back mypy
2021-06-11 13:17:50 -07:00
ohemorange
c8255dded5 Add --verbose-level flag and fix logging level calculations (#8900)
Also, update `dev-cli.ini` example to use new flag.

Although https://github.com/bw2/ConfigArgParse/pull/216 allowed setting a `count` action value in a config file, our default detection system won't let us use that functionality. While we should eventually fix that, for now, let developers have a cli.ini with a higher logging level by adding this flag.

Note that this flag is intended to work the same way adding `-vvv`s does; that is, as a modifier to the pre-set level, rather than setting the absolute level. The number it is set to is equivalent to the number of `v`s that would otherwise have been passed, with "2" as the current maximum effective number of levels (warning --> info --> debug).
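In other words, both `-v` counts and `--verbose-level` shift the base level downwards; a small sketch of that calculation (illustrative only, not Certbot's actual logging setup):

```python
import logging

def effective_level(verbosity: int, base: int = logging.WARNING) -> int:
    # Each unit of verbosity drops one level: WARNING -> INFO -> DEBUG.
    # Values beyond 2 have no further effect, matching the note above.
    return max(base - 10 * verbosity, logging.DEBUG)

assert effective_level(0) == logging.WARNING
assert effective_level(1) == logging.INFO
assert effective_level(2) == logging.DEBUG
assert effective_level(5) == logging.DEBUG
```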

* Add --verbose-level flag for devs to set in cli.ini

* Update dev-cli.ini to use new flag
2021-06-10 16:45:07 -07:00
ohemorange
b48e336554 Allow nginx parser to handle empty file (#8895)
* Allow parsing empty files

* add unit test

* lint

* update parser_test

* Update configurator_test

* update changelog
2021-06-11 09:21:52 +10:00
alexzorin
0c637860cd cli: improve error messages for enhance errors (#8884)
* cli: improve error messages for enhance errors

* remove status message after enhance config revert
2021-06-10 15:58:11 -07:00
Brad Warren
0b08a80dce Pin pip & co like our other dependencies (#8868)
* use poetry 1.2.0a1

* pin pip normally

* use normal constraints file with pipstrap

* remove unused STRIP_HASHES var

* Check for old poetry versions

* keep pip, setuptools, and wheel pinned in oldest

* remove strip hashes

* pin back pip

* fix new lint error
2021-06-09 17:01:54 -07:00
356 changed files with 9973 additions and 10945 deletions

View File

@@ -4,7 +4,7 @@ jobs:
- name: IMAGE_NAME
value: ubuntu-18.04
- name: PYTHON_VERSION
value: 3.9
value: 3.10
- group: certbot-common
strategy:
matrix:
@@ -17,60 +17,43 @@ jobs:
linux-py38:
PYTHON_VERSION: 3.8
TOXENV: py38
linux-py39:
PYTHON_VERSION: 3.9
TOXENV: py39
linux-py37-nopin:
PYTHON_VERSION: 3.7
TOXENV: py37
CERTBOT_NO_PIN: 1
linux-external-mock:
TOXENV: external-mock
linux-boulder-v1-integration-certbot-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-certbot-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-integration-nginx-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-nginx-oldest:
PYTHON_VERSION: 3.6
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py310-integration:
PYTHON_VERSION: 3.10
TOXENV: integration
ACME_SERVER: boulder-v2
nginx-compat:
TOXENV: nginx_compat
linux-integration-rfc2136:
@@ -79,6 +62,9 @@ jobs:
TOXENV: integration-dns-rfc2136
docker-dev:
TOXENV: docker_dev
le-modification:
IMAGE_NAME: ubuntu-18.04
TOXENV: modification
macos-farmtest-apache2:
# We run one of these test farm tests on macOS to help ensure the
# tests continue to work on the platform.

View File

@@ -59,7 +59,7 @@ jobs:
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
versionSpec: 3.9
architecture: x86
addToPath: true
- script: |
@@ -100,7 +100,7 @@ jobs:
displayName: Check Powershell 5.x is used in vs2017-win2016
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
versionSpec: 3.9
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:

View File

@@ -1,28 +1,28 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.9
PYTHON_VERSION: 3.10
strategy:
matrix:
macos-py36:
macos-py36-cover:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.6
TOXENV: py36
macos-py39:
TOXENV: py36-cover
macos-py310-cover:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.9
TOXENV: py39
PYTHON_VERSION: 3.10
TOXENV: py310-cover
windows-py36:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.6
TOXENV: py36
windows-py38-cover:
TOXENV: py36-win
windows-py39-cover:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.8
TOXENV: py38-cover
PYTHON_VERSION: 3.9
TOXENV: py39-cover-win
windows-integration-certbot:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.8
PYTHON_VERSION: 3.9
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-18.04
@@ -36,18 +36,18 @@ jobs:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py39-cover:
linux-py310-cover:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.9
TOXENV: py39-cover
linux-py39-lint:
PYTHON_VERSION: 3.10
TOXENV: py310-cover
linux-py310-lint:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.9
TOXENV: lint
linux-py39-mypy:
PYTHON_VERSION: 3.10
TOXENV: lint-posix
linux-py310-mypy:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.9
TOXENV: mypy
PYTHON_VERSION: 3.10
TOXENV: mypy-posix
linux-integration:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.8
@@ -56,11 +56,6 @@ jobs:
apache-compat:
IMAGE_NAME: ubuntu-18.04
TOXENV: apache_compat
# le-modification can be moved to the extended test suite once
# https://github.com/certbot/certbot/issues/8742 is resolved.
le-modification:
IMAGE_NAME: ubuntu-18.04
TOXENV: modification
apacheconftest:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6

View File

@@ -19,11 +19,12 @@ stages:
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file in all pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#how-do-i-authorize-a-secure-file-for-use-in-all-pipelines.
# including authorizing the file for use in the "nightly" and "release"
# pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#q-how-do-i-authorize-a-secure-file-for-use-in-a-specific-pipeline.
#
# This file has a maximum lifetime of one year and the current
# file will expire on 2021-07-28 which is also tracked by
# file will expire on 2022-07-25 which is also tracked by
# https://github.com/certbot/certbot/issues/7931. The file will
# need to be updated before then to prevent automated deploys
# from breaking.

View File

@@ -1,5 +1,8 @@
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends libaugeas0
FINAL_STATUS=0
declare -a FAILED_BUILDS
tools/venv.py

View File

@@ -67,7 +67,14 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# 5) wrong-import-order generates false positives and a pylint developer
# suggests that people using isort should disable this check at
# https://github.com/PyCQA/pylint/issues/3817#issuecomment-687892090.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order
# 6) unspecified-encoding generates errors when encoding is not specified in
# in a call to the built-in open function. This relates more to a design decision
# (unspecified encoding makes the open function use the default encoding of the system)
# than a clear flaw on which a check should be enforced. Anyway the project does
# not need to enforce encoding on files so we disable this check.
# 7) consider-using-f-string is "suggesting" to move to f-string when possible with an error. This
# clearly relates to code design and not to potential defects in the code, let's just ignore that.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order,unspecified-encoding,consider-using-f-string
[REPORTS]

View File

@@ -138,6 +138,7 @@ Authors
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Katsuyoshi Ozaki](https://github.com/moratori)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)

View File

@@ -5,12 +5,18 @@ import functools
import hashlib
import logging
import socket
from typing import cast
from typing import Any
from typing import Dict
from typing import Mapping
from typing import Optional
from typing import Tuple
from typing import Type
from cryptography.hazmat.primitives import hashes # type: ignore
from cryptography.hazmat.primitives import hashes
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import SSL
import requests
from acme import crypto_util
@@ -25,10 +31,10 @@ logger = logging.getLogger(__name__)
class Challenge(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge."""
TYPES: dict = {}
TYPES: Dict[str, Type['Challenge']] = {}
@classmethod
def from_json(cls, jobj):
def from_json(cls, jobj: Mapping[str, Any]) -> 'Challenge':
try:
return super().from_json(jobj)
except jose.UnrecognizedTypeError as error:
@@ -39,7 +45,7 @@ class Challenge(jose.TypedJSONObjectWithFields):
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge response."""
TYPES: dict = {}
TYPES: Dict[str, Type['ChallengeResponse']] = {}
resource_type = 'challenge'
resource = fields.Resource(resource_type)
@@ -57,15 +63,15 @@ class UnrecognizedChallenge(Challenge):
"""
def __init__(self, jobj):
def __init__(self, jobj: Mapping[str, Any]) -> None:
super().__init__()
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
return self.jobj # pylint: disable=no-member
@classmethod
def from_json(cls, jobj):
def from_json(cls, jobj: Mapping[str, Any]) -> 'UnrecognizedChallenge':
return cls(jobj)
@@ -79,13 +85,13 @@ class _TokenChallenge(Challenge):
"""Minimum size of the :attr:`token` in bytes."""
# TODO: acme-spec doesn't specify token as base64-encoded value
token = jose.Field(
token: bytes = jose.Field(
"token", encoder=jose.encode_b64jose, decoder=functools.partial(
jose.decode_b64jose, size=TOKEN_SIZE, minimum=True))
# XXX: rename to ~token_good_for_url
@property
def good_token(self): # XXX: @token.decoder
def good_token(self) -> bool: # XXX: @token.decoder
"""Is `token` good?
.. todo:: acme-spec wants "It MUST NOT contain any non-ASCII
@@ -108,7 +114,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
key_authorization = jose.Field("keyAuthorization")
thumbprint_hash_function = hashes.SHA256
def verify(self, chall, account_public_key):
def verify(self, chall: 'KeyAuthorizationChallenge', account_public_key: jose.JWK) -> bool:
"""Verify the key authorization.
:param KeyAuthorization chall: Challenge that corresponds to
@@ -140,7 +146,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
return True
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
jobj = super().to_partial_json()
jobj.pop('keyAuthorization', None)
return jobj
@@ -158,7 +164,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
thumbprint_hash_function = (
KeyAuthorizationChallengeResponse.thumbprint_hash_function)
def key_authorization(self, account_key):
def key_authorization(self, account_key: jose.JWK) -> str:
"""Generate Key Authorization.
:param JWK account_key:
@@ -169,7 +175,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
account_key.thumbprint(
hash_function=self.thumbprint_hash_function)).decode()
def response(self, account_key):
def response(self, account_key: jose.JWK) -> KeyAuthorizationChallengeResponse:
"""Generate response to the challenge.
:param JWK account_key:
@@ -182,7 +188,7 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
key_authorization=self.key_authorization(account_key))
@abc.abstractmethod
def validation(self, account_key, **kwargs):
def validation(self, account_key: jose.JWK, **kwargs: Any) -> Any:
"""Generate validation for the challenge.
Subclasses must implement this method, but they are likely to
@@ -196,7 +202,8 @@ class KeyAuthorizationChallenge(_TokenChallenge, metaclass=abc.ABCMeta):
"""
raise NotImplementedError() # pragma: no cover
def response_and_validation(self, account_key, *args, **kwargs):
def response_and_validation(self, account_key: jose.JWK, *args: Any, **kwargs: Any
) -> Tuple[KeyAuthorizationChallengeResponse, Any]:
"""Generate response and validation.
Convenience function that return results of `response` and
@@ -215,7 +222,7 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
"""ACME dns-01 challenge response."""
typ = "dns-01"
def simple_verify(self, chall, domain, account_public_key): # pylint: disable=unused-argument
def simple_verify(self, chall: 'DNS01', domain: str, account_public_key: jose.JWK) -> bool: # pylint: disable=unused-argument
"""Simple verify.
This method no longer checks DNS records and is a simple wrapper
@@ -246,7 +253,7 @@ class DNS01(KeyAuthorizationChallenge):
LABEL = "_acme-challenge"
"""Label clients prepend to the domain name being validated."""
def validation(self, account_key, **unused_kwargs):
def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
"""Generate validation.
:param JWK account_key:
@@ -256,7 +263,7 @@ class DNS01(KeyAuthorizationChallenge):
return jose.b64encode(hashlib.sha256(self.key_authorization(
account_key).encode("utf-8")).digest()).decode()
def validation_domain_name(self, name):
def validation_domain_name(self, name: str) -> str:
"""Domain name for TXT validation record.
:param unicode name: Domain name being validated.
@@ -281,7 +288,8 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
WHITESPACE_CUTSET = "\n\r\t "
"""Whitespace characters which should be ignored at the end of the body."""
def simple_verify(self, chall, domain, account_public_key, port=None):
def simple_verify(self, chall: 'HTTP01', domain: str, account_public_key: jose.JWK,
port: Optional[int] = None) -> bool:
"""Simple verify.
:param challenges.SimpleHTTP chall: Corresponding challenge.
@@ -314,6 +322,15 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
# By default, http_response.text will try to guess the encoding to use
# when decoding the response to Python unicode strings. This guesswork
# is error prone. RFC 8555 specifies that HTTP-01 responses should be
# key authorizations with possible trailing whitespace. Since key
# authorizations must be composed entirely of the base64url alphabet
# plus ".", we tell requests that the response should be ASCII. See
# https://datatracker.ietf.org/doc/html/rfc8555#section-8.3 for more
# info.
http_response.encoding = "ascii"
logger.debug("Received %s: %s. Headers: %s", http_response,
http_response.text, http_response.headers)
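The `http_response.encoding = "ascii"` assignment above sidesteps requests' charset guessing when reading the fetched key authorization. A rough standalone sketch of the same idea outside `simple_verify` (the URL and expected value are placeholders):
import requests

# Placeholders; in simple_verify these come from chall.uri(domain) and the
# locally computed key authorization.
uri = "http://example.com/.well-known/acme-challenge/TOKEN"
expected = "TOKEN.ACCOUNT-KEY-THUMBPRINT"

http_response = requests.get(uri, verify=False)
# Key authorizations consist of base64url characters plus ".", so decode the
# body as ASCII instead of letting requests guess the charset.
http_response.encoding = "ascii"
ok = http_response.text.rstrip("\n\r\t ") == expected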
@@ -337,7 +354,7 @@ class HTTP01(KeyAuthorizationChallenge):
"""URI root path for the server provisioned resource."""
@property
def path(self):
def path(self) -> str:
"""Path (starting with '/') for provisioned resource.
:rtype: string
@@ -345,7 +362,7 @@ class HTTP01(KeyAuthorizationChallenge):
"""
return '/' + self.URI_ROOT_PATH + '/' + self.encode('token')
def uri(self, domain):
def uri(self, domain: str) -> str:
"""Create an URI to the provisioned resource.
Forms an URI to the HTTPS server provisioned resource
@@ -357,7 +374,7 @@ class HTTP01(KeyAuthorizationChallenge):
"""
return "http://" + domain + self.path
def validation(self, account_key, **unused_kwargs):
def validation(self, account_key: jose.JWK, **unused_kwargs: Any) -> str:
"""Generate validation.
:param JWK account_key:
@@ -384,11 +401,12 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
ACME_TLS_1_PROTOCOL = "acme-tls/1"
@property
def h(self):
def h(self) -> bytes:
"""Hash value stored in challenge certificate"""
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
def gen_cert(self, domain, key=None, bits=2048):
def gen_cert(self, domain: str, key: Optional[crypto.PKey] = None, bits: int = 2048
) -> Tuple[crypto.X509, crypto.PKey]:
"""Generate tls-alpn-01 certificate.
:param unicode domain: Domain verified by the challenge.
@@ -404,15 +422,15 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, bits)
der_value = b"DER:" + codecs.encode(self.h, 'hex')
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
critical=True, value=der_value)
critical=True, value=der_value)
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
extensions=[acme_extension]), key
extensions=[acme_extension]), key
def probe_cert(self, domain, host=None, port=None):
def probe_cert(self, domain: str, host: Optional[str] = None,
port: Optional[int] = None) -> crypto.X509:
"""Probe tls-alpn-01 challenge certificate.
:param unicode domain: domain being validated, required.
@@ -426,10 +444,10 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
if port is None:
port = self.PORT
return crypto_util.probe_sni(host=host, port=port, name=domain,
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
return crypto_util.probe_sni(host=host.encode(), port=port, name=domain.encode(),
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
def verify_cert(self, domain, cert):
def verify_cert(self, domain: str, cert: crypto.X509) -> bool:
"""Verify tls-alpn-01 challenge certificate.
:param unicode domain: Domain name being validated.
@@ -441,7 +459,10 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
# pylint: disable=protected-access
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), names)
# Type ignore needed due to
# https://github.com/pyca/pyopenssl/issues/730.
logger.debug('Certificate %s. SANs: %s',
cert.digest('sha256'), names) # type: ignore[arg-type]
if len(names) != 1 or names[0].lower() != domain.lower():
return False
@@ -456,8 +477,9 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
return False
# pylint: disable=too-many-arguments
def simple_verify(self, chall, domain, account_public_key,
cert=None, host=None, port=None):
def simple_verify(self, chall: 'TLSALPN01', domain: str, account_public_key: jose.JWK,
cert: Optional[crypto.X509] = None, host: Optional[str] = None,
port: Optional[int] = None) -> bool:
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
@@ -497,7 +519,7 @@ class TLSALPN01(KeyAuthorizationChallenge):
response_cls = TLSALPN01Response
typ = response_cls.typ
def validation(self, account_key, **kwargs):
def validation(self, account_key: jose.JWK, **kwargs: Any) -> Tuple[crypto.X509, crypto.PKey]:
"""Generate validation.
:param JWK account_key:
@@ -514,7 +536,7 @@ class TLSALPN01(KeyAuthorizationChallenge):
domain=kwargs.get('domain'))
@staticmethod
def is_supported():
def is_supported() -> bool:
"""
Check if TLS-ALPN-01 challenge is supported on this machine.
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
@@ -536,7 +558,8 @@ class DNS(_TokenChallenge):
LABEL = "_acme-challenge"
"""Label clients prepend to the domain name being validated."""
def gen_validation(self, account_key, alg=jose.RS256, **kwargs):
def gen_validation(self, account_key: jose.JWK, alg: jose.JWASignature = jose.RS256,
**kwargs: Any) -> jose.JWS:
"""Generate validation.
:param .JWK account_key: Private account key.
@@ -550,7 +573,7 @@ class DNS(_TokenChallenge):
payload=self.json_dumps(sort_keys=True).encode('utf-8'),
key=account_key, alg=alg, **kwargs)
def check_validation(self, validation, account_public_key):
def check_validation(self, validation: jose.JWS, account_public_key: jose.JWK) -> bool:
"""Check validation.
:param JWS validation:
@@ -567,7 +590,7 @@ class DNS(_TokenChallenge):
logger.debug("Checking validation for DNS failed: %s", error)
return False
def gen_response(self, account_key, **kwargs):
def gen_response(self, account_key: jose.JWK, **kwargs: Any) -> 'DNSResponse':
"""Generate response.
:param .JWK account_key: Private account key.
@@ -579,7 +602,7 @@ class DNS(_TokenChallenge):
return DNSResponse(validation=self.gen_validation(
account_key, **kwargs))
def validation_domain_name(self, name):
def validation_domain_name(self, name: str) -> str:
"""Domain name for TXT validation record.
:param unicode name: Domain name being validated.
@@ -599,7 +622,7 @@ class DNSResponse(ChallengeResponse):
validation = jose.Field("validation", decoder=jose.JWS.from_json)
def check_validation(self, chall, account_public_key):
def check_validation(self, chall: 'DNS', account_public_key: jose.JWK) -> bool:
"""Check validation.
:param challenges.DNS chall:
@@ -608,4 +631,4 @@ class DNSResponse(ChallengeResponse):
:rtype: bool
"""
return chall.check_validation(self.validation, account_public_key)
return chall.check_validation(cast(jose.JWS, self.validation), account_public_key)
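To tie the typed `DNS01` API together, the TXT record name and value for a domain can be computed like so (the key generation here is only for the sake of the example):
import josepy as jose
from cryptography.hazmat.primitives.asymmetric import rsa
from acme import challenges

account_key = jose.JWKRSA(key=rsa.generate_private_key(
    public_exponent=65537, key_size=2048))
chall = challenges.DNS01(token=b"t" * 32)

record_name = chall.validation_domain_name("example.com")  # _acme-challenge.example.com
record_value = chall.validation(account_key)               # base64url SHA-256 digest (str)
response = chall.response(account_key)                     # DNS01Response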

View File

@@ -1,4 +1,7 @@
"""ACME client API."""
# pylint: disable=too-many-lines
# This pylint disable can be deleted once the deprecated ACMEv1 code is
# removed.
import base64
import collections
import datetime
@@ -7,13 +10,21 @@ import heapq
import http.client as http_client
import logging
import re
import sys
import time
from types import ModuleType
from typing import Any
from typing import Callable
from typing import cast
from typing import Dict
from typing import Iterable
from typing import List
from typing import Optional
from typing import Set
from typing import Text
from typing import Tuple
from typing import Union
import warnings
import josepy as jose
import OpenSSL
@@ -42,7 +53,8 @@ class ClientBase:
:ivar .ClientNetwork net: Client network.
:ivar int acme_version: ACME protocol version. 1 or 2.
"""
def __init__(self, directory, net, acme_version):
def __init__(self, directory: messages.Directory, net: 'ClientNetwork',
acme_version: int) -> None:
"""Initialize.
:param .messages.Directory directory: Directory Resource
@@ -54,7 +66,9 @@ class ClientBase:
self.acme_version = acme_version
@classmethod
def _regr_from_response(cls, response, uri=None, terms_of_service=None):
def _regr_from_response(cls, response: requests.Response, uri: Optional[str] = None,
terms_of_service: Optional[str] = None
) -> messages.RegistrationResource:
if 'terms-of-service' in response.links:
terms_of_service = response.links['terms-of-service']['url']
@@ -63,7 +77,8 @@ class ClientBase:
uri=response.headers.get('Location', uri),
terms_of_service=terms_of_service)
def _send_recv_regr(self, regr, body):
def _send_recv_regr(self, regr: messages.RegistrationResource,
body: messages.Registration) -> messages.RegistrationResource:
response = self._post(regr.uri, body)
# TODO: Boulder returns httplib.ACCEPTED
@@ -76,7 +91,7 @@ class ClientBase:
response, uri=regr.uri,
terms_of_service=regr.terms_of_service)
def _post(self, *args, **kwargs):
def _post(self, *args: Any, **kwargs: Any) -> requests.Response:
"""Wrapper around self.net.post that adds the acme_version.
"""
@@ -85,7 +100,9 @@ class ClientBase:
kwargs.setdefault('new_nonce_url', getattr(self.directory, 'newNonce'))
return self.net.post(*args, **kwargs)
def update_registration(self, regr, update=None):
def update_registration(self, regr: messages.RegistrationResource,
update: Optional[messages.Registration] = None
) -> messages.RegistrationResource:
"""Update registration.
:param messages.RegistrationResource regr: Registration Resource.
@@ -102,7 +119,8 @@ class ClientBase:
self.net.account = updated_regr
return updated_regr
def deactivate_registration(self, regr):
def deactivate_registration(self, regr: messages.RegistrationResource
) -> messages.RegistrationResource:
"""Deactivate registration.
:param messages.RegistrationResource regr: The Registration Resource
@@ -112,7 +130,8 @@ class ClientBase:
:rtype: `.RegistrationResource`
"""
return self.update_registration(regr, update={'status': 'deactivated'})
return self.update_registration(regr, messages.Registration.from_json(
{"status": "deactivated", "contact": None}))
def deactivate_authorization(self,
authzr: messages.AuthorizationResource
@@ -131,7 +150,9 @@ class ClientBase:
return self._authzr_from_response(response,
authzr.body.identifier, authzr.uri)
def _authzr_from_response(self, response, identifier=None, uri=None):
def _authzr_from_response(self, response: requests.Response,
identifier: Optional[messages.Identifier] = None,
uri: Optional[str] = None) -> messages.AuthorizationResource:
authzr = messages.AuthorizationResource(
body=messages.Authorization.from_json(response.json()),
uri=response.headers.get('Location', uri))
@@ -139,7 +160,8 @@ class ClientBase:
raise errors.UnexpectedUpdate(authzr)
return authzr
def answer_challenge(self, challb, response):
def answer_challenge(self, challb: messages.ChallengeBody, response: requests.Response
) -> messages.ChallengeResource:
"""Answer challenge.
:param challb: Challenge Resource body.
@@ -168,7 +190,7 @@ class ClientBase:
return challr
@classmethod
def retry_after(cls, response, default):
def retry_after(cls, response: requests.Response, default: int) -> datetime.datetime:
"""Compute next `poll` time based on response ``Retry-After`` header.
Handles integers and various datestring formats per
@@ -199,7 +221,7 @@ class ClientBase:
return datetime.datetime.now() + datetime.timedelta(seconds=seconds)
def _revoke(self, cert, rsn, url):
def _revoke(self, cert: jose.ComparableX509, rsn: int, url: str) -> None:
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
@@ -224,6 +246,9 @@ class ClientBase:
class Client(ClientBase):
"""ACME client for a v1 API.
.. deprecated:: 1.18.0
Use :class:`ClientV2` instead.
.. todo::
Clean up raised error types hierarchy, document, and handle (wrap)
instances of `.DeserializationError` raised in `from_json()`.
@@ -238,8 +263,9 @@ class Client(ClientBase):
"""
def __init__(self, directory, key, alg=jose.RS256, verify_ssl=True,
net=None):
def __init__(self, directory: messages.Directory, key: jose.JWK,
alg: jose.JWASignature = jose.RS256, verify_ssl: bool = True,
net: Optional['ClientNetwork'] = None) -> None:
"""Initialize.
:param directory: Directory Resource (`.messages.Directory`) or
@@ -254,9 +280,10 @@ class Client(ClientBase):
directory = messages.Directory.from_json(
net.get(directory).json())
super().__init__(directory=directory,
net=net, acme_version=1)
net=net, acme_version=1)
def register(self, new_reg=None):
def register(self, new_reg: Optional[messages.NewRegistration] = None
) -> messages.RegistrationResource:
"""Register.
:param .NewRegistration new_reg:
@@ -273,16 +300,18 @@ class Client(ClientBase):
# "Instance of 'Field' has no key/contact member" bug:
return self._regr_from_response(response)
def query_registration(self, regr):
def query_registration(self, regr: messages.RegistrationResource
) -> messages.RegistrationResource:
"""Query server about registration.
:param messages.RegistrationResource: Existing Registration
:param messages.RegistrationResource regr: Existing Registration
Resource.
"""
return self._send_recv_regr(regr, messages.UpdateRegistration())
def agree_to_tos(self, regr):
def agree_to_tos(self, regr: messages.RegistrationResource
) -> messages.RegistrationResource:
"""Agree to the terms-of-service.
Agree to the terms-of-service in a Registration Resource.
@@ -297,7 +326,8 @@ class Client(ClientBase):
return self.update_registration(
regr.update(body=regr.body.update(agreement=regr.terms_of_service)))
def request_challenges(self, identifier, new_authzr_uri=None):
def request_challenges(self, identifier: messages.Identifier,
new_authzr_uri: Optional[str] = None) -> messages.AuthorizationResource:
"""Request challenges.
:param .messages.Identifier identifier: Identifier to be challenged.
@@ -323,7 +353,8 @@ class Client(ClientBase):
assert response.status_code == http_client.CREATED
return self._authzr_from_response(response, identifier)
def request_domain_challenges(self, domain, new_authzr_uri=None):
def request_domain_challenges(self, domain: str, new_authzr_uri: Optional[str] = None
) -> messages.AuthorizationResource:
"""Request challenges for domain names.
This is simply a convenience function that wraps around
@@ -343,7 +374,9 @@ class Client(ClientBase):
return self.request_challenges(messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value=domain), new_authzr_uri)
def request_issuance(self, csr, authzrs):
def request_issuance(self, csr: jose.ComparableX509,
authzrs: Iterable[messages.AuthorizationResource]
) -> messages.CertificateResource:
"""Request issuance.
:param csr: CSR
@@ -380,7 +413,8 @@ class Client(ClientBase):
body=jose.ComparableX509(OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_ASN1, response.content)))
def poll(self, authzr):
def poll(self, authzr: messages.AuthorizationResource
) -> Tuple[messages.AuthorizationResource, requests.Response]:
"""Poll Authorization Resource for status.
:param authzr: Authorization Resource
@@ -396,8 +430,11 @@ class Client(ClientBase):
response, authzr.body.identifier, authzr.uri)
return updated_authzr, response
def poll_and_request_issuance(
self, csr, authzrs, mintime=5, max_attempts=10):
def poll_and_request_issuance(self, csr: jose.ComparableX509,
authzrs: Iterable[messages.AuthorizationResource],
mintime: int = 5, max_attempts: int = 10
) -> Tuple[messages.CertificateResource,
Tuple[messages.AuthorizationResource, ...]]:
"""Poll and request issuance.
This function polls all provided Authorization Resource URIs
@@ -471,7 +508,7 @@ class Client(ClientBase):
updated_authzrs = tuple(updated[authzr] for authzr in authzrs)
return self.request_issuance(csr, updated_authzrs), updated_authzrs
def _get_cert(self, uri):
def _get_cert(self, uri: str) -> Tuple[requests.Response, jose.ComparableX509]:
"""Returns certificate from URI.
:param str uri: URI of certificate
@@ -487,7 +524,7 @@ class Client(ClientBase):
return response, jose.ComparableX509(OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_ASN1, response.content))
def check_cert(self, certr):
def check_cert(self, certr: messages.CertificateResource) -> messages.CertificateResource:
"""Check for new cert.
:param certr: Certificate Resource
@@ -506,7 +543,7 @@ class Client(ClientBase):
raise errors.UnexpectedUpdate(response.text)
return certr.update(body=cert)
def refresh(self, certr):
def refresh(self, certr: messages.CertificateResource) -> messages.CertificateResource:
"""Refresh certificate.
:param certr: Certificate Resource
@@ -521,7 +558,8 @@ class Client(ClientBase):
# respond with status code 403 (Forbidden)
return self.check_cert(certr)
def fetch_chain(self, certr, max_length=10):
def fetch_chain(self, certr: messages.CertificateResource,
max_length: int = 10) -> List[jose.ComparableX509]:
"""Fetch chain for certificate.
:param .CertificateResource certr: Certificate Resource
@@ -550,7 +588,7 @@ class Client(ClientBase):
"Recursion limit reached. Didn't get {0}".format(uri))
return chain
def revoke(self, cert, rsn):
def revoke(self, cert: jose.ComparableX509, rsn: int) -> None:
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
@@ -561,7 +599,7 @@ class Client(ClientBase):
:raises .ClientError: If revocation is unsuccessful.
"""
return self._revoke(cert, rsn, self.directory[messages.Revocation])
self._revoke(cert, rsn, self.directory[cast(str, messages.Revocation)])
class ClientV2(ClientBase):
@@ -571,16 +609,15 @@ class ClientV2(ClientBase):
:ivar .ClientNetwork net: Client network.
"""
def __init__(self, directory, net):
def __init__(self, directory: messages.Directory, net: 'ClientNetwork') -> None:
"""Initialize.
:param .messages.Directory directory: Directory Resource
:param .ClientNetwork net: Client network.
"""
super().__init__(directory=directory,
net=net, acme_version=2)
super().__init__(directory=directory, net=net, acme_version=2)
def new_account(self, new_account):
def new_account(self, new_account: messages.NewRegistration) -> messages.RegistrationResource:
"""Register.
:param .NewRegistration new_account:
@@ -593,16 +630,17 @@ class ClientV2(ClientBase):
response = self._post(self.directory['newAccount'], new_account)
# if account already exists
if response.status_code == 200 and 'Location' in response.headers:
raise errors.ConflictError(response.headers.get('Location'))
raise errors.ConflictError(response.headers['Location'])
# "Instance of 'Field' has no key/contact member" bug:
regr = self._regr_from_response(response)
self.net.account = regr
return regr
def query_registration(self, regr):
def query_registration(self, regr: messages.RegistrationResource
) -> messages.RegistrationResource:
"""Query server about registration.
:param messages.RegistrationResource: Existing Registration
:param messages.RegistrationResource regr: Existing Registration
Resource.
"""
@@ -614,7 +652,9 @@ class ClientV2(ClientBase):
terms_of_service=regr.terms_of_service)
return self.net.account
def update_registration(self, regr, update=None):
def update_registration(self, regr: messages.RegistrationResource,
update: Optional[messages.Registration] = None
) -> messages.RegistrationResource:
"""Update registration.
:param messages.RegistrationResource regr: Registration Resource.
@@ -629,7 +669,7 @@ class ClientV2(ClientBase):
new_regr = self._get_v2_account(regr)
return super().update_registration(new_regr, update)
def _get_v2_account(self, regr):
def _get_v2_account(self, regr: messages.RegistrationResource) -> messages.RegistrationResource:
self.net.account = None
only_existing_reg = regr.body.update(only_return_existing=True)
response = self._post(self.directory['newAccount'], only_existing_reg)
@@ -638,10 +678,10 @@ class ClientV2(ClientBase):
self.net.account = new_regr
return new_regr
def new_order(self, csr_pem):
def new_order(self, csr_pem: bytes) -> messages.OrderResource:
"""Request a new Order object from the server.
:param str csr_pem: A CSR in PEM format.
:param bytes csr_pem: A CSR in PEM format.
:returns: The newly created order.
:rtype: OrderResource
@@ -649,16 +689,23 @@ class ClientV2(ClientBase):
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# pylint: disable=protected-access
dnsNames = crypto_util._pyopenssl_cert_or_req_all_names(csr)
ipNames = crypto_util._pyopenssl_cert_or_req_san_ip(csr)
# ipNames is now a list of strings
identifiers = []
for name in dnsNames:
identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_FQDN,
value=name))
for ips in ipNames:
identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_IP,
value=ips))
order = messages.NewOrder(identifiers=identifiers)
response = self._post(self.directory['newOrder'], order)
body = messages.Order.from_json(response.json())
authorizations = []
for url in body.authorizations:
# pylint has trouble understanding our josepy based objects which use
# things like custom metaclass logic. body.authorizations should be a
# list of strings containing URLs so let's disable this check here.
for url in body.authorizations: # pylint: disable=not-an-iterable
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
return messages.OrderResource(
body=body,
@@ -666,7 +713,8 @@ class ClientV2(ClientBase):
authorizations=authorizations,
csr_pem=csr_pem)
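The reworked `new_order` extracts both DNS and IP SANs from the CSR and turns them into identifiers. Combined with the `make_csr` changes further down, a v2 order request might look roughly like this (the `client` object is assumed to be an already configured `ClientV2`):
import ipaddress
import OpenSSL
from acme import crypto_util

pkey = OpenSSL.crypto.PKey()
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
private_key_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, pkey)

csr_pem = crypto_util.make_csr(
    private_key_pem,
    domains=["example.com", "www.example.com"],
    ipaddrs=[ipaddress.ip_address("203.0.113.10")])

# With `client` being a configured acme.client.ClientV2 instance:
# orderr = client.new_order(csr_pem)
# orderr.body.identifiers then contains both dns and ip identifiers.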
def poll(self, authzr):
def poll(self, authzr: messages.AuthorizationResource
) -> Tuple[messages.AuthorizationResource, requests.Response]:
"""Poll Authorization Resource for status.
:param authzr: Authorization Resource
@@ -682,7 +730,8 @@ class ClientV2(ClientBase):
response, authzr.body.identifier, authzr.uri)
return updated_authzr, response
def poll_and_finalize(self, orderr, deadline=None):
def poll_and_finalize(self, orderr: messages.OrderResource,
deadline: Optional[datetime.datetime] = None) -> messages.OrderResource:
"""Poll authorizations and finalize the order.
If no deadline is provided, this method will timeout after 90
@@ -700,7 +749,8 @@ class ClientV2(ClientBase):
orderr = self.poll_authorizations(orderr, deadline)
return self.finalize_order(orderr, deadline)
def poll_authorizations(self, orderr, deadline):
def poll_authorizations(self, orderr: messages.OrderResource, deadline: datetime.datetime
) -> messages.OrderResource:
"""Poll Order Resource for status."""
responses = []
for url in orderr.body.authorizations:
@@ -724,7 +774,8 @@ class ClientV2(ClientBase):
raise errors.ValidationError(failed)
return orderr.update(authorizations=responses)
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
def finalize_order(self, orderr: messages.OrderResource, deadline: datetime.datetime,
fetch_alternative_chains: bool = False) -> messages.OrderResource:
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
@@ -756,7 +807,7 @@ class ClientV2(ClientBase):
return orderr
raise errors.TimeoutError()
def revoke(self, cert, rsn):
def revoke(self, cert: jose.ComparableX509, rsn: int) -> None:
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
@@ -767,13 +818,13 @@ class ClientV2(ClientBase):
:raises .ClientError: If revocation is unsuccessful.
"""
return self._revoke(cert, rsn, self.directory['revokeCert'])
self._revoke(cert, rsn, self.directory['revokeCert'])
def external_account_required(self):
def external_account_required(self) -> bool:
"""Checks if ACME server requires External Account Binding authentication."""
return hasattr(self.directory, 'meta') and self.directory.meta.external_account_required
def _post_as_get(self, *args, **kwargs):
def _post_as_get(self, *args: Any, **kwargs: Any) -> requests.Response:
"""
Send GET request using the POST-as-GET protocol.
:param args:
@@ -783,7 +834,7 @@ class ClientV2(ClientBase):
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)
def _get_links(self, response, relation_type):
def _get_links(self, response: requests.Response, relation_type: str) -> List[str]:
"""
Retrieves all Link URIs of relation_type from the response.
:param requests.Response response: The requests HTTP response.
@@ -802,6 +853,9 @@ class BackwardsCompatibleClientV2:
"""ACME client wrapper that tends towards V2-style calls, but
supports V1 servers.
.. deprecated:: 1.18.0
Use :class:`ClientV2` instead.
.. note:: While this class handles the majority of the differences
between versions of the ACME protocol, if you need to support an
ACME server based on version 3 or older of the IETF ACME draft
@@ -817,7 +871,7 @@ class BackwardsCompatibleClientV2:
:ivar .ClientBase client: either Client or ClientV2
"""
def __init__(self, net, key, server):
def __init__(self, net: 'ClientNetwork', key: jose.JWK, server: str) -> None:
directory = messages.Directory.from_json(net.get(server).json())
self.acme_version = self._acme_version_from_directory(directory)
self.client: Union[Client, ClientV2]
@@ -826,17 +880,19 @@ class BackwardsCompatibleClientV2:
else:
self.client = ClientV2(directory, net=net)
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
return getattr(self.client, name)
def new_account_and_tos(self, regr, check_tos_cb=None):
def new_account_and_tos(self, regr: messages.NewRegistration,
check_tos_cb: Optional[Callable[[str], None]] = None
) -> messages.RegistrationResource:
"""Combined register and agree_tos for V1, new_account for V2
:param .NewRegistration regr:
:param callable check_tos_cb: callback that raises an error if
the check does not work
"""
def _assess_tos(tos):
def _assess_tos(tos: str) -> None:
if check_tos_cb is not None:
check_tos_cb(tos)
if self.acme_version == 1:
@@ -853,13 +909,13 @@ class BackwardsCompatibleClientV2:
regr = regr.update(terms_of_service_agreed=True)
return client_v2.new_account(regr)
def new_order(self, csr_pem):
def new_order(self, csr_pem: bytes) -> messages.OrderResource:
"""Request a new Order object from the server.
If using ACMEv1, returns a dummy OrderResource with only
the authorizations field filled in.
:param str csr_pem: A CSR in PEM format.
:param bytes csr_pem: A CSR in PEM format.
:returns: The newly created order.
:rtype: OrderResource
@@ -879,7 +935,8 @@ class BackwardsCompatibleClientV2:
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
return cast(ClientV2, self.client).new_order(csr_pem)
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
def finalize_order(self, orderr: messages.OrderResource, deadline: datetime.datetime,
fetch_alternative_chains: bool = False) -> messages.OrderResource:
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
@@ -914,13 +971,13 @@ class BackwardsCompatibleClientV2:
cert = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, certr.body.wrapped).decode()
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
chain_str = crypto_util.dump_pyopenssl_chain(chain).decode()
return orderr.update(fullchain_pem=(cert + chain))
return orderr.update(fullchain_pem=(cert + chain_str))
return cast(ClientV2, self.client).finalize_order(
orderr, deadline, fetch_alternative_chains)
def revoke(self, cert, rsn):
def revoke(self, cert: jose.ComparableX509, rsn: int) -> None:
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
@@ -931,14 +988,14 @@ class BackwardsCompatibleClientV2:
:raises .ClientError: If revocation is unsuccessful.
"""
return self.client.revoke(cert, rsn)
self.client.revoke(cert, rsn)
def _acme_version_from_directory(self, directory):
def _acme_version_from_directory(self, directory: messages.Directory) -> int:
if hasattr(directory, 'newNonce'):
return 2
return 1
def external_account_required(self):
def external_account_required(self) -> bool:
"""Checks if the server requires an external account for ACMEv2 servers.
Always returns False for ACMEv1 servers, as they don't use External Account Binding."""
@@ -970,9 +1027,10 @@ class ClientNetwork:
:param source_address: Optional source address to bind to when making requests.
:type source_address: str or tuple(str, int)
"""
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT,
source_address=None):
def __init__(self, key: jose.JWK, account: Optional[messages.RegistrationResource] = None,
alg: jose.JWASignature = jose.RS256, verify_ssl: bool = True,
user_agent: str = 'acme-python', timeout: int = DEFAULT_NETWORK_TIMEOUT,
source_address: Optional[Union[str, Tuple[str, int]]] = None) -> None:
self.key = key
self.account = account
self.alg = alg
@@ -989,7 +1047,7 @@ class ClientNetwork:
self.session.mount("http://", adapter)
self.session.mount("https://", adapter)
def __del__(self):
def __del__(self) -> None:
# Try to close the session, but don't show exceptions to the
# user if the call to close() fails. See #4840.
try:
@@ -997,14 +1055,15 @@ class ClientNetwork:
except Exception: # pylint: disable=broad-except
pass
def _wrap_in_jws(self, obj, nonce, url, acme_version):
def _wrap_in_jws(self, obj: jose.JSONDeSerializable, nonce: str, url: str,
acme_version: int) -> jose.JWS:
"""Wrap `JSONDeSerializable` object in JWS.
.. todo:: Implement ``acmePath``.
:param josepy.JSONDeSerializable obj:
:param str url: The URL to which this object will be POSTed
:param bytes nonce:
:param str nonce:
:rtype: `josepy.JWS`
"""
@@ -1026,7 +1085,8 @@ class ClientNetwork:
return jws.JWS.sign(jobj, **kwargs).json_dumps(indent=2)
@classmethod
def _check_response(cls, response, content_type=None):
def _check_response(cls, response: requests.Response,
content_type: Optional[str] = None) -> requests.Response:
"""Check response content and its type.
.. note::
@@ -1056,7 +1116,7 @@ class ClientNetwork:
jobj = None
if response.status_code == 409:
raise errors.ConflictError(response.headers.get('Location'))
raise errors.ConflictError(response.headers.get('Location', 'UNKNOWN-LOCATION'))
if not response.ok:
if jobj is not None:
@@ -1084,7 +1144,7 @@ class ClientNetwork:
return response
def _send_request(self, method, url, *args, **kwargs):
def _send_request(self, method: str, url: str, *args: Any, **kwargs: Any) -> requests.Response:
"""Send HTTP request.
Makes sure that `verify_ssl` is respected. Logs request and
@@ -1135,13 +1195,23 @@ class ClientNetwork:
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
# If content is DER, log the base64 of it instead of raw bytes, to keep
# binary data out of the logs.
# If the Content-Type is DER or an Accept header was sent in the
# request, the response may not be UTF-8 encoded. In this case, we
# don't set response.encoding and log the base64 response instead of
# raw bytes to keep binary data out of the logs. This code can be
# simplified to only check for an Accept header in the request when
# ACMEv1 support is dropped.
debug_content: Union[bytes, str]
if response.headers.get("Content-Type") == DER_CONTENT_TYPE:
if (response.headers.get("Content-Type") == DER_CONTENT_TYPE or
"Accept" in kwargs["headers"]):
debug_content = base64.b64encode(response.content)
else:
debug_content = response.content.decode("utf-8")
# We set response.encoding so response.text knows the response is
# UTF-8 encoded instead of trying to guess the encoding that was
# used which is error prone. This setting affects all future
# accesses of .text made on the returned response object as well.
response.encoding = "utf-8"
debug_content = response.text
logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
response.status_code,
"\n".join("{0}: {1}".format(k, v)
@@ -1149,7 +1219,7 @@ class ClientNetwork:
debug_content)
return response
def head(self, *args, **kwargs):
def head(self, *args: Any, **kwargs: Any) -> requests.Response:
"""Send HEAD request without checking the response.
Note, that `_check_response` is not called, as it is expected
@@ -1159,12 +1229,13 @@ class ClientNetwork:
"""
return self._send_request('HEAD', *args, **kwargs)
def get(self, url, content_type=JSON_CONTENT_TYPE, **kwargs):
def get(self, url: str, content_type: str = JSON_CONTENT_TYPE,
**kwargs: Any) -> requests.Response:
"""Send GET request and check response."""
return self._check_response(
self._send_request('GET', url, **kwargs), content_type=content_type)
def _add_nonce(self, response):
def _add_nonce(self, response: requests.Response) -> None:
if self.REPLAY_NONCE_HEADER in response.headers:
nonce = response.headers[self.REPLAY_NONCE_HEADER]
try:
@@ -1176,7 +1247,7 @@ class ClientNetwork:
else:
raise errors.MissingNonce(response)
def _get_nonce(self, url, new_nonce_url):
def _get_nonce(self, url: str, new_nonce_url: str) -> str:
if not self._nonces:
logger.debug('Requesting fresh nonce')
if new_nonce_url is None:
@@ -1187,7 +1258,7 @@ class ClientNetwork:
self._add_nonce(response)
return self._nonces.pop()
def post(self, *args, **kwargs):
def post(self, *args: Any, **kwargs: Any) -> requests.Response:
"""POST object wrapped in `.JWS` and check response.
If the server responded with a badNonce error, the request will
@@ -1202,8 +1273,9 @@ class ClientNetwork:
return self._post_once(*args, **kwargs)
raise
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
acme_version=1, **kwargs):
def _post_once(self, url: str, obj: jose.JSONDeSerializable,
content_type: str = JOSE_CONTENT_TYPE, acme_version: int = 1,
**kwargs: Any) -> requests.Response:
new_nonce_url = kwargs.pop('new_nonce_url', None)
data = self._wrap_in_jws(obj, self._get_nonce(url, new_nonce_url), url, acme_version)
kwargs.setdefault('headers', {'Content-Type': content_type})
@@ -1211,3 +1283,35 @@ class ClientNetwork:
response = self._check_response(response, content_type=content_type)
self._add_nonce(response)
return response
# This class takes a similar approach to the cryptography project to deprecate attributes
# in public modules. See the _ModuleWithDeprecation class here:
# https://github.com/pyca/cryptography/blob/91105952739442a74582d3e62b3d2111365b0dc7/src/cryptography/utils.py#L129
class _ClientDeprecationModule:
"""
Internal class delegating to a module, and displaying warnings when
deprecated attributes of the acme.client module are accessed.
"""
def __init__(self, module: ModuleType) -> None:
self.__dict__['_module'] = module
def __getattr__(self, attr: str) -> Any:
if attr in ('Client', 'BackwardsCompatibleClientV2'):
warnings.warn('The {0} attribute in acme.client is deprecated '
'and will be removed soon.'.format(attr),
DeprecationWarning, stacklevel=2)
return getattr(self._module, attr)
def __setattr__(self, attr: str, value: Any) -> None: # pragma: no cover
setattr(self._module, attr, value)
def __delattr__(self, attr: str) -> None: # pragma: no cover
delattr(self._module, attr)
def __dir__(self) -> List[str]: # pragma: no cover
return ['_module'] + dir(self._module)
# Patching ourselves to warn about deprecation and planned removal of some elements in the module.
sys.modules[__name__] = cast(ModuleType, _ClientDeprecationModule(sys.modules[__name__]))
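The `sys.modules` patching at the end of the file is a general pattern: swap the module object for a proxy that warns on access to deprecated names. A stripped-down, self-contained version of the same idea (module and attribute names are made up):
import sys
import warnings
from types import ModuleType
from typing import Any, List, cast


class _DeprecationProxy:
    """Delegate to a module and warn when deprecated attributes are accessed."""

    def __init__(self, module: ModuleType, deprecated: List[str]) -> None:
        self.__dict__['_module'] = module
        self.__dict__['_deprecated'] = deprecated

    def __getattr__(self, attr: str) -> Any:
        if attr in self._deprecated:
            warnings.warn('{0} is deprecated and will be removed soon.'.format(attr),
                          DeprecationWarning, stacklevel=2)
        return getattr(self._module, attr)


# At the bottom of the module being deprecated, replace the module object with
# the proxy; keeping a reference to the original module prevents its globals
# from being garbage collected.
sys.modules[__name__] = cast(
    ModuleType, _DeprecationProxy(sys.modules[__name__], ['OldClient']))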

View File

@@ -1,17 +1,23 @@
"""Crypto utilities."""
import binascii
import contextlib
import ipaddress
import logging
import os
import re
import socket
from typing import Any
from typing import Callable
from typing import List
from typing import Mapping
from typing import Optional
from typing import Set
from typing import Tuple
from typing import Union
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import SSL
from acme import errors
@@ -24,14 +30,14 @@ logger = logging.getLogger(__name__)
# https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
# should be changed to use "set_options" to disable SSLv2 and SSLv3,
# in case it's used for things other than probing/serving!
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD
class _DefaultCertSelection:
def __init__(self, certs):
def __init__(self, certs: Mapping[bytes, Tuple[crypto.PKey, crypto.X509]]):
self.certs = certs
def __call__(self, connection):
def __call__(self, connection: SSL.Connection) -> Optional[Tuple[crypto.PKey, crypto.X509]]:
server_name = connection.get_servername()
return self.certs.get(server_name, None)
@@ -49,9 +55,13 @@ class SSLSocket: # pylint: disable=too-few-public-methods
`certs` parameter would be ignored, and therefore must be empty.
"""
def __init__(self, sock, certs=None,
method=_DEFAULT_SSL_METHOD, alpn_selection=None,
cert_selection=None):
def __init__(self, sock: socket.socket,
certs: Optional[Mapping[bytes, Tuple[crypto.PKey, crypto.X509]]] = None,
method: int = _DEFAULT_SSL_METHOD,
alpn_selection: Optional[Callable[[SSL.Connection, List[bytes]], bytes]] = None,
cert_selection: Optional[Callable[[SSL.Connection],
Tuple[crypto.PKey, crypto.X509]]] = None
) -> None:
self.sock = sock
self.alpn_selection = alpn_selection
self.method = method
@@ -59,14 +69,18 @@ class SSLSocket: # pylint: disable=too-few-public-methods
raise ValueError("Neither cert_selection or certs specified.")
if cert_selection and certs:
raise ValueError("Both cert_selection and certs specified.")
if cert_selection is None:
cert_selection = _DefaultCertSelection(certs)
self.cert_selection = cert_selection
actual_cert_selection: Union[_DefaultCertSelection,
Optional[Callable[[SSL.Connection],
Tuple[crypto.PKey,
crypto.X509]]]] = cert_selection
if actual_cert_selection is None:
actual_cert_selection = _DefaultCertSelection(certs if certs else {})
self.cert_selection = actual_cert_selection
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
return getattr(self.sock, name)
def _pick_certificate_cb(self, connection):
def _pick_certificate_cb(self, connection: SSL.Connection) -> None:
"""SNI certificate callback.
This method will set a new OpenSSL context object for this
@@ -98,17 +112,17 @@ class SSLSocket: # pylint: disable=too-few-public-methods
# pylint: disable=missing-function-docstring
def __init__(self, connection):
def __init__(self, connection: SSL.Connection) -> None:
self._wrapped = connection
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
return getattr(self._wrapped, name)
def shutdown(self, *unused_args):
def shutdown(self, *unused_args: Any) -> bool:
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
return self._wrapped.shutdown()
def accept(self): # pylint: disable=missing-function-docstring
def accept(self) -> Tuple[FakeConnection, Any]: # pylint: disable=missing-function-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
@@ -132,9 +146,9 @@ class SSLSocket: # pylint: disable=too-few-public-methods
return ssl_sock, addr
def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-arguments
method=_DEFAULT_SSL_METHOD, source_address=('', 0),
alpn_protocols=None):
def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, # pylint: disable=too-many-arguments
method: int = _DEFAULT_SSL_METHOD, source_address: Tuple[str, int] = ('', 0),
alpn_protocols: Optional[List[str]] = None) -> crypto.X509:
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -147,7 +161,7 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
of source interface). See `socket.create_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `list` of `bytes`
:type alpn_protocols: `list` of `str`
:raises acme.errors.Error: In case of any problems.
@@ -168,8 +182,8 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
source_address[1]
) if any(source_address) else ""
)
socket_tuple: Tuple[str, int] = (host, port)
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore
socket_tuple: Tuple[bytes, int] = (host, port)
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore[arg-type]
except socket.error as error:
raise errors.Error(error)
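With the tightened signature, `probe_sni` takes the server name and host as byte strings and ALPN protocol names as `str`. A small usage sketch (host and port are placeholders; this performs a live TLS handshake when actually run):
from acme import crypto_util

# Placeholder endpoint; in the TLS-ALPN-01 flow this is the host being
# validated and "acme-tls/1" the only offered ALPN protocol.
cert = crypto_util.probe_sni(
    name=b"example.com", host=b"127.0.0.1", port=5001,
    alpn_protocols=["acme-tls/1"])
print(cert.get_subject())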
@@ -187,23 +201,45 @@ def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-argu
return client_ssl.get_peer_certificate()
def make_csr(private_key_pem, domains, must_staple=False):
"""Generate a CSR containing a list of domains as subjectAltNames.
def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]]] = None,
must_staple: bool = False,
ipaddrs: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None
) -> bytes:
"""Generate a CSR containing domains or IPs as subjectAltNames.
:param buffer private_key_pem: Private key, in PEM PKCS#8 format.
:param list domains: List of DNS names to include in subjectAltNames of CSR.
:param bool must_staple: Whether to include the TLS Feature extension (aka
OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
:param list ipaddrs: List of IP addresses (of type ipaddress.IPv4Address or ipaddress.IPv6Address)
to include in the subjectAltNames of the CSR.
Parameters are ordered this way for backward compatibility when called with positional arguments.
:returns: buffer PEM-encoded Certificate Signing Request.
"""
private_key = crypto.load_privatekey(
crypto.FILETYPE_PEM, private_key_pem)
csr = crypto.X509Req()
sanlist = []
# If the domain or IP list is not supplied, default to an empty list so it's easier to iterate.
if domains is None:
domains = []
if ipaddrs is None:
ipaddrs = []
if len(domains) + len(ipaddrs) == 0:
raise ValueError("At least one of the domains or ipaddrs parameters must be non-empty")
for address in domains:
sanlist.append('DNS:' + address)
for ips in ipaddrs:
sanlist.append('IP:' + ips.exploded)
# make sure it's ASCII-encoded
san_string = ', '.join(sanlist).encode('ascii')
# an IP SAN actually needs to be an octet string,
# but something downstream thankfully handles that for us
extensions = [
crypto.X509Extension(
b'subjectAltName',
critical=False,
value=', '.join('DNS:' + d for d in domains).encode('ascii')
value=san_string
),
]
if must_staple:
@@ -219,7 +255,9 @@ def make_csr(private_key_pem, domains, must_staple=False):
crypto.FILETYPE_PEM, csr)
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req: Union[crypto.X509, crypto.X509Req]
) -> List[str]:
# despite its name, this only outputs DNS names; other identifier types are ignored
common_name = loaded_cert_or_req.get_subject().CN
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
@@ -228,7 +266,7 @@ def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
return [common_name] + [d for d in sans if d != common_name]
def _pyopenssl_cert_or_req_san(cert_or_req):
def _pyopenssl_cert_or_req_san(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
.. todo:: Implement directly in PyOpenSSL!
@@ -239,40 +277,79 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: A list of Subject Alternative Names.
:returns: A list of Subject Alternative Names that are DNS names.
:rtype: `list` of `unicode`
"""
# This function finds SANs by dumping the certificate/CSR to text and
# searching for "X509v3 Subject Alternative Name" in the text. This method
# is used to support PyOpenSSL version 0.13 where the
# `_subjectAltNameString` and `get_extensions` methods are not available
# for CSRs.
# This function finds SANs that are DNS names
# constants based on PyOpenSSL certificate/CSR text dump
part_separator = ":"
parts_separator = ", "
prefix = "DNS" + part_separator
if isinstance(cert_or_req, crypto.X509):
# pylint: disable=line-too-long
func: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]] = crypto.dump_certificate
else:
func = crypto.dump_certificate_request
text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
# WARNING: this function does not support multiple SANs extensions.
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
# WARNING: this function assumes that no SAN can include
# parts_separator, hence the split!
sans_parts = [] if match is None else match.group(1).split(parts_separator)
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
return [part.split(part_separator)[1]
for part in sans_parts if part.startswith(prefix)]
def gen_ss_cert(key, domains, not_before=None,
validity=(7 * 24 * 60 * 60), force_san=True, extensions=None):
def _pyopenssl_cert_or_req_san_ip(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
"""Get Subject Alternative Names IPs from certificate or CSR using pyOpenSSL.
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: A list of Subject Alternative Names that are IP Addresses.
:rtype: `list` of `unicode`. Note that this returns strings, not ipaddress objects.
"""
# constants based on PyOpenSSL certificate/CSR text dump
part_separator = ":"
prefix = "IP Address" + part_separator
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
return [part[len(prefix):] for part in sans_parts if part.startswith(prefix)]
def _pyopenssl_extract_san_list_raw(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
"""Get raw SAN string from cert or csr, parse it as UTF-8 and return.
:param cert_or_req: Certificate or CSR.
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
:returns: raw SAN strings, decoded from bytes as UTF-8
:rtype: `list` of `unicode`
"""
# This function finds SANs by dumping the certificate/CSR to text and
# searching for "X509v3 Subject Alternative Name" in the text. This method
# is used because in PyOpenSSL versions <0.17 the `_subjectAltNameString` method is
# not able to parse IP addresses in the subjectAltName string.
if isinstance(cert_or_req, crypto.X509):
# pylint: disable=line-too-long
text = crypto.dump_certificate(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
else:
text = crypto.dump_certificate_request(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
# WARNING: this function does not support multiple SANs extensions.
# Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
raw_san = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
parts_separator = ", "
# WARNING: this function assumes that no SAN can include
# parts_separator, hence the split!
sans_parts = [] if raw_san is None else raw_san.group(1).split(parts_separator)
return sans_parts
def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
not_before: Optional[int] = None,
validity: int = (7 * 24 * 60 * 60), force_san: bool = True,
extensions: Optional[List[crypto.X509Extension]] = None,
ips: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None
) -> crypto.X509:
"""Generate new self-signed certificate.
:type domains: `list` of `unicode`
@@ -280,6 +357,7 @@ def gen_ss_cert(key, domains, not_before=None,
:param bool force_san:
:param extensions: List of additional extensions to include in the cert.
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
:type ips: `list` of (`ipaddress.IPv4Address` or `ipaddress.IPv6Address`)
If more than one domain is provided, all of the domains are put into
``subjectAltName`` X.509 extension and first domain is set as the
@@ -287,28 +365,39 @@ def gen_ss_cert(key, domains, not_before=None,
extension is used, unless `force_san` is ``True``.
"""
assert domains, "Must provide one or more hostnames for the cert."
assert domains or ips, "Must provide one or more hostnames or IPs for the cert."
cert = crypto.X509()
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
if extensions is None:
extensions = []
if domains is None:
domains = []
if ips is None:
ips = []
extensions.append(
crypto.X509Extension(
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
)
cert.get_subject().CN = domains[0]
if len(domains) > 0:
cert.get_subject().CN = domains[0]
# TODO: what to put into cert.get_subject()?
cert.set_issuer(cert.get_subject())
if force_san or len(domains) > 1:
sanlist = []
for address in domains:
sanlist.append('DNS:' + address)
for ip in ips:
sanlist.append('IP:' + ip.exploded)
san_string = ', '.join(sanlist).encode('ascii')
if force_san or len(domains) > 1 or len(ips) > 0:
extensions.append(crypto.X509Extension(
b"subjectAltName",
critical=False,
value=b", ".join(b"DNS:" + d.encode() for d in domains)
value=san_string
))
cert.add_extensions(extensions)
@@ -321,7 +410,7 @@ def gen_ss_cert(key, domains, not_before=None,
return cert
def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
def dump_pyopenssl_chain(chain: List[crypto.X509], filetype: int = crypto.FILETYPE_PEM) -> bytes:
"""Dump certificate chain into a bundle.
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
@@ -334,7 +423,7 @@ def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
# XXX: returns empty string when no chain is available, which
# shuts up RenewableCert, but might not be the best solution...
def _dump_cert(cert):
def _dump_cert(cert: Union[jose.ComparableX509, crypto.X509]) -> bytes:
if isinstance(cert, jose.ComparableX509):
cert = cert.wrapped
return crypto.dump_certificate(filetype, cert)
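To see the new SAN helpers in action, a CSR built with `make_csr` containing both DNS and IP entries can be read back through them (the helpers are private, so this only illustrates what the text-dump parsing returns):
import ipaddress
import OpenSSL
from acme import crypto_util

pkey = OpenSSL.crypto.PKey()
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, pkey)

csr_pem = crypto_util.make_csr(
    pem, domains=["example.com"],
    ipaddrs=[ipaddress.ip_address("203.0.113.1")])
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)

# pylint: disable=protected-access
print(crypto_util._pyopenssl_cert_or_req_san(csr))     # ['example.com']
print(crypto_util._pyopenssl_cert_or_req_san_ip(csr))  # ['203.0.113.1']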

View File

@@ -1,5 +1,17 @@
"""ACME errors."""
import typing
from typing import Any
from typing import List
from typing import Mapping
from typing import Set
from josepy import errors as jose_errors
import requests
# We import acme.messages only during type check to avoid circular dependencies. Type references
# to acme.messages.* must be quoted to be lazily initialized and avoid compilation errors.
if typing.TYPE_CHECKING:
from acme import messages # pragma: no cover
class Error(Exception):
@@ -28,12 +40,12 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce, error, *args):
def __init__(self, nonce: str, error: Exception, *args: Any) -> None:
super().__init__(*args)
self.nonce = nonce
self.error = error
def __str__(self):
def __str__(self) -> str:
return 'Invalid nonce ({0!r}): {1}'.format(self.nonce, self.error)
@@ -47,11 +59,11 @@ class MissingNonce(NonceError):
:ivar requests.Response ~.response: HTTP Response
"""
def __init__(self, response, *args):
def __init__(self, response: requests.Response, *args: Any) -> None:
super().__init__(*args)
self.response = response
def __str__(self):
def __str__(self) -> str:
return ('Server {0} response did not include a replay '
'nonce, headers: {1} (This may be a service outage)'.format(
self.response.request.method, self.response.headers))
@@ -69,17 +81,20 @@ class PollError(ClientError):
to the most recently updated one
"""
def __init__(self, exhausted, updated):
def __init__(self, exhausted: Set['messages.AuthorizationResource'],
updated: Mapping['messages.AuthorizationResource',
'messages.AuthorizationResource']
) -> None:
self.exhausted = exhausted
self.updated = updated
super().__init__()
@property
def timeout(self):
def timeout(self) -> bool:
"""Was the error caused by timeout?"""
return bool(self.exhausted)
def __repr__(self):
def __repr__(self) -> str:
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
@@ -88,7 +103,7 @@ class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
"""
def __init__(self, failed_authzrs):
def __init__(self, failed_authzrs: List['messages.AuthorizationResource']) -> None:
self.failed_authzrs = failed_authzrs
super().__init__()
@@ -100,7 +115,7 @@ class TimeoutError(Error): # pylint: disable=redefined-builtin
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
def __init__(self, error):
def __init__(self, error: 'messages.Error') -> None:
"""Initialize.
:param messages.Error error: The error provided by the server.
@@ -117,7 +132,7 @@ class ConflictError(ClientError):
Also used in V2 of the ACME client for the same purpose.
"""
def __init__(self, location):
def __init__(self, location: str) -> None:
self.location = location
super().__init__()

View File

@@ -1,4 +1,7 @@
"""ACME JSON fields."""
import datetime
from typing import Any
import logging
import josepy as jose
@@ -10,17 +13,17 @@ logger = logging.getLogger(__name__)
class Fixed(jose.Field):
"""Fixed field."""
def __init__(self, json_name, value):
def __init__(self, json_name: str, value: Any) -> None:
self.value = value
super().__init__(
json_name=json_name, default=value, omitempty=False)
def decode(self, value):
def decode(self, value: Any) -> Any:
if value != self.value:
raise jose.DeserializationError('Expected {0!r}'.format(self.value))
return self.value
def encode(self, value):
def encode(self, value: Any) -> Any:
if value != self.value:
logger.warning(
'Overriding fixed field (%s) with %r', self.json_name, value)
@@ -37,11 +40,11 @@ class RFC3339Field(jose.Field):
"""
@classmethod
def default_encoder(cls, value):
def default_encoder(cls, value: datetime.datetime) -> str:
return pyrfc3339.generate(value)
@classmethod
def default_decoder(cls, value):
def default_decoder(cls, value: str) -> datetime.datetime:
try:
return pyrfc3339.parse(value)
except ValueError as error:
@@ -51,12 +54,12 @@ class RFC3339Field(jose.Field):
class Resource(jose.Field):
"""Resource MITM field."""
def __init__(self, resource_type, *args, **kwargs):
def __init__(self, resource_type: str, *args: Any, **kwargs: Any) -> None:
self.resource_type = resource_type
super().__init__(
'resource', default=resource_type, *args, **kwargs)
def decode(self, value):
def decode(self, value: Any) -> Any:
if value != self.resource_type:
raise jose.DeserializationError(
'Wrong resource type: {0} instead of {1}'.format(

View File

@@ -4,6 +4,8 @@ The JWS implementation in josepy only implements the base JOSE standard. In
order to support the new header fields defined in ACME, this module defines some
ACME-specific classes that layer on top of josepy.
"""
from typing import Optional
import josepy as jose
@@ -17,7 +19,7 @@ class Header(jose.Header):
# Mypy does not understand the josepy magic happening here, and falsely claims
# that nonce is redefined. Let's ignore the type check here.
@nonce.decoder # type: ignore
def nonce(value): # pylint: disable=no-self-argument,missing-function-docstring
def nonce(value: str) -> bytes: # pylint: disable=no-self-argument,missing-function-docstring
try:
return jose.decode_b64jose(value)
except jose.DeserializationError as error:
@@ -46,11 +48,12 @@ class JWS(jose.JWS):
@classmethod
# pylint: disable=arguments-differ
def sign(cls, payload, key, alg, nonce, url=None, kid=None):
def sign(cls, payload: bytes, key: jose.JWK, alg: jose.JWASignature, nonce: Optional[bytes],
url: Optional[str] = None, kid: Optional[str] = None) -> jose.JWS:
# Per ACME spec, jwk and kid are mutually exclusive, so only include a
# jwk field if kid is not provided.
include_jwk = kid is None
return super().sign(payload, key=key, alg=alg,
protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
nonce=nonce, url=url, kid=kid,
include_jwk=include_jwk)
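A brief sketch of signing an ACME request body with the typed JWS.sign shown above; the key file, nonce, and URL are placeholders, and kid is left as None so the JWK itself is embedded, matching the mutual-exclusivity comment:

    import josepy as jose
    from acme import jws as acme_jws

    with open("account_key.pem", "rb") as f:  # placeholder account key in PEM form
        account_key = jose.JWKRSA.load(f.read())

    signed = acme_jws.JWS.sign(
        payload=b'{"termsOfServiceAgreed": true}',
        key=account_key,
        alg=jose.RS256,
        nonce=b"example-nonce",                    # replay nonce previously obtained from the server
        url="https://acme.example/acme/new-acct",  # placeholder request URL
        kid=None,                                  # no account URL yet, so the jwk field is included
    )
    print(signed.json_dumps())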

View File

@@ -6,12 +6,13 @@ available. This code is being kept for now for backwards compatibility.
"""
import warnings
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
from typing import Collection, IO # type: ignore
from typing import Any
warnings.warn("acme.magic_typing is deprecated and will be removed in a future release.",
DeprecationWarning)
class TypingClass:
"""Ignore import errors by getting anything"""
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
return None # pragma: no cover
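In practice the shim now behaves like a thin alias for typing, with a one-time warning; for example:

    # The first import emits the DeprecationWarning declared above; the names are typing's own.
    from acme.magic_typing import Dict, List  # noqa: F401  (same objects as typing.Dict / typing.List)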

View File

@@ -3,7 +3,13 @@ from collections.abc import Hashable
import json
from typing import Any
from typing import Dict
from typing import Iterator
from typing import List
from typing import Mapping
from typing import MutableMapping
from typing import Tuple
from typing import Type
from typing import Optional
import josepy as jose
@@ -57,7 +63,7 @@ ERROR_TYPE_DESCRIPTIONS.update(dict( # add errors with old prefix, deprecate me
(OLD_ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items()))
def is_acme_error(err):
def is_acme_error(err: BaseException) -> bool:
"""Check if argument is an ACME error."""
if isinstance(err, Error) and (err.typ is not None):
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
@@ -79,7 +85,7 @@ class Error(jose.JSONObjectWithFields, errors.Error):
detail = jose.Field('detail', omitempty=True)
@classmethod
def with_code(cls, code, **kwargs):
def with_code(cls, code: str, **kwargs: Any) -> 'Error':
"""Create an Error instance with an ACME Error code.
:unicode code: An ACME error code, like 'dnssec'.
@@ -95,7 +101,7 @@ class Error(jose.JSONObjectWithFields, errors.Error):
return cls(typ=typ, **kwargs) # type: ignore
@property
def description(self):
def description(self) -> Optional[str]:
"""Hardcoded error description based on its type.
:returns: Description if standard ACME error or ``None``.
@@ -105,7 +111,7 @@ class Error(jose.JSONObjectWithFields, errors.Error):
return ERROR_TYPE_DESCRIPTIONS.get(self.typ)
@property
def code(self):
def code(self) -> Optional[str]:
"""ACME error code.
Basically self.typ without the ERROR_PREFIX.
@@ -114,51 +120,53 @@ class Error(jose.JSONObjectWithFields, errors.Error):
:rtype: unicode
"""
code = str(self.typ).split(':')[-1]
code = str(self.typ).rsplit(':', maxsplit=1)[-1]
if code in ERROR_CODES:
return code
return None
def __str__(self):
def __str__(self) -> str:
return b' :: '.join(
part.encode('ascii', 'backslashreplace') for part in
(self.typ, self.description, self.detail, self.title)
if part is not None).decode()
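A small sketch of the typed helpers above, using the standard 'unauthorized' code (the detail text is a placeholder):

    from acme import messages

    err = messages.Error.with_code('unauthorized', detail='Example detail')
    print(err.typ)          # 'urn:ietf:params:acme:error:unauthorized'
    print(err.code)         # 'unauthorized', recovered by splitting the type URN once from the right
    print(err.description)  # hardcoded description looked up in ERROR_TYPE_DESCRIPTIONS
    print(str(err))         # the non-empty parts joined with ' :: '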
class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
class _Constant(jose.JSONDeSerializable, Hashable):
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
def __init__(self, name):
def __init__(self, name: str) -> None:
super().__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.name = name
def to_partial_json(self):
def to_partial_json(self) -> str:
return self.name
@classmethod
def from_json(cls, jobj):
def from_json(cls, jobj: str) -> '_Constant':
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self):
def __repr__(self) -> str:
return '{0}({1})'.format(self.__class__.__name__, self.name)
def __eq__(self, other):
def __eq__(self, other: Any) -> bool:
return isinstance(other, type(self)) and other.name == self.name
def __hash__(self):
def __hash__(self) -> int:
return hash((self.__class__, self.name))
class Status(_Constant):
"""ACME "status" field."""
POSSIBLE_NAMES: dict = {}
POSSIBLE_NAMES: Dict[str, 'Status'] = {}
STATUS_UNKNOWN = Status('unknown')
STATUS_PENDING = Status('pending')
STATUS_PROCESSING = Status('processing')
@@ -172,7 +180,10 @@ STATUS_DEACTIVATED = Status('deactivated')
class IdentifierType(_Constant):
"""ACME identifier type."""
POSSIBLE_NAMES: Dict[str, 'IdentifierType'] = {}
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
IDENTIFIER_IP = IdentifierType('ip') # IdentifierIP in pebble - not in Boulder yet
class Identifier(jose.JSONObjectWithFields):
@@ -189,7 +200,7 @@ class Identifier(jose.JSONObjectWithFields):
class Directory(jose.JSONDeSerializable):
"""Directory."""
_REGISTERED_TYPES: Dict[str, Type[Any]] = {}
_REGISTERED_TYPES: Dict[str, Type['Directory']] = {}
class Meta(jose.JSONObjectWithFields):
"""Directory Meta."""
@@ -199,60 +210,59 @@ class Directory(jose.JSONDeSerializable):
caa_identities = jose.Field('caaIdentities', omitempty=True)
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any) -> None:
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super().__init__(**kwargs)
@property
def terms_of_service(self):
def terms_of_service(self) -> str:
"""URL for the CA TOS"""
return self._terms_of_service or self._terms_of_service_v2
def __iter__(self):
def __iter__(self) -> Iterator[str]:
# When iterating over fields, use the external name 'terms_of_service' instead of
# the internal '_terms_of_service'.
for name in super().__iter__():
yield name[1:] if name == '_terms_of_service' else name
def _internal_name(self, name):
def _internal_name(self, name: str) -> str:
return '_' + name if name == 'terms_of_service' else name
@classmethod
def _canon_key(cls, key):
def _canon_key(cls, key: str) -> str:
return getattr(key, 'resource_type', key)
@classmethod
def register(cls, resource_body_cls: Type[Any]) -> Type[Any]:
def register(cls, resource_body_cls: Type['Directory']) -> Type['Directory']:
"""Register resource."""
resource_type = resource_body_cls.resource_type
assert resource_type not in cls._REGISTERED_TYPES
cls._REGISTERED_TYPES[resource_type] = resource_body_cls
return resource_body_cls
def __init__(self, jobj):
def __init__(self, jobj: Mapping[str, Any]) -> None:
canon_jobj = util.map_keys(jobj, self._canon_key)
# TODO: check that everything is an absolute URL; acme-spec is
# not clear on that
self._jobj = canon_jobj
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
try:
return self[name.replace('_', '-')]
except KeyError as error:
raise AttributeError(str(error))
def __getitem__(self, name):
def __getitem__(self, name: str) -> Any:
try:
return self._jobj[self._canon_key(name)]
except KeyError:
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
return self._jobj
@classmethod
def from_json(cls, jobj):
def from_json(cls, jobj: MutableMapping[str, Any]) -> 'Directory':
jobj['meta'] = cls.Meta.from_json(jobj.pop('meta', {}))
return cls(jobj)
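Given the lookup logic above (_canon_key plus the underscore-to-dash mapping in __getattr__), a Directory built from plain JSON can be read either by key or by attribute; the URL is a placeholder:

    from acme import messages

    d = messages.Directory.from_json({'new-reg': 'https://acme.example/acme/new-reg'})
    print(d['new-reg'])  # 'https://acme.example/acme/new-reg'
    print(d.new_reg)     # same value; attribute access maps '_' to '-'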
@@ -283,7 +293,8 @@ class ExternalAccountBinding:
"""ACME External Account Binding"""
@classmethod
def from_data(cls, account_public_key, kid, hmac_key, directory):
def from_data(cls, account_public_key: jose.JWK, kid: str, hmac_key: str,
directory: Directory) -> Dict[str, Any]:
"""Create External Account Binding Resource from contact details, kid and hmac."""
key_json = json.dumps(account_public_key.to_partial_json()).encode()
@@ -323,7 +334,9 @@ class Registration(ResourceBody):
email_prefix = 'mailto:'
@classmethod
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
def from_data(cls, phone: Optional[str] = None, email: Optional[str] = None,
external_account_binding: Optional[Dict[str, Any]] = None,
**kwargs: Any) -> 'Registration':
"""
Create registration resource from contact details.
@@ -352,19 +365,19 @@ class Registration(ResourceBody):
return cls(**kwargs)
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any) -> None:
"""Note if the user provides a value for the `contact` member."""
if 'contact' in kwargs:
if 'contact' in kwargs and kwargs['contact'] is not None:
# Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
object.__setattr__(self, '_add_contact', True)
super().__init__(**kwargs)
def _filter_contact(self, prefix):
def _filter_contact(self, prefix: str) -> Tuple[str, ...]:
return tuple(
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
if detail.startswith(prefix))
def _add_contact_if_appropriate(self, jobj):
def _add_contact_if_appropriate(self, jobj: Dict[str, Any]) -> Dict[str, Any]:
"""
The `contact` member of Registration objects should not be required when
de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
@@ -381,23 +394,23 @@ class Registration(ResourceBody):
return jobj
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
"""Modify josepy.JSONDeserializable.to_partial_json()"""
jobj = super().to_partial_json()
return self._add_contact_if_appropriate(jobj)
def fields_to_partial_json(self):
def fields_to_partial_json(self) -> Dict[str, Any]:
"""Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
jobj = super().fields_to_partial_json()
return self._add_contact_if_appropriate(jobj)
@property
def phones(self):
def phones(self) -> Tuple[str, ...]:
"""All phones found in the ``contact`` field."""
return self._filter_contact(self.phone_prefix)
@property
def emails(self):
def emails(self) -> Tuple[str, ...]:
"""All emails found in the ``contact`` field."""
return self._filter_contact(self.email_prefix)
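A short sketch of from_data and the contact helpers typed above (the address is a placeholder):

    from acme import messages

    reg = messages.Registration.from_data(email='admin@example.com')
    print(reg.contact)  # ('mailto:admin@example.com',)
    print(reg.emails)   # ('admin@example.com',) after stripping the 'mailto:' prefix
    print(reg.phones)   # () since no phone was given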
@@ -458,39 +471,39 @@ class ChallengeBody(ResourceBody):
error = jose.Field('error', decoder=Error.from_json,
omitempty=True, default=None)
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any) -> None:
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super().__init__(**kwargs)
def encode(self, name):
def encode(self, name: str) -> Any:
return super().encode(self._internal_name(name))
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
jobj = super().to_partial_json()
jobj.update(self.chall.to_partial_json())
return jobj
@classmethod
def fields_from_json(cls, jobj):
def fields_from_json(cls, jobj: Mapping[str, Any]) -> Dict[str, Any]:
jobj_fields = super().fields_from_json(jobj)
jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
return jobj_fields
@property
def uri(self):
def uri(self) -> str:
"""The URL of this challenge."""
return self._url or self._uri
def __getattr__(self, name):
def __getattr__(self, name: str) -> Any:
return getattr(self.chall, name)
def __iter__(self):
def __iter__(self) -> Iterator[str]:
# When iterating over fields, use the external name 'uri' instead of
# the internal '_uri'.
for name in super().__iter__():
yield name[1:] if name == '_uri' else name
def _internal_name(self, name):
def _internal_name(self, name: str) -> str:
return '_' + name if name == 'uri' else name
@@ -505,7 +518,7 @@ class ChallengeResource(Resource):
authzr_uri = jose.Field('authzr_uri')
@property
def uri(self):
def uri(self) -> str:
"""The URL of the challenge body."""
return self.body.uri
@@ -536,11 +549,11 @@ class Authorization(ResourceBody):
# Mypy does not understand the josepy magic happening here, and falsely claims
# that challenge is redefined. Let's ignore the type check here.
@challenges.decoder # type: ignore
def challenges(value): # pylint: disable=no-self-argument,missing-function-docstring
def challenges(value: List[Mapping[str, Any]]) -> Tuple[ChallengeBody, ...]: # pylint: disable=no-self-argument,missing-function-docstring
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
def resolved_combinations(self):
def resolved_combinations(self) -> Tuple[Tuple[Dict[str, Any], ...], ...]:
"""Combinations with challenges instead of indices."""
return tuple(tuple(self.challenges[idx] for idx in combo)
for combo in self.combinations) # pylint: disable=not-an-iterable
@@ -637,9 +650,10 @@ class Order(ResourceBody):
# Mypy does not understand the josepy magic happening here, and falsely claims
# that identifiers is redefined. Let's ignore the type check here.
@identifiers.decoder # type: ignore
def identifiers(value): # pylint: disable=no-self-argument,missing-function-docstring
def identifiers(value: List[Mapping[str, Any]]) -> Tuple[Identifier, ...]: # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
"""Order Resource.

View File

@@ -1,21 +1,23 @@
"""Useful mixins for Challenge and Resource objects"""
from typing import Any
from typing import Dict
class VersionedLEACMEMixin:
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self):
def le_acme_version(self) -> int:
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version):
def le_acme_version(self, version: int) -> None:
# We need to use object.__setattr__ to not depend on the specific implementation of
# __setattr__ in current class (eg. jose.TypedJSONObjectWithFields raises AttributeError
# for any attempt to set an attribute to make objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key, value):
def __setattr__(self, key: str, value: Any) -> None:
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
@@ -28,12 +30,12 @@ class ResourceMixin(VersionedLEACMEMixin):
This mixin generates a RFC8555 compliant JWS payload
by removing the `resource` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'resource')
def fields_to_partial_json(self):
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'resource')
@@ -44,20 +46,21 @@ class TypeMixin(VersionedLEACMEMixin):
This mixin allows generation of a RFC8555 compliant JWS payload
by removing the `type` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self):
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'type')
def fields_to_partial_json(self):
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance, jobj_method, uncompliant_field):
def _safe_jobj_compliance(instance: Any, jobj_method: str,
uncompliant_field: str) -> Dict[str, Any]:
if hasattr(instance, jobj_method):
jobj = getattr(instance, jobj_method)()
jobj: Dict[str, Any] = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
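The effect is easiest to see on a small illustrative class that mixes ResourceMixin into a josepy object (DummyBody is not part of the acme package, just a sketch):

    import josepy as jose
    from acme import fields
    from acme.mixins import ResourceMixin

    class DummyBody(ResourceMixin, jose.JSONObjectWithFields):  # illustrative only
        resource_type = 'dummy'
        resource = fields.Resource(resource_type)
        value = jose.Field('value')

    body = DummyBody(value='x')
    print(body.to_partial_json())  # ACME v1 default: includes 'resource': 'dummy' next to 'value': 'x'
    body.le_acme_version = 2       # stored via object.__setattr__, as explained above
    print(body.to_partial_json())  # the non-compliant 'resource' field is dropped for ACME v2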

0
acme/acme/py.typed Normal file
View File

View File

@@ -7,7 +7,16 @@ import logging
import socket
import socketserver
import threading
from typing import Any
from typing import List
from typing import Mapping
from typing import Optional
from typing import Set
from typing import Tuple
from typing import Type
from OpenSSL import crypto
from OpenSSL import SSL
from acme import challenges
from acme import crypto_util
@@ -18,30 +27,30 @@ logger = logging.getLogger(__name__)
class TLSServer(socketserver.TCPServer):
"""Generic TLS Server."""
def __init__(self, *args, **kwargs):
def __init__(self, *args: Any, **kwargs: Any) -> None:
self.ipv6 = kwargs.pop("ipv6", False)
if self.ipv6:
self.address_family = socket.AF_INET6
else:
self.address_family = socket.AF_INET
self.certs = kwargs.pop("certs", {})
self.method = kwargs.pop(
"method", crypto_util._DEFAULT_SSL_METHOD)
self.method = kwargs.pop("method", crypto_util._DEFAULT_SSL_METHOD)
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
socketserver.TCPServer.__init__(self, *args, **kwargs)
super().__init__(*args, **kwargs)
def _wrap_sock(self):
def _wrap_sock(self) -> None:
self.socket = crypto_util.SSLSocket(
self.socket, cert_selection=self._cert_selection,
alpn_selection=getattr(self, '_alpn_selection', None),
method=self.method)
def _cert_selection(self, connection): # pragma: no cover
def _cert_selection(self, connection: SSL.Connection
) -> Tuple[crypto.PKey, crypto.X509]: # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
return self.certs.get(server_name, None)
def server_bind(self):
def server_bind(self) -> None:
self._wrap_sock()
return socketserver.TCPServer.server_bind(self)
@@ -61,11 +70,15 @@ class BaseDualNetworkedServers:
If two servers are instantiated, they will serve on the same port.
"""
def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
def __init__(self, ServerClass: Type[socketserver.TCPServer], server_address: Tuple[str, int],
*remaining_args: Any, **kwargs: Any) -> None:
port = server_address[1]
self.threads: List[threading.Thread] = []
self.servers: List[socketserver.BaseServer] = []
# Preserve socket error for re-raising, if no servers can be started
last_socket_err: Optional[socket.error] = None
# Must try True first.
# Ubuntu, for example, will fail to bind to IPv4 if we've already bound
# to IPv6. But that's ok, since it will accept IPv4 connections on the IPv6
@@ -82,7 +95,8 @@ class BaseDualNetworkedServers:
logger.debug(
"Successfully bound to %s:%s using %s", new_address[0],
new_address[1], "IPv6" if ip_version else "IPv4")
except socket.error:
except socket.error as e:
last_socket_err = e
if self.servers:
# Already bound using IPv6.
logger.debug(
@@ -101,9 +115,12 @@ class BaseDualNetworkedServers:
# bind to the same port for both servers.
port = server.socket.getsockname()[1]
if not self.servers:
raise socket.error("Could not bind to IPv4 or IPv6.")
if last_socket_err:
raise last_socket_err
else: # pragma: no cover
raise socket.error("Could not bind to IPv4 or IPv6.")
def serve_forever(self):
def serve_forever(self) -> None:
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
@@ -111,11 +128,11 @@ class BaseDualNetworkedServers:
thread.start()
self.threads.append(thread)
def getsocknames(self):
def getsocknames(self) -> List[Tuple[str, int]]:
"""Wraps socketserver.TCPServer.socket.getsockname"""
return [server.socket.getsockname() for server in self.servers]
def shutdown_and_server_close(self):
def shutdown_and_server_close(self) -> None:
"""Wraps socketserver.TCPServer.shutdown, socketserver.TCPServer.server_close, and
threading.Thread.join"""
for server in self.servers:
@@ -131,13 +148,16 @@ class TLSALPN01Server(TLSServer, ACMEServerMixin):
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
def __init__(self, server_address, certs, challenge_certs, ipv6=False):
def __init__(self, server_address: Tuple[str, int],
certs: List[Tuple[crypto.PKey, crypto.X509]],
challenge_certs: Mapping[str, Tuple[crypto.PKey, crypto.X509]],
ipv6: bool = False) -> None:
TLSServer.__init__(
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
def _cert_selection(self, connection):
def _cert_selection(self, connection: SSL.Connection) -> Tuple[crypto.PKey, crypto.X509]:
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
@@ -147,9 +167,9 @@ class TLSALPN01Server(TLSServer, ACMEServerMixin):
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs.get(server_name, None)
return self.challenge_certs[server_name]
def _alpn_selection(self, _connection, alpn_protos):
def _alpn_selection(self, _connection: SSL.Connection, alpn_protos: List[bytes]) -> bytes:
"""Callback to select alpn protocol."""
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
@@ -163,21 +183,22 @@ class TLSALPN01Server(TLSServer, ACMEServerMixin):
class HTTPServer(BaseHTTPServer.HTTPServer):
"""Generic HTTP Server."""
def __init__(self, *args, **kwargs):
def __init__(self, *args: Any, **kwargs: Any) -> None:
self.ipv6 = kwargs.pop("ipv6", False)
if self.ipv6:
self.address_family = socket.AF_INET6
else:
self.address_family = socket.AF_INET
BaseHTTPServer.HTTPServer.__init__(self, *args, **kwargs)
super().__init__(*args, **kwargs)
class HTTP01Server(HTTPServer, ACMEServerMixin):
"""HTTP01 Server."""
def __init__(self, server_address, resources, ipv6=False, timeout=30):
HTTPServer.__init__(
self, server_address, HTTP01RequestHandler.partial_init(
def __init__(self, server_address: Tuple[str, int], resources: Set[challenges.HTTP01],
ipv6: bool = False, timeout: int = 30) -> None:
super().__init__(
server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
@@ -185,8 +206,8 @@ class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
"""HTTP01Server Wrapper. Tries everything for both. Failures for one don't
affect the other."""
def __init__(self, *args, **kwargs):
BaseDualNetworkedServers.__init__(self, HTTP01Server, *args, **kwargs)
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(HTTP01Server, *args, **kwargs)
class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
@@ -201,10 +222,10 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
HTTP01Resource = collections.namedtuple(
"HTTP01Resource", "chall response validation")
def __init__(self, *args, **kwargs):
def __init__(self, *args: Any, **kwargs: Any) -> None:
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
self._timeout = kwargs.pop('timeout', 30)
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
super().__init__(*args, **kwargs)
self.server: HTTP01Server
# In parent class BaseHTTPRequestHandler, 'timeout' is a class-level property but we
@@ -214,7 +235,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
# everyone happy, we statically redefine 'timeout' as a method property, and set the
# timeout value in a new internal instance-level property _timeout.
@property
def timeout(self):
def timeout(self) -> int: # type: ignore[override]
"""
The default timeout this server should apply to requests.
:return: timeout to apply
@@ -222,16 +243,16 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return self._timeout
def log_message(self, format, *args): # pylint: disable=redefined-builtin
def log_message(self, format: str, *args: Any) -> None: # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
def handle(self) -> None:
"""Handle request."""
self.log_message("Incoming request")
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
def do_GET(self): # pylint: disable=invalid-name,missing-function-docstring
def do_GET(self) -> None: # pylint: disable=invalid-name,missing-function-docstring
if self.path == "/":
self.handle_index()
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
@@ -239,21 +260,21 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
else:
self.handle_404()
def handle_index(self):
def handle_index(self) -> None:
"""Handle index page."""
self.send_response(200)
self.send_header("Content-Type", "text/html")
self.end_headers()
self.wfile.write(self.server.server_version.encode())
def handle_404(self):
def handle_404(self) -> None:
"""Handler 404 Not Found errors."""
self.send_response(http_client.NOT_FOUND, message="Not Found")
self.send_header("Content-type", "text/html")
self.end_headers()
self.wfile.write(b"404")
def handle_simple_http_resource(self):
def handle_simple_http_resource(self) -> None:
"""Handle HTTP01 provisioned resources."""
for resource in self.simple_http_resources:
if resource.chall.path == self.path:
@@ -269,7 +290,8 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.path)
@classmethod
def partial_init(cls, simple_http_resources, timeout):
def partial_init(cls, simple_http_resources: Set[challenges.HTTP01],
timeout: int) -> 'functools.partial[HTTP01RequestHandler]':
"""Partially initialize this handler.
This is useful because `socketserver.BaseServer` takes
@@ -285,11 +307,11 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
def log_message(self, format: str, *args: Any) -> None: # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
def handle(self) -> None:
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
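Putting the standalone pieces above together, a rough usage sketch (port 0 and the empty resource set are placeholders):

    from acme.standalone import HTTP01DualNetworkedServers

    servers = HTTP01DualNetworkedServers(("", 0), set())  # port 0 lets the OS pick; no challenge resources yet
    servers.serve_forever()                               # one background thread per successfully bound server
    print(servers.getsocknames())                         # bound (address, port) tuples for IPv4 and/or IPv6
    servers.shutdown_and_server_close()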

View File

@@ -1,6 +1,10 @@
"""ACME utilities."""
from typing import Any
from typing import Callable
from typing import Dict
from typing import Mapping
def map_keys(dikt, func):
def map_keys(dikt: Mapping[Any, Any], func: Callable[[Any], Any]) -> Dict[Any, Any]:
"""Map dictionary keys."""
return {func(key): value for key, value in dikt.items()}
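For reference, the helper simply rebuilds the dictionary with transformed keys:

    from acme.util import map_keys

    print(map_keys({'Terms-Of-Service': 'https://example.test/tos'}, str.lower))
    # {'terms-of-service': 'https://example.test/tos'}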

View File

@@ -58,7 +58,7 @@ master_doc = 'index'
# General information about the project.
project = u'acme-python'
copyright = u'2015-2015, Let\'s Encrypt Project'
copyright = u'2015, Let\'s Encrypt Project'
author = u'Let\'s Encrypt Project'
# The version info for the project you're documenting, acts as replacement for

View File

@@ -1,2 +0,0 @@
python -m acme.standalone -p 1234
curl -k https://localhost:1234

View File

@@ -1 +0,0 @@
../../../acme/testdata/rsa2048_cert.pem

View File

@@ -1 +0,0 @@
../../../acme/testdata/rsa2048_key.pem

View File

@@ -7,4 +7,7 @@
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead
# We also pin our dependencies for increased stability.
-c ../tools/requirements.txt
-e acme[docs]

View File

@@ -3,33 +3,29 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '1.17.0.dev0'
version = '1.23.0.dev0'
install_requires = [
'cryptography>=2.1.4',
# formerly known as acme.jose:
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
'cryptography>=2.5.0',
'josepy>=1.9.0',
'PyOpenSSL>=17.3.0',
'pyrfc3339',
'pytz',
'requests>=2.6.0',
'requests>=2.14.2',
'requests-toolbelt>=0.3.0',
'setuptools>=39.0.1',
]
dev_extras = [
'pytest',
'pytest-xdist',
'tox',
]
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
'sphinx_rtd_theme',
]
test_extras = [
'pytest',
'pytest-xdist',
]
setup(
name='acme',
version=version,
@@ -49,6 +45,7 @@ setup(
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
@@ -57,7 +54,7 @@ setup(
include_package_data=True,
install_requires=install_requires,
extras_require={
'dev': dev_extras,
'docs': docs_extras,
'test': test_extras,
},
)

View File

@@ -326,12 +326,12 @@ class TLSALPN01ResponseTest(unittest.TestCase):
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host='127.0.0.1', port=self.response.PORT, name='foo.com',
host=b'127.0.0.1', port=self.response.PORT, name=b'foo.com',
alpn_protocols=['acme-tls/1'])
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host='8.8.8.8', port=mock.ANY, name='foo.com',
host=b'8.8.8.8', port=mock.ANY, name=b'foo.com',
alpn_protocols=['acme-tls/1'])
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')

View File

@@ -3,6 +3,7 @@
import copy
import datetime
import http.client as http_client
import ipaddress
import json
import unittest
from typing import Dict
@@ -23,6 +24,7 @@ import test_util
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
CSR_SAN_PEM = test_util.load_vector('csr-san.pem')
CSR_MIXED_PEM = test_util.load_vector('csr-mixed.pem')
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
KEY2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
@@ -740,7 +742,7 @@ class ClientV2Test(ClientTestBase):
self.orderr = messages.OrderResource(
body=self.order,
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_SAN_PEM)
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_MIXED_PEM)
def test_new_account(self):
self.response.status_code = http_client.CREATED
@@ -770,7 +772,7 @@ class ClientV2Test(ClientTestBase):
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
mock_post_as_get.side_effect = (authz_response, authz_response2)
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
self.assertEqual(self.client.new_order(CSR_MIXED_PEM), self.orderr)
@mock.patch('acme.client.datetime')
def test_poll_and_finalize(self, mock_datetime):

View File

@@ -1,5 +1,6 @@
"""Tests for acme.crypto_util."""
import itertools
import ipaddress
import socket
import socketserver
import threading
@@ -108,7 +109,6 @@ class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
@@ -174,9 +174,50 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
['chicago-cubs.venafi.example', 'cubs.venafi.example'])
class PyOpenSSLCertOrReqSANIPTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san_ip."""
class RandomSnTest(unittest.TestCase):
"""Test for random certificate serial numbers."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
from acme.crypto_util import _pyopenssl_cert_or_req_san_ip
return _pyopenssl_cert_or_req_san_ip(loader(name))
def _call_cert(self, name):
return self._call(test_util.load_cert, name)
def _call_csr(self, name):
return self._call(test_util.load_csr, name)
def test_cert_no_sans(self):
self.assertEqual(self._call_cert('cert.pem'), [])
def test_csr_no_sans(self):
self.assertEqual(self._call_csr('csr-nosans.pem'), [])
def test_cert_domain_sans(self):
self.assertEqual(self._call_cert('cert-san.pem'), [])
def test_csr_domain_sans(self):
self.assertEqual(self._call_csr('csr-san.pem'), [])
def test_cert_ip_two_sans(self):
self.assertEqual(self._call_cert('cert-ipsans.pem'), ['192.0.2.145', '203.0.113.1'])
def test_csr_ip_two_sans(self):
self.assertEqual(self._call_csr('csr-ipsans.pem'), ['192.0.2.145', '203.0.113.1'])
def test_csr_ipv6_sans(self):
self.assertEqual(self._call_csr('csr-ipv6sans.pem'),
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5'])
def test_cert_ipv6_sans(self):
self.assertEqual(self._call_cert('cert-ipv6sans.pem'),
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5'])
class GenSsCertTest(unittest.TestCase):
"""Test for gen_ss_cert (generation of self-signed cert)."""
def setUp(self):
@@ -187,11 +228,19 @@ class RandomSnTest(unittest.TestCase):
def test_sn_collisions(self):
from acme.crypto_util import gen_ss_cert
for _ in range(self.cert_count):
cert = gen_ss_cert(self.key, ['dummy'], force_san=True)
cert = gen_ss_cert(self.key, ['dummy'], force_san=True,
ips=[ipaddress.ip_address("10.10.10.10")])
self.serial_num.append(cert.get_serial_number())
self.assertGreater(len(set(self.serial_num)), 1)
self.assertGreaterEqual(len(set(self.serial_num)), self.cert_count)
def test_no_name(self):
from acme.crypto_util import gen_ss_cert
with self.assertRaises(AssertionError):
gen_ss_cert(self.key, ips=[ipaddress.ip_address("1.1.1.1")])
gen_ss_cert(self.key)
class MakeCSRTest(unittest.TestCase):
"""Test for standalone functions."""
@@ -223,6 +272,27 @@ class MakeCSRTest(unittest.TestCase):
).get_data(),
)
def test_make_csr_ip(self):
csr_pem = self._call_with_key(["a.example"], False, [ipaddress.ip_address('127.0.0.1'), ipaddress.ip_address('::1')])
self.assertIn(b'--BEGIN CERTIFICATE REQUEST--' , csr_pem)
self.assertIn(b'--END CERTIFICATE REQUEST--' , csr_pem)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEqual(len(csr.get_extensions()), 1)
self.assertEqual(csr.get_extensions()[0].get_data(),
OpenSSL.crypto.X509Extension(
b'subjectAltName',
critical=False,
value=b'DNS:a.example, IP:127.0.0.1, IP:::1',
).get_data(),
)
# For an IP SAN the value actually needs to be an octet string,
# but something downstream thankfully handles that for us.
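A hedged sketch of generating such a CSR directly, mirroring the positional call made through the test wrapper above (the keyword name of the IP-address parameter is not visible in this diff, so positional arguments are used; the key path is a placeholder):

    import ipaddress
    from acme import crypto_util

    with open("rsa2048_key.pem", "rb") as f:  # placeholder RSA private key in PEM form
        key_pem = f.read()

    csr_pem = crypto_util.make_csr(
        key_pem,
        ["a.example"],                        # DNS SANs
        False,                                # must_staple
        [ipaddress.ip_address("127.0.0.1")],  # IP SANs, as in test_make_csr_ip above
    )
    print(csr_pem.splitlines()[0])            # b'-----BEGIN CERTIFICATE REQUEST-----'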
def test_make_csr_must_staple(self):
csr_pem = self._call_with_key(["a.example"], must_staple=True)
csr = OpenSSL.crypto.load_certificate_request(
@@ -241,6 +311,9 @@ class MakeCSRTest(unittest.TestCase):
self.assertEqual(len(must_staple_exts), 1,
"Expected exactly one Must Staple extension")
def test_make_csr_without_hostname(self):
self.assertRaises(ValueError, self._call_with_key)
class DumpPyopensslChainTest(unittest.TestCase):
"""Test for dump_pyopenssl_chain."""

View File

@@ -165,7 +165,6 @@ class TLSALPN01ServerTest(unittest.TestCase):
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
class SingleProtocolServer(socketserver.TCPServer):
"""Server that only serves on a single protocol. FreeBSD has this behavior for AF_INET6."""
def __init__(self, *args, **kwargs):
@@ -175,7 +174,7 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
kwargs["bind_and_activate"] = False
else:
self.address_family = socket.AF_INET
socketserver.TCPServer.__init__(self, *args, **kwargs)
super().__init__(*args, **kwargs)
if ipv6:
# NB: On Windows, socket.IPPROTO_IPV6 constant may be missing.
# We use the corresponding value (41) instead.
@@ -190,12 +189,17 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
@mock.patch("socket.socket.bind")
def test_fail_to_bind(self, mock_bind):
mock_bind.side_effect = socket.error
from errno import EADDRINUSE
from acme.standalone import BaseDualNetworkedServers
self.assertRaises(socket.error, BaseDualNetworkedServers,
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0),
socketserver.BaseRequestHandler)
mock_bind.side_effect = socket.error(EADDRINUSE, "Fake addr in use error")
with self.assertRaises(socket.error) as em:
BaseDualNetworkedServers(
BaseDualNetworkedServersTest.SingleProtocolServer,
('', 0), socketserver.BaseRequestHandler)
self.assertEqual(em.exception.errno, EADDRINUSE)
def test_ports_equal(self):
from acme.standalone import BaseDualNetworkedServers

21
acme/tests/testdata/cert-ipsans.pem vendored Normal file
View File

@@ -0,0 +1,21 @@
-----BEGIN CERTIFICATE-----
MIIDizCCAnOgAwIBAgIIPNBLQXwhoUkwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSAxNzNiMjYwHhcNMjAwNTI5MTkxODA5
WhcNMjUwNTI5MTkxODA5WjAWMRQwEgYDVQQDEwsxOTIuMC4yLjE0NTCCASIwDQYJ
KoZIhvcNAQEBBQADggEPADCCAQoCggEBALyChb+NDA26GF1AfC0nzEdfOTchKw0h
q41xEjonvg5UXgZf/aH/ntvugIkYP0MaFifNAjebOVVsemEVEtyWcUKTfBHKZGbZ
ukTDwFIjfTccCfo6U/B2H7ZLzJIywl8DcUw9DypadeQBm8PS0VVR2ncy73dvaqym
crhAwlASyXU0mhLqRDMMxfg5Bn/FWpcsIcDpLmPn8Q/FvdRc2t5ryBNw/aWOlwqT
Oy16nbfLj2T0zG1A3aPuD+eT/JFUe/o3K7R+FAx7wt+RziQO46wLVVF1SueZUrIU
zqN04Gl8Kt1WM2SniZ0gq/rORUNcPtT0NAEsEslTQfA+Trq6j2peqyMCAwEAAaOB
yjCBxzAOBgNVHQ8BAf8EBAMCBaAwHQYDVR0lBBYwFAYIKwYBBQUHAwEGCCsGAQUF
BwMCMAwGA1UdEwEB/wQCMAAwHQYDVR0OBBYEFHj1mwZzP//nMIH2i58NRUl/arHn
MB8GA1UdIwQYMBaAFF5DVAKabvIUvKFHGouscA2Qdpe6MDEGCCsGAQUFBwEBBCUw
IzAhBggrBgEFBQcwAYYVaHR0cDovLzEyNy4wLjAuMTo0MDAyMBUGA1UdEQQOMAyH
BMAAApGHBMsAcQEwDQYJKoZIhvcNAQELBQADggEBAHjSgDg76/UCIMSYddyhj18r
LdNKjA7p8ovnErSkebFT4lIZ9f3Sma9moNr0w64M33NamuFyHe/KTdk90mvoW8Uu
26aDekiRIeeMakzbAtDKn67tt2tbedKIYRATcSYVwsV46uZKbM621dZKIjjxOWpo
IY6rZYrku8LYhoXJXOqRduV3cTRVuTm5bBa9FfVNtt6N1T5JOtKKDEhuSaF4RSug
PDy3hQIiHrVvhPfVrXU3j6owz/8UCS5549inES9ONTFrvM9o0H1R/MsmGNXR5hF5
iJqHKC7n8LZujhVnoFIpHu2Dsiefbfr+yRYJS4I+ezy6Nq/Ok8rc8zp0eoX+uyY=
-----END CERTIFICATE-----

22
acme/tests/testdata/cert-ipv6sans.pem vendored Normal file
View File

@@ -0,0 +1,22 @@
-----BEGIN CERTIFICATE-----
MIIDmzCCAoOgAwIBAgIIFdxeZP+v2rgwDQYJKoZIhvcNAQELBQAwKDEmMCQGA1UE
AxMdUGViYmxlIEludGVybWVkaWF0ZSBDQSA0M2M5NTcwHhcNMjAwNTMwMDQwNzMw
WhcNMjUwNTMwMDQwNzMwWjAOMQwwCgYDVQQDEwM6OjEwggEiMA0GCSqGSIb3DQEB
AQUAA4IBDwAwggEKAoIBAQC7VidVduJvqKtrSH0fw6PjE0cqL4Kfzo7klexWUkHG
KVAa0fRVZFZ462jxKOt417V2U4WJQ6WHHO9PJ+3gW62d/MhCw8FRtUQS4nYFjqB6
32+RFU21VRN7cWoQEqSwnEPbh/v/zv/KS5JhQ+swWUo79AOLm1kjnZWCKtcqh1Lc
Ug5Tkpot6luoxTKp52MkchvXDpj0q2B/XpLJ8/pw5cqjv7mH12EDOK2HXllA+WwX
ZpstcEhaA4FqtaHOW/OHnwTX5MUbINXE5YYHVEDR6moVM31/W/3pe9NDUMTDE7Si
lVQnZbXM9NYbzZqlh+WhemDWwnIfGI6rtsfNEiirVEOlAgMBAAGjgeIwgd8wDgYD
VR0PAQH/BAQDAgWgMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcDAjAMBgNV
HRMBAf8EAjAAMB0GA1UdDgQWBBS8DL+MZfDIy6AKky69Tgry2Vxq5DAfBgNVHSME
GDAWgBRAsFqVenRRKgB1YPzWKzb9bzZ/ozAxBggrBgEFBQcBAQQlMCMwIQYIKwYB
BQUHMAGGFWh0dHA6Ly8xMjcuMC4wLjE6NDAwMjAtBgNVHREEJjAkhxAAAAAAAAAA
AAAAAAAAAAABhxCjvjLzIG7HXQlWDO6YWF7FMA0GCSqGSIb3DQEBCwUAA4IBAQBY
M9UTZ3uaKMQ+He9kWR3p9jh6hTSD0FNi79ZdfkG0lgSzhhduhN7OhzQH2ihUUfa6
rtKTw74fGbszhizCd9UB8YPKlm3si1Xbg6ZUQlA1RtoQo7RUGEa6ZbR68PKGm9Go
hTTFIl/JS8jzxBR8jywZdyqtprUx+nnNUDiNk0hJtFLhw7OJH0AHlAUNqHsfD08m
HXRdaV6q14HXU5g31slBat9H4D6tCU/2uqBURwW0wVdnqh4QeRfAeqiatJS9EmSF
ctbc7n894Idy2Xce7NFoIy5cht3m6Rd42o/LmBsJopBmQcDPZT70/XzRtc2qE0cS
CzBIGQHUJ6BfmBjrCQnp
-----END CERTIFICATE-----

16
acme/tests/testdata/csr-ipsans.pem vendored Normal file
View File

@@ -0,0 +1,16 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICbTCCAVUCAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKT/
CE7Y5EYBvI4p7Frt763upIKHDHO/R5/TWMjG8Jm9qTMui8sbMgyh2Yh+lR/j/5Xd
tQrhgC6wx10MrW2+3JtYS88HP1p6si8zU1dbK34n3NyyklR2RivW0R7dXgnYNy7t
5YcDYLCrbRMIPINV/uHrmzIHWYUDNcZVdAfIM2AHfKYuV6Mepcn///5GR+l4GcAh
Nkf9CW8OdAIuKdbyLCxVr0mUW/vJz1b12uxPsgUdax9sjXgZdT4pfMXADsFd1NeF
atpsXU073inqtHru+2F9ijHTQ75TC+u/rr6eYl3BnBntac0gp/ADtDBii7/Q1JOO
Bhq7xJNqqxIEdiyM7zcCAwEAAaAoMCYGCSqGSIb3DQEJDjEZMBcwFQYDVR0RBA4w
DIcEwAACkYcEywBxATANBgkqhkiG9w0BAQsFAAOCAQEADG5g3zdbSCaXpZhWHkzE
Mek3f442TUE1pB+ITRpthmM4N3zZWETYmbLCIAO624uMrRnbCCMvAoLs/L/9ETg/
XMMFtonQC8u9i9tV8B1ceBh8lpIfa+8b9TMWH3bqnrbWQ+YIl+Yd0gXiCZWJ9vK4
eM1Gddu/2bR6s/k4h/XAWRgEexqk57EHr1z0N+T9OoX939n3mVcNI+u9kfd5VJ0z
VyA3R8WR6T6KlEl5P5pcWe5Kuyhi7xMmLVImXqBtvKq4O1AMfM+gQr/yn9aE8IRq
khP7JrMBLUIub1c/qu2TfvnynNPSM/ZcOX+6PHdHmRkR3nI0Ndpv7Ntv31FTplAm
Dw==
-----END CERTIFICATE REQUEST-----

16
acme/tests/testdata/csr-ipv6sans.pem vendored Normal file
View File

@@ -0,0 +1,16 @@
-----BEGIN CERTIFICATE REQUEST-----
MIIChTCCAW0CAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAOIc
UAppcqJfTkSqqOFqGt1v7lIJZPOcF4bcKI3d5cHAGbOuVxbC7uMaDuObwYLzoiED
qnvs1NaEq2phO6KsgGESB7IE2LUjJivO7OnSZjNRpL5si/9egvBiNCn/50lULaWG
gLEuyMfk3awZy2mVAymy7Grhbx069A4TH8TqsHuq2RpKyuDL27e/jUt6yYecb3pu
hWMiWy3segif4tI46pkOW0/I6DpxyYD2OqOvzxm/voS9RMqE2+7YJA327H7bEi3N
lJZEZ1zy7clZ9ga5fBQaetzbg2RyxTrZ7F919NQXSFoXgxb10Eg64wIpz0L3ooCm
GEHehsZZexa3J5ccIvMCAwEAAaBAMD4GCSqGSIb3DQEJDjExMC8wLQYDVR0RBCYw
JIcQAAAAAAAAAAAAAAAAAAAAAYcQo74y8yBux10JVgzumFhexTANBgkqhkiG9w0B
AQsFAAOCAQEALvwVn0A/JPTCiNzcozHFnp5M23C9PXCplWc5u4k34d4XXzpSeFDz
fL4gy7NpYIueme2K2ppw2j3PNQUdR6vQ5a75sriegWYrosL+7Q6Joh51ZyEUZQoD
mNl4M4S4oX85EaChR6NFGBywTfjFarYi32XBTbFE7rK8N8KM+DQkNdwL1MXqaHWz
F1obQKpNXlLedbCBOteV5Eg4zG3565zu/Gw/NhwzzV3mQmgxUcd1sMJxAfHQz4Vl
ImLL+xMcR03nDsH2bgtDbK2tJm7WszSxA9tC+Xp2lRewxrnQloRWPYDz177WGQ5Q
SoGDzTTtA6uWZxG8h7CkNLOGvA8LtU2rNA==
-----END CERTIFICATE REQUEST-----

16
acme/tests/testdata/csr-mixed.pem vendored Normal file
View File

@@ -0,0 +1,16 @@
-----BEGIN CERTIFICATE REQUEST-----
MIICdjCCAV4CAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAMXq
v1y8EIcCbaUIzCtOcLkLS0MJ35oS+6DmV5WB1A0cIk6YrjsHIsY2lwMm13BWIvmw
tY+Y6n0rr7eViNx5ZRGHpHEI/TL3Neb+VefTydL5CgvK3dd4ex2kSbTaed3fmpOx
qMajEduwNcZPCcmoEXPkfrCP8w2vKQUkQ+JRPcdX1nTuzticeRP5B7YCmJsmxkEh
Y0tzzZ+NIRDARoYNofefY86h3e5q66gtJxccNchmIM3YQahhg5n3Xoo8hGfM/TIc
R7ncCBCLO6vtqo0QFva/NQODrgOmOsmgvqPkUWQFdZfWM8yIaU826dktx0CPB78t
TudnJ1rBRvGsjHMsZikCAwEAAaAxMC8GCSqGSIb3DQEJDjEiMCAwHgYDVR0RBBcw
FYINYS5leGVtcGxlLmNvbYcEwAACbzANBgkqhkiG9w0BAQsFAAOCAQEAdGMcRCxq
1X09gn1TNdMt64XUv+wdJCKDaJ+AgyIJj7QvVw8H5k7dOnxS4I+a/yo4jE+LDl2/
AuHcBLFEI4ddewdJSMrTNZjuRYuOdr3KP7fL7MffICSBi45vw5EOXg0tnjJCEiKu
6gcJgbLSP5JMMd7Haf33Q/VWsmHofR3VwOMdrnakwAU3Ff5WTuXTNVhL1kT/uLFX
yW1ru6BF4unwNqSR2UeulljpNfRBsiN4zJK11W6n9KT0NkBr9zY5WCM4sW7i8k9V
TeypWGo3jBKzYAGeuxZsB97U77jZ2lrGdBLZKfbcjnTeRVqCvCRrui4El7UGYFmj
7s6OJyWx5DSV8w==
-----END CERTIFICATE REQUEST-----

View File

@@ -4,6 +4,13 @@ import fnmatch
import logging
import re
import subprocess
from typing import Any
from typing import Dict
from typing import Iterable
from typing import List
from typing import Optional
from typing import Sequence
from typing import Tuple
import pkg_resources
@@ -14,7 +21,7 @@ from certbot.compat import os
logger = logging.getLogger(__name__)
def get_mod_deps(mod_name):
def get_mod_deps(mod_name: str) -> Any:
"""Get known module dependencies.
.. note:: This does not need to be accurate in order for the client to
@@ -33,7 +40,7 @@ def get_mod_deps(mod_name):
return deps.get(mod_name, [])
def get_file_path(vhost_path):
def get_file_path(vhost_path: str) -> Optional[str]:
"""Get file path from augeas_vhost_path.
Takes in Augeas path and returns the file name
@@ -50,7 +57,7 @@ def get_file_path(vhost_path):
return _split_aug_path(vhost_path)[0]
def get_internal_aug_path(vhost_path):
def get_internal_aug_path(vhost_path: str) -> str:
"""Get the Augeas path for a vhost with the file path removed.
:param str vhost_path: Augeas virtual host path
@@ -62,7 +69,7 @@ def get_internal_aug_path(vhost_path):
return _split_aug_path(vhost_path)[1]
def _split_aug_path(vhost_path):
def _split_aug_path(vhost_path: str) -> Tuple[str, str]:
"""Splits an Augeas path into a file path and an internal path.
After removing "/files", this function splits vhost_path into the
@@ -76,7 +83,7 @@ def _split_aug_path(vhost_path):
"""
# Strip off /files
file_path = vhost_path[6:]
internal_path = []
internal_path: List[str] = []
# Remove components from the end of file_path until it becomes valid
while not os.path.exists(file_path):
@@ -86,7 +93,7 @@ def _split_aug_path(vhost_path):
return file_path, "/".join(reversed(internal_path))
def parse_define_file(filepath, varname):
def parse_define_file(filepath: str, varname: str) -> Dict[str, str]:
""" Parses Defines from a variable in configuration file
:param str filepath: Path of file to parse
@@ -96,7 +103,7 @@ def parse_define_file(filepath, varname):
:rtype: `dict`
"""
return_vars = {}
return_vars: Dict[str, str] = {}
# Get list of words in the variable
a_opts = util.get_var_from_file(varname, filepath).split()
for i, v in enumerate(a_opts):
@@ -111,19 +118,19 @@ def parse_define_file(filepath, varname):
return return_vars
def unique_id():
def unique_id() -> str:
""" Returns an unique id to be used as a VirtualHost identifier"""
return binascii.hexlify(os.urandom(16)).decode("utf-8")
def included_in_paths(filepath, paths):
def included_in_paths(filepath: str, paths: Iterable[str]) -> bool:
"""
Returns true if the filepath is included in the list of paths
that may contain full paths or wildcard paths that need to be
expanded.
:param str filepath: Filepath to check
:params list paths: List of paths to check against
:param list paths: List of paths to check against
:returns: True if included
:rtype: bool
@@ -132,7 +139,7 @@ def included_in_paths(filepath, paths):
return any(fnmatch.fnmatch(filepath, path) for path in paths)
def parse_defines(apachectl):
def parse_defines(apachectl: str) -> Dict[str, str]:
"""
Gets Defines from httpd process and returns a dictionary of
the defined variables.
@@ -143,7 +150,7 @@ def parse_defines(apachectl):
:rtype: dict
"""
variables = {}
variables: Dict[str, str] = {}
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
@@ -153,18 +160,15 @@ def parse_defines(apachectl):
return {}
for match in matches:
if match.count("=") > 1:
logger.error("Unexpected number of equal signs in "
"runtime config dump.")
raise errors.PluginError(
"Error parsing Apache runtime variables")
parts = match.partition("=")
variables[parts[0]] = parts[2]
# Value could also contain = so split only once
parts = match.split('=', 1)
value = parts[1] if len(parts) == 2 else ''
variables[parts[0]] = value
return variables
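The split-once change above keeps any '=' characters inside the value intact, for example:

    match = "FOO=bar=baz"           # a Define whose value itself contains '='
    parts = match.split('=', 1)
    value = parts[1] if len(parts) == 2 else ''
    print(parts[0], value)          # FOO bar=baz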
def parse_includes(apachectl):
def parse_includes(apachectl: str) -> List[str]:
"""
Gets Include directives from httpd process and returns a list of
their values.
@@ -175,12 +179,11 @@ def parse_includes(apachectl):
:rtype: list of str
"""
inc_cmd = [apachectl, "-t", "-D",
"DUMP_INCLUDES"]
inc_cmd: List[str] = [apachectl, "-t", "-D", "DUMP_INCLUDES"]
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
def parse_modules(apachectl):
def parse_modules(apachectl: str) -> List[str]:
"""
Get loaded modules from httpd process, and return the list
of loaded module names.
@@ -191,12 +194,11 @@ def parse_modules(apachectl):
:rtype: list of str
"""
mod_cmd = [apachectl, "-t", "-D",
"DUMP_MODULES"]
mod_cmd = [apachectl, "-t", "-D", "DUMP_MODULES"]
return parse_from_subprocess(mod_cmd, r"(.*)_module")
def parse_from_subprocess(command, regexp):
def parse_from_subprocess(command: List[str], regexp: str) -> List[str]:
"""Get values from stdout of subprocess command
:param list command: Command to run
@@ -210,7 +212,7 @@ def parse_from_subprocess(command, regexp):
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(command):
def _get_runtime_cfg(command: Sequence[str]) -> str:
"""
Get runtime configuration info.
@@ -245,7 +247,8 @@ def _get_runtime_cfg(command):
return stdout
def find_ssl_apache_conf(prefix):
def find_ssl_apache_conf(prefix: str) -> str:
"""
Find a TLS Apache config file in the dedicated storage.
:param str prefix: prefix of the TLS Apache config file to find

View File

@@ -1,4 +1,7 @@
""" apacheconfig implementation of the ParserNode interfaces """
from typing import Any
from typing import List
from typing import Optional
from typing import Tuple
from certbot_apache._internal import assertions
@@ -13,19 +16,21 @@ class ApacheParserNode(interfaces.ParserNode):
by parsing the equivalent configuration text using the apacheconfig library.
"""
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
def __init__(self, **kwargs: Any):
# pylint: disable=unused-variable
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs)
super().__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self._raw = self.metadata["ac_ast"]
self.ancestor: str = ancestor
self.filepath: str = filepath
self.dirty: bool = dirty
self.metadata: Any = metadata
self._raw: Any = self.metadata["ac_ast"]
def save(self, msg): # pragma: no cover
pass
def save(self, msg: str) -> None:
pass # pragma: no cover
def find_ancestors(self, name): # pylint: disable=unused-variable
# pylint: disable=unused-variable
def find_ancestors(self, name: str) -> List["ApacheBlockNode"]:
"""Find ancestor BlockNodes with a given name"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -37,33 +42,33 @@ class ApacheParserNode(interfaces.ParserNode):
class ApacheCommentNode(ApacheParserNode):
""" apacheconfig implementation of CommentNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super().__init__(**kwargs)
self.comment = comment
def __eq__(self, other): # pragma: no cover
def __eq__(self, other: Any):
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata and
self.filepath == other.filepath)
return False
return False # pragma: no cover
class ApacheDirectiveNode(ApacheParserNode):
""" apacheconfig implementation of DirectiveNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super().__init__(**kwargs)
self.name = name
self.parameters = parameters
self.enabled = enabled
self.include = None
self.name: str = name
self.parameters: List[str] = parameters
self.enabled: bool = enabled
self.include: Optional[str] = None
def __eq__(self, other): # pragma: no cover
def __eq__(self, other: Any) -> bool:
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
@@ -72,21 +77,21 @@ class ApacheDirectiveNode(ApacheParserNode):
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
return False # pragma: no cover
def set_parameters(self, _parameters): # pragma: no cover
def set_parameters(self, _parameters):
"""Sets the parameters for DirectiveNode"""
return
return # pragma: no cover
class ApacheBlockNode(ApacheDirectiveNode):
""" apacheconfig implementation of BlockNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
super().__init__(**kwargs)
self.children: Tuple[ApacheParserNode, ...] = ()
def __eq__(self, other): # pragma: no cover
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
@@ -96,10 +101,12 @@ class ApacheBlockNode(ApacheDirectiveNode):
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
return False # pragma: no cover
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
def add_child_block(
self, name: str, parameters: Optional[str] = None, position: Optional[int] = None
) -> "ApacheBlockNode": # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
new_block = ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -110,7 +117,9 @@ class ApacheBlockNode(ApacheDirectiveNode):
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
def add_child_directive(
self, name: str, parameters: Optional[str] = None, position: Optional[int] = None
) -> ApacheDirectiveNode: # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
new_dir = ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -121,7 +130,9 @@ class ApacheBlockNode(ApacheDirectiveNode):
return new_dir
# pylint: disable=unused-argument
def add_child_comment(self, comment="", position=None): # pragma: no cover
def add_child_comment(
self, name: str, parameters: Optional[int] = None, position: Optional[int] = None
) -> ApacheCommentNode: # pragma: no cover
"""Adds a new CommentNode to the sequence of children"""
new_comment = ApacheCommentNode(comment=assertions.PASS,
@@ -131,7 +142,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
self.children += (new_comment,)
return new_comment
def find_blocks(self, name, exclude=True): # pylint: disable=unused-argument
# pylint: disable=unused-argument
def find_blocks(self, name, exclude: bool = True) -> List["ApacheBlockNode"]:
"""Recursive search of BlockNodes from the sequence of children"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -139,7 +151,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
filepath=assertions.PASS,
metadata=self.metadata)]
def find_directives(self, name, exclude=True): # pylint: disable=unused-argument
# pylint: disable=unused-argument
def find_directives(self, name: str, exclude: bool = True) -> List[ApacheDirectiveNode]:
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -148,22 +161,22 @@ class ApacheBlockNode(ApacheDirectiveNode):
metadata=self.metadata)]
# pylint: disable=unused-argument
def find_comments(self, comment, exact=False): # pragma: no cover
def find_comments(self, comment: str, exact: bool = False) -> List[ApacheCommentNode]:
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheCommentNode(comment=assertions.PASS,
return [ApacheCommentNode(comment=assertions.PASS, # pragma: no cover
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def delete_child(self, child): # pragma: no cover
def delete_child(self, child: "ApacheBlockNode") -> None:
"""Deletes a ParserNode from the sequence of children"""
return
return # pragma: no cover
def unsaved_files(self): # pragma: no cover
def unsaved_files(self) -> List[str]:
"""Returns a list of unsaved filepaths"""
return [assertions.PASS]
return [assertions.PASS] # pragma: no cover
def parsed_paths(self): # pragma: no cover
def parsed_paths(self) -> List[str]:
"""Returns a list of parsed configuration file paths"""
return [assertions.PASS]

View File

@@ -1,12 +1,13 @@
"""Dual parser node assertions"""
import fnmatch
from typing import Any
from certbot_apache._internal import interfaces
PASS = "CERTBOT_PASS_ASSERT"
def assertEqual(first, second):
def assertEqual(first: Any, second: Any) -> None:
""" Equality assertion """
if isinstance(first, interfaces.CommentNode):
@@ -29,7 +30,9 @@ def assertEqual(first, second):
# (but identical) directory structures.
assert first.filepath == second.filepath
def assertEqualComment(first, second): # pragma: no cover
# pragma: no cover
def assertEqualComment(first: Any, second: Any) -> None:
""" Equality assertion for CommentNode """
assert isinstance(first, interfaces.CommentNode)
@@ -38,7 +41,8 @@ def assertEqualComment(first, second): # pragma: no cover
if not isPass(first.comment) and not isPass(second.comment): # type: ignore
assert first.comment == second.comment # type: ignore
def _assertEqualDirectiveComponents(first, second): # pragma: no cover
def _assertEqualDirectiveComponents(first: Any, second: Any) -> None: # pragma: no cover
""" Handles assertion for instance variables for DirectiveNode and BlockNode"""
# Enabled value cannot be asserted, because Augeas implementation
@@ -50,30 +54,34 @@ def _assertEqualDirectiveComponents(first, second): # pragma: no cover
if not isPass(first.parameters) and not isPass(second.parameters):
assert first.parameters == second.parameters
def assertEqualDirective(first, second):
def assertEqualDirective(first: Any, second: Any) -> None:
""" Equality assertion for DirectiveNode """
assert isinstance(first, interfaces.DirectiveNode)
assert isinstance(second, interfaces.DirectiveNode)
_assertEqualDirectiveComponents(first, second)
def isPass(value): # pragma: no cover
def isPass(value) -> bool: # pragma: no cover
"""Checks if the value is set to PASS"""
if isinstance(value, bool):
return True
return PASS in value
def isPassDirective(block):
""" Checks if BlockNode or DirectiveNode should pass the assertion """
if isPass(block.name):
return True
if isPass(block.parameters): # pragma: no cover
if isPass(block.parameters): # pragma: no cover
return True
if isPass(block.filepath): # pragma: no cover
if isPass(block.filepath): # pragma: no cover
return True
return False
def isPassComment(comment):
""" Checks if CommentNode should pass the assertion """
@@ -83,7 +91,8 @@ def isPassComment(comment):
return True
return False
def isPassNodeList(nodelist): # pragma: no cover
def isPassNodeList(nodelist): # pragma: no cover
""" Checks if a ParserNode in the nodelist should pass the assertion,
this function is used for results of find_* methods. Unimplemented find_*
methods should return a sequence containing a single ParserNode instance
@@ -101,12 +110,14 @@ def isPassNodeList(nodelist): # pragma: no cover
return isPassDirective(node)
return isPassComment(node)
def assertEqualSimple(first, second):
""" Simple assertion """
if not isPass(first) and not isPass(second):
assert first == second
def isEqualVirtualHost(first, second):
def isEqualVirtualHost(first, second) -> bool:
"""
Checks that two VirtualHost objects are similar. There are some built
in differences with the implementations: VirtualHost created by ParserNode
@@ -126,6 +137,7 @@ def isEqualVirtualHost(first, second):
first.ancestor == second.ancestor
)
def assertEqualPathsList(first, second): # pragma: no cover
"""
Checks that the two lists of file paths match. This assertion allows for wildcard

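A minimal sketch (not the certbot implementation) of the PASS-sentinel pattern the hunks above revolve around: unimplemented find_* methods return placeholder nodes whose fields carry the sentinel, and the assertion helpers skip any comparison that involves it. Names mirror the diff; the stub class is illustrative only.

PASS = "CERTBOT_PASS_ASSERT"

def is_pass(value) -> bool:
    # Mirrors assertions.isPass: booleans and sentinel-bearing values "pass".
    if isinstance(value, bool):
        return True
    return PASS in value

def assert_equal_simple(first, second) -> None:
    # Mirrors assertions.assertEqualSimple: only compare real values.
    if not is_pass(first) and not is_pass(second):
        assert first == second

class StubNode:
    # Stand-in for a node returned by an unimplemented find_blocks.
    def __init__(self):
        self.name = PASS
        self.parameters = PASS
        self.filepath = PASS

assert_equal_simple("Listen", "Listen")              # real values: compared
assert_equal_simple(StubNode().name, "VirtualHost")  # sentinel: comparison skipped
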
View File

@@ -64,7 +64,15 @@ Translates over to:
"/files/etc/apache2/apache2.conf/bLoCk[1]",
]
"""
from typing import Any
from typing import cast
from typing import Dict
from typing import Iterable
from typing import List
from typing import Optional
from typing import Set
from typing import Tuple
from typing import Union
from certbot import errors
from certbot.compat import os
@@ -78,14 +86,16 @@ from certbot_apache._internal import parsernode_util as util
class AugeasParserNode(interfaces.ParserNode):
""" Augeas implementation of ParserNode interface """
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
def __init__(self, **kwargs: Any):
# pylint: disable=unused-variable
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs)
super().__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self.parser = self.metadata.get("augeasparser")
self.ancestor: str = ancestor
self.filepath: str = filepath
self.dirty: bool = dirty
self.metadata: Dict[str, Any] = metadata
self.parser: parser.ApacheParser = cast(parser.ApacheParser,
self.metadata.get("augeasparser"))
try:
if self.metadata["augeaspath"].endswith("/"):
raise errors.PluginError(
@@ -96,10 +106,10 @@ class AugeasParserNode(interfaces.ParserNode):
except KeyError:
raise errors.PluginError("Augeas path is required")
def save(self, msg):
def save(self, msg: Iterable[str]) -> None:
self.parser.save(msg)
def find_ancestors(self, name):
def find_ancestors(self, name: str) -> List["AugeasBlockNode"]:
"""
Searches for ancestor BlockNodes with a given name.
@@ -109,7 +119,7 @@ class AugeasParserNode(interfaces.ParserNode):
:rtype: list of AugeasBlockNode
"""
ancestors = []
ancestors: List[AugeasBlockNode] = []
parent = self.metadata["augeaspath"]
while True:
@@ -124,7 +134,7 @@ class AugeasParserNode(interfaces.ParserNode):
return ancestors
def _create_blocknode(self, path):
def _create_blocknode(self, path: str) -> "AugeasBlockNode":
"""
Helper function to create a BlockNode from Augeas path. This is used by
AugeasParserNode.find_ancestors and AugeasBlockNode.
@@ -132,21 +142,25 @@ class AugeasParserNode(interfaces.ParserNode):
"""
name = self._aug_get_name(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
name: str = self._aug_get_name(path)
metadata: Dict[str, Union[parser.ApacheParser, str]] = {
"augeasparser": self.parser, "augeaspath": path
}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
file_path = apache_util.get_file_path(path)
if file_path is None:
raise ValueError(f"No file path found for vhost: {path}.") # pragma: no cover
enabled = self.parser.parsed_in_original(file_path)
return AugeasBlockNode(name=name,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
filepath=file_path,
metadata=metadata)
def _aug_get_name(self, path):
def _aug_get_name(self, path: str) -> str:
"""
Helper function to get name of a configuration block or variable from path.
"""
@@ -160,20 +174,18 @@ class AugeasParserNode(interfaces.ParserNode):
# remove [...], it's not allowed in Apache configuration and is used
# for indexing within Augeas
name = name.split("[")[0]
return name
return name.split("[")[0]
class AugeasCommentNode(AugeasParserNode):
""" Augeas implementation of CommentNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super().__init__(**kwargs)
# self.comment = comment
self.comment = comment
def __eq__(self, other):
def __eq__(self, other: Any) -> bool:
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.filepath == other.filepath and
@@ -186,15 +198,15 @@ class AugeasCommentNode(AugeasParserNode):
class AugeasDirectiveNode(AugeasParserNode):
""" Augeas implementation of DirectiveNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super().__init__(**kwargs)
self.name = name
self.enabled = enabled
self.name: str = name
self.enabled: bool = enabled
if parameters:
self.set_parameters(parameters)
def __eq__(self, other):
def __eq__(self, other: Any) -> bool:
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
@@ -205,7 +217,7 @@ class AugeasDirectiveNode(AugeasParserNode):
self.metadata == other.metadata)
return False
def set_parameters(self, parameters):
def set_parameters(self, parameters: List[str]):
"""
Sets parameters of a DirectiveNode or BlockNode object.
@@ -224,7 +236,7 @@ class AugeasDirectiveNode(AugeasParserNode):
self.parser.aug.set(param_path, param)
@property
def parameters(self):
def parameters(self) -> Tuple[Optional[str], ...]:
"""
Fetches the parameters from Augeas tree, ensuring that the sequence always
represents the current state
@@ -234,7 +246,7 @@ class AugeasDirectiveNode(AugeasParserNode):
"""
return tuple(self._aug_get_params(self.metadata["augeaspath"]))
def _aug_get_params(self, path):
def _aug_get_params(self, path: str) -> List[Optional[str]]:
"""Helper function to get parameters for DirectiveNodes and BlockNodes"""
arg_paths = self.parser.aug.match(path + "/arg")
@@ -244,11 +256,11 @@ class AugeasDirectiveNode(AugeasParserNode):
class AugeasBlockNode(AugeasDirectiveNode):
""" Augeas implementation of BlockNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
super().__init__(**kwargs)
self.children = ()
self.children: Tuple["AugeasBlockNode", ...] = ()
def __eq__(self, other):
def __eq__(self, other: Any) -> bool:
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
@@ -261,33 +273,39 @@ class AugeasBlockNode(AugeasDirectiveNode):
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
def add_child_block(
self, name: str, parameters: Optional[str] = None, position: Optional[int] = None
) -> "AugeasBlockNode": # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
name,
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
new_metadata: Dict[str, Any] = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new block
self.parser.aug.insert(insertpath, name, before)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
file_path = apache_util.get_file_path(realpath)
if file_path is None:
raise errors.Error(f"No file path found for vhost: {realpath}")
enabled = self.parser.parsed_in_original(file_path)
# Parameters will be set at the initialization of the new object
new_block = AugeasBlockNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_block
return AugeasBlockNode(
name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=file_path,
metadata=new_metadata,
)
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
def add_child_directive(
self, name: str, parameters=None, position=None
) -> "AugeasDirectiveNode": # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
if not parameters:
@@ -304,43 +322,50 @@ class AugeasBlockNode(AugeasDirectiveNode):
# Set the directive key
self.parser.aug.set(realpath, name)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
file_path = apache_util.get_file_path(realpath)
if file_path is None:
raise errors.Error(f"No file path found for vhost: {realpath}")
enabled = self.parser.parsed_in_original(file_path)
return AugeasDirectiveNode(
name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=file_path,
metadata=new_metadata,
)
new_dir = AugeasDirectiveNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_dir
def add_child_comment(self, comment="", position=None):
def add_child_comment(
self, comment: str = "", position: Optional[int] = None
) -> "AugeasCommentNode":
"""Adds a new CommentNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
"#comment",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
new_metadata: Dict[str, Any] = {
"augeasparser": self.parser, "augeaspath": realpath,
}
# Create the new comment
self.parser.aug.insert(insertpath, "#comment", before)
# Set the comment content
self.parser.aug.set(realpath, comment)
new_comment = AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_comment
return AugeasCommentNode(
comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata,
)
def find_blocks(self, name, exclude=True):
def find_blocks(self, name: str, exclude: bool = True) -> List["AugeasBlockNode"]:
"""Recursive search of BlockNodes from the sequence of children"""
nodes = []
paths = self._aug_find_blocks(name)
nodes: List["AugeasBlockNode"] = []
paths: Iterable[str] = self._aug_find_blocks(name)
if exclude:
paths = self.parser.exclude_dirs(paths)
for path in paths:
@@ -348,7 +373,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
return nodes
def find_directives(self, name, exclude=True):
def find_directives(self, name: str, exclude: bool = True) -> List["AugeasDirectiveNode"]:
"""Recursive search of DirectiveNodes from the sequence of children"""
nodes = []
@@ -367,14 +392,14 @@ class AugeasBlockNode(AugeasDirectiveNode):
return nodes
def find_comments(self, comment):
def find_comments(self, comment: str) -> List["AugeasCommentNode"]:
"""
Recursive search of DirectiveNodes from the sequence of children.
:param str comment: Comment content to search for.
"""
nodes = []
nodes: List["AugeasCommentNode"] = []
ownpath = self.metadata.get("augeaspath")
comments = self.parser.find_comments(comment, start=ownpath)
@@ -383,11 +408,11 @@ class AugeasBlockNode(AugeasDirectiveNode):
return nodes
def delete_child(self, child):
def delete_child(self, child: "AugeasParserNode") -> None:
"""
Deletes a ParserNode from the sequence of children, and raises an
exception if it's unable to do so.
:param AugeasParserNode: child: A node to delete.
:param AugeasParserNode child: A node to delete.
"""
if not self.parser.aug.remove(child.metadata["augeaspath"]):
@@ -396,11 +421,11 @@ class AugeasBlockNode(AugeasDirectiveNode):
"seem to exist.").format(child.metadata["augeaspath"])
)
def unsaved_files(self):
def unsaved_files(self) -> Set[str]:
"""Returns a list of unsaved filepaths"""
return self.parser.unsaved_files()
def parsed_paths(self):
def parsed_paths(self) -> List[str]:
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
@@ -411,7 +436,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
:returns: list of file paths of files that have been parsed
"""
res_paths = []
res_paths: List[str] = []
paths = self.parser.existing_paths
for directory in paths:
@@ -420,7 +445,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
return res_paths
def _create_commentnode(self, path):
def _create_commentnode(self, path: str) -> "AugeasCommentNode":
"""Helper function to create a CommentNode from Augeas path"""
comment = self.parser.aug.get(path)
@@ -433,14 +458,16 @@ class AugeasBlockNode(AugeasDirectiveNode):
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _create_directivenode(self, path):
def _create_directivenode(self, path: str) -> "AugeasDirectiveNode":
"""Helper function to create a DirectiveNode from Augeas path"""
name = self.parser.get_arg(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
metadata: Dict[str, Union[parser.ApacheParser, str]] = {
"augeasparser": self.parser, "augeaspath": path,
}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
enabled: bool = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasDirectiveNode(name=name,
@@ -449,12 +476,12 @@ class AugeasBlockNode(AugeasDirectiveNode):
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_find_blocks(self, name):
def _aug_find_blocks(self, name: str) -> Set[str]:
"""Helper function to perform a search to Augeas DOM tree to search
configuration blocks with a given name"""
# The code here is modified from configurator.get_virtual_hosts()
blk_paths = set()
blk_paths: Set[str] = set()
for vhost_path in list(self.parser.parser_paths):
paths = self.parser.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
@@ -463,7 +490,8 @@ class AugeasBlockNode(AugeasDirectiveNode):
name.lower() in os.path.basename(path).lower()])
return blk_paths
def _aug_resolve_child_position(self, name, position):
def _aug_resolve_child_position(
self, name: str, position: Optional[int]) -> Tuple[str, str, bool]:
"""
Helper function that iterates through the immediate children and figures
out the insertion path for a new AugeasParserNode.
@@ -488,16 +516,16 @@ class AugeasBlockNode(AugeasDirectiveNode):
"""
# Default to appending
before = False
before: bool = False
all_children = self.parser.aug.match("{}/*".format(
all_children: List[str] = self.parser.aug.match("{}/*".format(
self.metadata["augeaspath"])
)
# Calculate resulting_path
# Augeas indices start at 1. We use counter to calculate the index to
# be used in resulting_path.
counter = 1
counter: int = 1
for i, child in enumerate(all_children):
if position is not None and i >= position:
# We're not going to insert the new node to an index after this
@@ -506,7 +534,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
if name == childname:
counter += 1
resulting_path = "{}/{}[{}]".format(
resulting_path: str = "{}/{}[{}]".format(
self.metadata["augeaspath"],
name,
counter
@@ -530,7 +558,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
position
)
return (insert_path, resulting_path, before)
return insert_path, resulting_path, before
interfaces.CommentNode.register(AugeasCommentNode)
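
A short illustrative sketch of the path handling done by _aug_get_name above: the block or directive name is the last segment of the Augeas path, with the "[N]" index that Augeas adds for repeated labels stripped off, since indices are not valid Apache syntax. The helper below is a simplified stand-in, not the plugin code.

def aug_get_name(path: str) -> str:
    # Take the last path segment and drop the Augeas repetition index "[N]".
    last_segment = path.rpartition("/")[-1]
    return last_segment.split("[")[0]

assert aug_get_name("/files/etc/apache2/apache2.conf/VirtualHost[2]") == "VirtualHost"
assert aug_get_name("/files/etc/apache2/apache2.conf/directive") == "directive"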

File diff suppressed because it is too large

View File

@@ -1,17 +1,21 @@
"""Apache plugin constants."""
from typing import List, Dict
import pkg_resources
from certbot.compat import os
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""
"""Name of the mod_ssl config file as saved
in `certbot.configuration.NamespaceConfig.config_dir`."""
UPDATED_MOD_SSL_CONF_DIGEST = ".updated-options-ssl-apache-conf-digest.txt"
"""Name of the hash of the updated or informed mod_ssl_conf as saved in `IConfig.config_dir`."""
"""Name of the hash of the updated or informed mod_ssl_conf as saved
in `certbot.configuration.NamespaceConfig.config_dir`."""
# NEVER REMOVE A SINGLE HASH FROM THIS LIST UNLESS YOU KNOW EXACTLY WHAT YOU ARE DOING!
ALL_SSL_OPTIONS_HASHES = [
ALL_SSL_OPTIONS_HASHES: List[str] = [
'2086bca02db48daf93468332543c60ac6acdb6f0b58c7bfdf578a5d47092f82a',
'4844d36c9a0f587172d9fa10f4f1c9518e3bcfa1947379f155e16a70a728c21a',
'5a922826719981c0a234b1fbcd495f3213e49d2519e845ea0748ba513044b65b',
@@ -34,39 +38,39 @@ AUGEAS_LENS_DIR = pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "augeas_lens"))
"""Path to the Augeas lens directory"""
REWRITE_HTTPS_ARGS = [
REWRITE_HTTPS_ARGS: List[str] = [
"^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[L,NE,R=permanent]"]
"""Apache version<2.3.9 rewrite rule arguments used for redirections to
https vhost"""
REWRITE_HTTPS_ARGS_WITH_END = [
REWRITE_HTTPS_ARGS_WITH_END: List[str] = [
"^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[END,NE,R=permanent]"]
"""Apache version >= 2.3.9 rewrite rule arguments used for redirections to
https vhost"""
OLD_REWRITE_HTTPS_ARGS = [
OLD_REWRITE_HTTPS_ARGS: List[List[str]] = [
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[L,QSA,R=permanent]"],
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[END,QSA,R=permanent]"]]
HSTS_ARGS = ["always", "set", "Strict-Transport-Security",
HSTS_ARGS: List[str] = ["always", "set", "Strict-Transport-Security",
"\"max-age=31536000\""]
"""Apache header arguments for HSTS"""
UIR_ARGS = ["always", "set", "Content-Security-Policy",
"upgrade-insecure-requests"]
UIR_ARGS: List[str] = ["always", "set", "Content-Security-Policy", "upgrade-insecure-requests"]
HEADER_ARGS = {"Strict-Transport-Security": HSTS_ARGS,
"Upgrade-Insecure-Requests": UIR_ARGS}
HEADER_ARGS: Dict[str, List[str]] = {
"Strict-Transport-Security": HSTS_ARGS, "Upgrade-Insecure-Requests": UIR_ARGS,
}
AUTOHSTS_STEPS = [60, 300, 900, 3600, 21600, 43200, 86400]
AUTOHSTS_STEPS: List[int] = [60, 300, 900, 3600, 21600, 43200, 86400]
"""AutoHSTS increase steps: 1min, 5min, 15min, 1h, 6h, 12h, 24h"""
AUTOHSTS_PERMANENT = 31536000
AUTOHSTS_PERMANENT: int = 31536000
"""Value for the last max-age of HSTS"""
AUTOHSTS_FREQ = 172800
AUTOHSTS_FREQ: int = 172800
"""Minimum time since last increase to perform a new one: 48h"""
MANAGED_COMMENT = "DO NOT REMOVE - Managed by Certbot"
MANAGED_COMMENT_ID = MANAGED_COMMENT+", VirtualHost id: {0}"
MANAGED_COMMENT: str = "DO NOT REMOVE - Managed by Certbot"
MANAGED_COMMENT_ID: str = MANAGED_COMMENT + ", VirtualHost id: {0}"
"""Managed by Certbot comments and the VirtualHost identification template"""

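A brief usage sketch for the typed constants above; the values are copied from the diff, while the VirtualHost id passed to format() is a made-up example.

from typing import List

MANAGED_COMMENT: str = "DO NOT REMOVE - Managed by Certbot"
MANAGED_COMMENT_ID: str = MANAGED_COMMENT + ", VirtualHost id: {0}"
AUTOHSTS_STEPS: List[int] = [60, 300, 900, 3600, 21600, 43200, 86400]

# The {0} placeholder takes a VirtualHost identifier at runtime (id here is hypothetical):
print(MANAGED_COMMENT_ID.format("4386a73c"))
# AutoHSTS walks max-age through these steps before the permanent value is set:
print(f"first step: {AUTOHSTS_STEPS[0]}s, last step: {AUTOHSTS_STEPS[-1]}s")
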
View File

@@ -1,21 +1,23 @@
"""Contains UI methods for Apache operations."""
import logging
import zope.component
from typing import Iterable
from typing import List
from typing import Optional
from typing import Tuple
from certbot import errors
from certbot import interfaces
from certbot.compat import os
import certbot.display.util as display_util
from certbot.display import util as display_util
from certbot_apache._internal import obj
logger = logging.getLogger(__name__)
def select_vhost_multiple(vhosts):
def select_vhost_multiple(vhosts: Optional[List[obj.VirtualHost]]) -> List[obj.VirtualHost]:
"""Select multiple Vhosts to install the certificate for
:param vhosts: Available Apache VirtualHosts
:type vhosts: :class:`list` of type `~obj.Vhost`
:type vhosts: :class:`list` of type `~obj.VirtualHost`
:returns: List of VirtualHosts
:rtype: :class:`list` of type `~obj.VirtualHost`
@@ -26,7 +28,7 @@ def select_vhost_multiple(vhosts):
# Remove the extra newline from the last entry
if tags_list:
tags_list[-1] = tags_list[-1][:-1]
code, names = zope.component.getUtility(interfaces.IDisplay).checklist(
code, names = display_util.checklist(
"Which VirtualHosts would you like to install the wildcard certificate for?",
tags=tags_list, force_interactive=True)
if code == display_util.OK:
@@ -34,7 +36,8 @@ def select_vhost_multiple(vhosts):
return return_vhosts
return []
def _reversemap_vhosts(names, vhosts):
def _reversemap_vhosts(names: Iterable[str], vhosts: List[obj.VirtualHost]):
"""Helper function for select_vhost_multiple for mapping string
representations back to actual vhost objects"""
return_vhosts = []
@@ -45,9 +48,11 @@ def _reversemap_vhosts(names, vhosts):
return_vhosts.append(vhost)
return return_vhosts
def select_vhost(domain, vhosts):
def select_vhost(domain: str, vhosts: List[obj.VirtualHost]) -> Optional[obj.VirtualHost]:
"""Select an appropriate Apache Vhost.
:param domain: Domain to select
:param vhosts: Available Apache VirtualHosts
:type vhosts: :class:`list` of type `~obj.Vhost`
@@ -62,7 +67,8 @@ def select_vhost(domain, vhosts):
return vhosts[tag]
return None
def _vhost_menu(domain, vhosts):
def _vhost_menu(domain: str, vhosts: List[obj.VirtualHost]) -> Tuple[str, int]:
"""Select an appropriate Apache Vhost.
:param vhosts: Available Apache Virtual Hosts
@@ -103,22 +109,22 @@ def _vhost_menu(domain, vhosts):
https="HTTPS" if vhost.ssl else "",
active="Enabled" if vhost.enabled else "",
fn_size=filename_size,
name_size=disp_name_size)
name_size=disp_name_size),
)
try:
code, tag = zope.component.getUtility(interfaces.IDisplay).menu(
"We were unable to find a vhost with a ServerName "
"or Address of {0}.{1}Which virtual host would you "
"like to choose?".format(domain, os.linesep),
code, tag = display_util.menu(
f"We were unable to find a vhost with a ServerName "
f"or Address of {domain}.{os.linesep}Which virtual host would you "
f"like to choose?",
choices, force_interactive=True)
except errors.MissingCommandlineFlag:
msg = (
"Encountered vhost ambiguity when trying to find a vhost for "
"{0} but was unable to ask for user "
"guidance in non-interactive mode. Certbot may need "
"vhosts to be explicitly labelled with ServerName or "
"ServerAlias directives.".format(domain))
f"Encountered vhost ambiguity when trying to find a vhost for "
f"{domain} but was unable to ask for user "
f"guidance in non-interactive mode. Certbot may need "
f"vhosts to be explicitly labelled with ServerName or "
f"ServerAlias directives.")
logger.error(msg)
raise errors.MissingCommandlineFlag(msg)
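
The recurring change in this file is replacing zope.component.getUtility(interfaces.IDisplay) lookups with direct certbot.display.util calls. A condensed sketch of the checklist flow, reusing only the call shape already shown in the diff:

from certbot.display import util as display_util

def pick_vhost_names(tags_list):
    # Ask the user to tick VirtualHosts; returns the chosen labels or [].
    code, names = display_util.checklist(
        "Which VirtualHosts would you like to install the wildcard certificate for?",
        tags=tags_list, force_interactive=True)
    if code == display_util.OK:
        return names
    return []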

View File

@@ -1,7 +1,16 @@
""" Dual ParserNode implementation """
from typing import Any
from typing import Callable
from typing import List
from typing import Optional
from typing import Sequence
from typing import Set
from typing import Tuple
from certbot_apache._internal import apacheparser
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser
from certbot_apache._internal import interfaces
class DualNodeBase:
@@ -9,12 +18,12 @@ class DualNodeBase:
base class for dual parser interface classes. This class handles runtime
attribute value assertions."""
def save(self, msg): # pragma: no cover
def save(self, msg: str): # pragma: no cover
""" Call save for both parsers """
self.primary.save(msg)
self.secondary.save(msg)
def __getattr__(self, aname):
def __getattr__(self, aname: str) -> Any:
""" Attribute value assertion """
firstval = getattr(self.primary, aname)
secondval = getattr(self.secondary, aname)
@@ -28,11 +37,13 @@ class DualNodeBase:
assertions.assertEqualSimple(firstval, secondval)
return firstval
def find_ancestors(self, name):
def find_ancestors(self, name: str) -> Sequence[interfaces.ParserNode]:
""" Traverses the ancestor tree and returns ancestors matching name """
return self._find_helper(DualBlockNode, "find_ancestors", name)
def _find_helper(self, nodeclass, findfunc, search, **kwargs):
def _find_helper(
self, nodeclass: Callable, findfunc: str, search: str, **kwargs: Any
) -> List[apacheparser.ApacheBlockNode]:
"""A helper for find_* functions. The function specific attributes should
be passed as keyword arguments.
@@ -75,7 +86,7 @@ class DualNodeBase:
class DualCommentNode(DualNodeBase):
""" Dual parser implementation of CommentNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
""" This initialization implementation allows ordinary initialization
of CommentNode objects as well as creating a DualCommentNode object
using precreated or fetched CommentNode objects if provided as optional
@@ -107,7 +118,7 @@ class DualCommentNode(DualNodeBase):
class DualDirectiveNode(DualNodeBase):
""" Dual parser implementation of DirectiveNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
""" This initialization implementation allows ordinary initialization
of DirectiveNode objects as well as creating a DualDirectiveNode object
using precreated or fetched DirectiveNode objects if provided as optional
@@ -118,8 +129,6 @@ class DualDirectiveNode(DualNodeBase):
:param DirectiveNode primary: Primary pre-created DirectiveNode, mainly
used when creating new DualParser nodes using add_* methods.
:param DirectiveNode secondary: Secondary pre-created DirectiveNode
"""
kwargs.setdefault("primary", None)
@@ -132,8 +141,12 @@ class DualDirectiveNode(DualNodeBase):
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasDirectiveNode(**kwargs)
self.secondary = apacheparser.ApacheDirectiveNode(**kwargs)
self.primary = augeasparser.AugeasDirectiveNode(
**kwargs
)
self.secondary = apacheparser.ApacheDirectiveNode(
**kwargs
)
assertions.assertEqual(self.primary, self.secondary)
@@ -149,7 +162,7 @@ class DualDirectiveNode(DualNodeBase):
class DualBlockNode(DualNodeBase):
""" Dual parser implementation of BlockNode interface """
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
""" This initialization implementation allows ordinary initialization
of BlockNode objects as well as creating a DualBlockNode object
using precreated or fetched BlockNode objects if provided as optional
@@ -164,8 +177,8 @@ class DualBlockNode(DualNodeBase):
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
primary: Optional[augeasparser.AugeasBlockNode] = kwargs.pop("primary")
secondary: Optional[apacheparser.ApacheBlockNode] = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
@@ -177,7 +190,9 @@ class DualBlockNode(DualNodeBase):
assertions.assertEqual(self.primary, self.secondary)
def add_child_block(self, name, parameters=None, position=None):
def add_child_block(
self, name: str, parameters: Optional[str] = None, position: Optional[int] = None
) -> "DualBlockNode":
""" Creates a new child BlockNode, asserts that both implementations
did it in a similar way, and returns a newly created DualBlockNode object
encapsulating both of the newly created objects """
@@ -185,10 +200,11 @@ class DualBlockNode(DualNodeBase):
primary_new = self.primary.add_child_block(name, parameters, position)
secondary_new = self.secondary.add_child_block(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_block = DualBlockNode(primary=primary_new, secondary=secondary_new)
return new_block
return DualBlockNode(primary=primary_new, secondary=secondary_new)
def add_child_directive(self, name, parameters=None, position=None):
def add_child_directive(
self, name: str, parameters: Optional[str] = None, position: Optional[int] = None
) -> DualDirectiveNode:
""" Creates a new child DirectiveNode, asserts that both implementations
did it in a similar way, and returns a newly created DualDirectiveNode
object encapsulating both of the newly created objects """
@@ -196,21 +212,25 @@ class DualBlockNode(DualNodeBase):
primary_new = self.primary.add_child_directive(name, parameters, position)
secondary_new = self.secondary.add_child_directive(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_dir = DualDirectiveNode(primary=primary_new, secondary=secondary_new)
return new_dir
return DualDirectiveNode(primary=primary_new, secondary=secondary_new)
def add_child_comment(self, comment="", position=None):
def add_child_comment(
self, comment: str = "", position: Optional[int] = None
) -> DualCommentNode:
""" Creates a new child CommentNode, asserts that both implementations
did it in a similar way, and returns a newly created DualCommentNode
object encapsulating both of the newly created objects """
primary_new = self.primary.add_child_comment(comment, position)
secondary_new = self.secondary.add_child_comment(comment, position)
primary_new = self.primary.add_child_comment(comment=comment, position=position)
secondary_new = self.secondary.add_child_comment(name=comment, position=position)
assertions.assertEqual(primary_new, secondary_new)
new_comment = DualCommentNode(primary=primary_new, secondary=secondary_new)
return new_comment
return DualCommentNode(primary=primary_new, secondary=secondary_new)
def _create_matching_list(self, primary_list, secondary_list):
def _create_matching_list(
self,
primary_list: List[interfaces.ParserNode],
secondary_list: List[interfaces.ParserNode],
) -> List[Tuple[interfaces.ParserNode, interfaces.ParserNode]]:
""" Matches the list of primary_list to a list of secondary_list and
returns a list of tuples. This is used to create results for find_
methods.
@@ -237,7 +257,7 @@ class DualBlockNode(DualNodeBase):
raise AssertionError("Could not find a matching node.")
return matched
def find_blocks(self, name, exclude=True):
def find_blocks(self, name: str, exclude: bool = True) -> List[apacheparser.ApacheBlockNode]:
"""
Performs a search for BlockNodes using both implementations and does simple
checks for results. This is built upon the assumption that unimplemented
@@ -249,7 +269,8 @@ class DualBlockNode(DualNodeBase):
return self._find_helper(DualBlockNode, "find_blocks", name,
exclude=exclude)
def find_directives(self, name, exclude=True):
def find_directives(self, name: str, exclude: bool = True
) -> Sequence[apacheparser.ApacheDirectiveNode]:
"""
Performs a search for DirectiveNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
@@ -261,7 +282,7 @@ class DualBlockNode(DualNodeBase):
return self._find_helper(DualDirectiveNode, "find_directives", name,
exclude=exclude)
def find_comments(self, comment):
def find_comments(self, comment: str) -> Sequence[apacheparser.ApacheParserNode]:
"""
Performs a search for CommentNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
@@ -272,7 +293,7 @@ class DualBlockNode(DualNodeBase):
return self._find_helper(DualCommentNode, "find_comments", comment)
def delete_child(self, child):
def delete_child(self, child: "DualBlockNode"):
"""Deletes a child from the ParserNode implementations. The actual
ParserNode implementations are used here directly in order to be able
to match a child to the list of children."""
@@ -280,7 +301,7 @@ class DualBlockNode(DualNodeBase):
self.primary.delete_child(child.primary)
self.secondary.delete_child(child.secondary)
def unsaved_files(self):
def unsaved_files(self) -> Set[str]:
""" Fetches the list of unsaved file paths and asserts that the lists
match """
primary_files = self.primary.unsaved_files()
@@ -289,7 +310,7 @@ class DualBlockNode(DualNodeBase):
return primary_files
def parsed_paths(self):
def parsed_paths(self) -> List[str]:
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for

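A minimal sketch (stand-in classes, not the plugin's) of the dual-parser delegation that DualNodeBase.__getattr__ implements above: attribute reads are answered by the primary node only after the secondary node is consulted and the two values are asserted to match.

class DualNodeSketch:
    def __init__(self, primary, secondary):
        self.primary = primary
        self.secondary = secondary

    def __getattr__(self, aname):
        # Only called for attributes not found normally (i.e. not primary/secondary).
        firstval = getattr(self.primary, aname)
        secondval = getattr(self.secondary, aname)
        assert firstval == secondval, f"parsers disagree on {aname!r}"
        return firstval

class Node:
    def __init__(self, name):
        self.name = name

dual = DualNodeSketch(Node("VirtualHost"), Node("VirtualHost"))
print(dual.name)  # both implementations agree, so the primary value is returned
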
View File

@@ -1,5 +1,6 @@
""" Entry point for Apache Plugin """
from distutils.version import LooseVersion
from typing import Callable
from typing import Dict
from certbot import util
from certbot_apache._internal import configurator
@@ -10,8 +11,9 @@ from certbot_apache._internal import override_debian
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_gentoo
from certbot_apache._internal import override_suse
from certbot_apache._internal import override_void
OVERRIDE_CLASSES = {
OVERRIDE_CLASSES: Dict[str, Callable] = {
"arch": override_arch.ArchConfigurator,
"cloudlinux": override_centos.CentOSConfigurator,
"darwin": override_darwin.DarwinConfigurator,
@@ -35,6 +37,7 @@ OVERRIDE_CLASSES = {
"sles": override_suse.OpenSUSEConfigurator,
"scientific": override_centos.CentOSConfigurator,
"scientific linux": override_centos.CentOSConfigurator,
"void": override_void.VoidConfigurator,
}
@@ -45,7 +48,8 @@ def get_configurator():
override_class = None
# Special case for older Fedora versions
if os_name == 'fedora' and LooseVersion(os_version) < LooseVersion('29'):
min_version = util.parse_loose_version('29')
if os_name == 'fedora' and util.parse_loose_version(os_version) < min_version:
os_name = 'fedora_old'
try:
@@ -55,8 +59,7 @@ def get_configurator():
os_like = util.get_systemd_os_like()
if os_like:
for os_name in os_like:
if os_name in OVERRIDE_CLASSES.keys():
override_class = OVERRIDE_CLASSES[os_name]
override_class = OVERRIDE_CLASSES.get(os_name)
if not override_class:
# No override class found, return the generic configurator
override_class = configurator.ApacheConfigurator
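
A hedged sketch of the selection flow in get_configurator above: look up the detected OS name in the override table, fall back to the ID_LIKE entries, and finally to the generic configurator. The table values here are placeholder strings rather than the real classes.

OVERRIDE_CLASSES = {
    "debian": "DebianConfigurator",
    "centos": "CentOSConfigurator",
}

def pick_override(os_name, os_like, default="ApacheConfigurator"):
    override = OVERRIDE_CLASSES.get(os_name)
    if not override:
        for like_name in os_like:
            override = OVERRIDE_CLASSES.get(like_name) or override
    return override or default

print(pick_override("rocky", ["rhel", "centos", "fedora"]))  # -> CentOSConfigurator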

View File

@@ -1,16 +1,23 @@
"""A class that performs HTTP-01 challenges for Apache"""
import errno
import logging
from typing import Any
from typing import List
from typing import Set
from typing import TYPE_CHECKING
from acme.challenges import HTTP01Response
from certbot import errors
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot_apache._internal.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache._internal.obj import VirtualHost
from certbot_apache._internal.parser import get_aug_path
if TYPE_CHECKING:
from certbot_apache._internal.configurator import ApacheConfigurator # pragma: no cover
logger = logging.getLogger(__name__)
@@ -46,8 +53,9 @@ class ApacheHttp01(common.ChallengePerformer):
</Location>
"""
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
def __init__(self, configurator: "ApacheConfigurator") -> None:
super().__init__(configurator)
self.configurator: "ApacheConfigurator"
self.challenge_conf_pre = os.path.join(
self.configurator.conf("challenge-location"),
"le_http_01_challenge_pre.conf")
@@ -59,7 +67,7 @@ class ApacheHttp01(common.ChallengePerformer):
"http_challenges")
self.moded_vhosts: Set[VirtualHost] = set()
def perform(self):
def perform(self) -> List[KeyAuthorizationAnnotatedChallenge]:
"""Perform all HTTP-01 challenges."""
if not self.achalls:
return []
@@ -67,8 +75,7 @@ class ApacheHttp01(common.ChallengePerformer):
# About to make temporary changes to the config
self.configurator.save("Changes before challenge setup", True)
self.configurator.ensure_listen(str(
self.configurator.config.http01_port))
self.configurator.ensure_listen(str(self.configurator.config.http01_port))
self.prepare_http01_modules()
responses = self._set_up_challenges()
@@ -79,7 +86,7 @@ class ApacheHttp01(common.ChallengePerformer):
return responses
def prepare_http01_modules(self):
def prepare_http01_modules(self) -> None:
"""Make sure that we have the needed modules available for http01"""
if self.configurator.conf("handle-modules"):
@@ -92,13 +99,13 @@ class ApacheHttp01(common.ChallengePerformer):
if mod + "_module" not in self.configurator.parser.modules:
self.configurator.enable_mod(mod, temp=True)
def _mod_config(self):
def _mod_config(self) -> None:
selected_vhosts: List[VirtualHost] = []
http_port = str(self.configurator.config.http01_port)
# Search for VirtualHosts matching by name
for chall in self.achalls:
# Search for matching VirtualHosts
for vh in self._matching_vhosts(chall.domain):
selected_vhosts.append(vh)
selected_vhosts += self._matching_vhosts(chall.domain)
# Ensure that we have one or more VirtualHosts that we can continue
# with. (one that listens to port configured with --http-01-port)
@@ -107,9 +114,13 @@ class ApacheHttp01(common.ChallengePerformer):
if any(a.is_wildcard() or a.get_port() == http_port for a in vhost.addrs):
found = True
if not found:
for vh in self._relevant_vhosts():
selected_vhosts.append(vh)
# If there's at least one eligible VirtualHost, also add all unnamed VirtualHosts
# because they might match at runtime (#8890)
if found:
selected_vhosts += self._unnamed_vhosts()
# Otherwise, add every Virtualhost which listens on the right port
else:
selected_vhosts += self._relevant_vhosts()
# Add the challenge configuration
for vh in selected_vhosts:
@@ -137,7 +148,7 @@ class ApacheHttp01(common.ChallengePerformer):
with open(self.challenge_conf_post, "w") as new_conf:
new_conf.write(config_text_post)
def _matching_vhosts(self, domain):
def _matching_vhosts(self, domain: str) -> List[VirtualHost]:
"""Return all VirtualHost objects that have the requested domain name or
a wildcard name that would match the domain in ServerName or ServerAlias
directive.
@@ -151,9 +162,9 @@ class ApacheHttp01(common.ChallengePerformer):
return matching_vhosts
def _relevant_vhosts(self):
def _relevant_vhosts(self) -> List[VirtualHost]:
http01_port = str(self.configurator.config.http01_port)
relevant_vhosts = []
relevant_vhosts: List[VirtualHost] = []
for vhost in self.configurator.vhosts:
if any(a.is_wildcard() or a.get_port() == http01_port for a in vhost.addrs):
if not vhost.ssl:
@@ -167,7 +178,11 @@ class ApacheHttp01(common.ChallengePerformer):
return relevant_vhosts
def _set_up_challenges(self):
def _unnamed_vhosts(self) -> List[VirtualHost]:
"""Return all VirtualHost objects with no ServerName"""
return [vh for vh in self.configurator.vhosts if vh.name is None]
def _set_up_challenges(self) -> List[HTTP01Response]:
if not os.path.isdir(self.challenge_dir):
old_umask = filesystem.umask(0o022)
try:
@@ -185,10 +200,12 @@ class ApacheHttp01(common.ChallengePerformer):
return responses
def _set_up_challenge(self, achall):
def _set_up_challenge(self, achall: KeyAuthorizationAnnotatedChallenge) -> HTTP01Response:
response: HTTP01Response
validation: Any
response, validation = achall.response_and_validation()
name = os.path.join(self.challenge_dir, achall.chall.encode("token"))
name: str = os.path.join(self.challenge_dir, achall.chall.encode("token"))
self.configurator.reverter.register_file_creation(True, name)
with open(name, 'wb') as f:
@@ -197,7 +214,7 @@ class ApacheHttp01(common.ChallengePerformer):
return response
def _set_up_include_directives(self, vhost):
def _set_up_include_directives(self, vhost: VirtualHost) -> None:
"""Includes override configuration to the beginning and to the end of
VirtualHost. Note that this include isn't added to Augeas search tree"""
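
A simplified sketch of the selection logic added to _mod_config above (issue #8890): once name matching has produced at least one vhost on the HTTP-01 port, unnamed vhosts are also included because they can still match at request time; otherwise every vhost listening on that port is used. Vhost below is a stand-in dataclass, not the plugin's VirtualHost, and the port check is simplified.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Vhost:
    name: Optional[str]
    port: str

def select_vhosts(matched: List[Vhost], all_vhosts: List[Vhost], http_port: str) -> List[Vhost]:
    selected = list(matched)
    found = any(vh.port == http_port for vh in selected)
    if found:
        selected += [vh for vh in all_vhosts if vh.name is None]       # unnamed vhosts too
    else:
        selected += [vh for vh in all_vhosts if vh.port == http_port]  # any vhost on the port
    return selected

vhosts = [Vhost("example.com", "80"), Vhost(None, "80"), Vhost("other.org", "443")]
print(select_vhosts([vhosts[0]], vhosts, "80"))  # named match plus the unnamed vhost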

View File

@@ -100,9 +100,12 @@ For this reason the internal representation of data should not ignore the case.
"""
import abc
from typing import Any
from typing import List
from typing import Optional
class ParserNode(object, metaclass=abc.ABCMeta):
class ParserNode(metaclass=abc.ABCMeta):
"""
ParserNode is the basic building block of the tree of such nodes,
representing the structure of the configuration. It is largely meant to keep
@@ -146,7 +149,7 @@ class ParserNode(object, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
"""
Initializes the ParserNode instance, and sets the ParserNode specific
instance variables. This is not meant to be used directly, but through
@@ -170,7 +173,7 @@ class ParserNode(object, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def save(self, msg):
def save(self, msg: str) -> None:
"""
Save traverses the children, and attempts to write the AST to disk for
all the objects that are marked dirty. The actual operation of course
@@ -189,7 +192,7 @@ class ParserNode(object, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def find_ancestors(self, name):
def find_ancestors(self, name: str):
"""
Traverses the ancestor tree up, searching for BlockNodes with a specific
name.
@@ -220,7 +223,7 @@ class CommentNode(ParserNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any):
"""
Initializes the CommentNode instance and sets its instance variables.
@@ -238,10 +241,12 @@ class CommentNode(ParserNode, metaclass=abc.ABCMeta):
created or changed after the last save. Default: False.
:type dirty: bool
"""
super().__init__(ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {})) # pragma: no cover
super().__init__( # pragma: no cover
ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {}),
)
class DirectiveNode(ParserNode, metaclass=abc.ABCMeta):
@@ -272,7 +277,7 @@ class DirectiveNode(ParserNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def __init__(self, **kwargs):
def __init__(self, **kwargs: Any) -> None:
"""
Initializes the DirectiveNode instance and sets its instance variables.
@@ -302,17 +307,19 @@ class DirectiveNode(ParserNode, metaclass=abc.ABCMeta):
:type enabled: bool
"""
super().__init__(ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {})) # pragma: no cover
super().__init__( # pragma: no cover
ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {}),
)
@abc.abstractmethod
def set_parameters(self, parameters):
def set_parameters(self, parameters: List[str]) -> None:
"""
Sets the sequence of parameters for this ParserNode object without
whitespaces. While the whitespaces for parameters are discarded when using
this method, the whitespacing preceeding the ParserNode itself should be
this method, the whitespacing preceding the ParserNode itself should be
kept intact.
:param list parameters: sequence of parameters
@@ -361,10 +368,12 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def add_child_block(self, name, parameters=None, position=None):
def add_child_block(
self, name: str, parameters: Optional[List[str]] = None, position: Optional[int] = None
) -> "BlockNode":
"""
Adds a new BlockNode child node with provided values and marks the callee
BlockNode dirty. This is used to add new children to the AST. The preceeding
BlockNode dirty. This is used to add new children to the AST. The preceding
whitespaces should not be added based on the ancestor or siblings for the
newly created object. This is to match the current behavior of the legacy
parser implementation.
@@ -381,11 +390,13 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def add_child_directive(self, name, parameters=None, position=None):
def add_child_directive(
self, name: str, parameters: Optional[List[str]] = None, position: Optional[int] = None
) -> "DirectiveNode":
"""
Adds a new DirectiveNode child node with provided values and marks the
callee BlockNode dirty. This is used to add new children to the AST. The
preceeding whitespaces should not be added based on the ancestor or siblings
preceding whitespaces should not be added based on the ancestor or siblings
for the newly created object. This is to match the current behavior of the
legacy parser implementation.
@@ -402,11 +413,11 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def add_child_comment(self, comment="", position=None):
def add_child_comment(self, comment: str = "", position: Optional[int] = None) -> "CommentNode":
"""
Adds a new CommentNode child node with provided value and marks the
callee BlockNode dirty. This is used to add new children to the AST. The
preceeding whitespaces should not be added based on the ancestor or siblings
preceding whitespaces should not be added based on the ancestor or siblings
for the newly created object. This is to match the current behavior of the
legacy parser implementation.
@@ -422,7 +433,7 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def find_blocks(self, name, exclude=True):
def find_blocks(self, name: str, exclude: bool = True) -> List["BlockNode"]:
"""
Find a configuration block by name. This method walks the child tree of
ParserNodes under the instance it was called from. This way it is possible
@@ -439,7 +450,23 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def find_directives(self, name, exclude=True):
def find_comments(self, comment: str) -> List["CommentNode"]:
"""
Find comments with value containing the search term.
This method walks the child tree of ParserNodes under the instance it was
called from. This way it is possible to search for the whole configuration
tree, when starting from root node, or to do a partial search when starting
from a specified branch. The lookup should be case sensitive.
:param str comment: The content of comment to search for
:returns: A list of found CommentNode objects.
"""
@abc.abstractmethod
def find_directives(self, name: str, exclude: bool = True):
"""
Find a directive by name. This method walks the child tree of ParserNodes
under the instance it was called from. This way it is possible to search
@@ -457,23 +484,7 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def find_comments(self, comment):
"""
Find comments with value containing the search term.
This method walks the child tree of ParserNodes under the instance it was
called from. This way it is possible to search for the whole configuration
tree, when starting from root node, or to do a partial search when starting
from a specified branch. The lookup should be case sensitive.
:param str comment: The content of comment to search for
:returns: A list of found CommentNode objects.
"""
@abc.abstractmethod
def delete_child(self, child):
def delete_child(self, child: "ParserNode") -> None:
"""
Remove a specified child node from the list of children of the called
BlockNode object.
@@ -483,7 +494,7 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def unsaved_files(self):
def unsaved_files(self) -> List[str]:
"""
Returns a list of file paths that have been changed since the last save
(or the initial configuration parse). The intended use for this method
@@ -496,7 +507,7 @@ class BlockNode(DirectiveNode, metaclass=abc.ABCMeta):
"""
@abc.abstractmethod
def parsed_paths(self):
def parsed_paths(self) -> List[str]:
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for

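A tiny self-contained illustration of the abc machinery these interfaces rely on: a class whose metaclass is abc.ABCMeta cannot be instantiated until a subclass implements every @abc.abstractmethod. The Node/LeafNode names are illustrative, not part of the plugin.

import abc
from typing import List

class Node(metaclass=abc.ABCMeta):
    @abc.abstractmethod
    def find_blocks(self, name: str, exclude: bool = True) -> List["Node"]:
        """Find child blocks by name."""

class LeafNode(Node):
    def find_blocks(self, name: str, exclude: bool = True) -> List["Node"]:
        return []

try:
    Node()  # abstract: raises TypeError
except TypeError as err:
    print(err)
print(LeafNode().find_blocks("VirtualHost"))  # [] from the concrete subclass
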
View File

@@ -1,14 +1,19 @@
"""Module contains classes used by the Apache Configurator."""
import re
from typing import Any
from typing import Iterable
from typing import Optional
from typing import Pattern
from typing import Set
from certbot.plugins import common
from certbot_apache._internal import interfaces
class Addr(common.Addr):
"""Represents an Apache address."""
def __eq__(self, other):
def __eq__(self, other: Any):
"""This is defined as equivalent within Apache.
ip_addr:* == ip_addr
@@ -21,19 +26,19 @@ class Addr(common.Addr):
return False
def __repr__(self):
return "certbot_apache._internal.obj.Addr(" + repr(self.tup) + ")"
return f"certbot_apache._internal.obj.Addr({repr(self.tup)})"
def __hash__(self): # pylint: disable=useless-super-delegation
# Python 3 requires explicit overridden for __hash__ if __eq__ or
# __cmp__ is overridden. See https://bugs.python.org/issue2235
return super().__hash__()
def _addr_less_specific(self, addr):
def _addr_less_specific(self, addr: "Addr") -> bool:
"""Returns if addr.get_addr() is more specific than self.get_addr()."""
# pylint: disable=protected-access
return addr._rank_specific_addr() > self._rank_specific_addr()
def _rank_specific_addr(self):
def _rank_specific_addr(self) -> int:
"""Returns numerical rank for get_addr()
:returns: 2 - FQ, 1 - wildcard, 0 - _default_
@@ -46,7 +51,7 @@ class Addr(common.Addr):
return 1
return 2
def conflicts(self, addr):
def conflicts(self, addr: "Addr") -> bool:
r"""Returns if address could conflict with correct function of self.
Could addr take away service provided by self within Apache?
@@ -74,11 +79,11 @@ class Addr(common.Addr):
return True
return False
def is_wildcard(self):
def is_wildcard(self) -> bool:
"""Returns if address has a wildcard port."""
return self.tup[1] == "*" or not self.tup[1]
def get_sni_addr(self, port):
def get_sni_addr(self, port: str) -> common.Addr:
"""Returns the least specific address that resolves on the port.
Examples:
@@ -118,13 +123,16 @@ class VirtualHost:
"""
# ?: is used for not returning enclosed characters
strip_name = re.compile(r"^(?:.+://)?([^ :$]*)")
strip_name: Pattern = re.compile(r"^(?:.+://)?([^ :$]*)")
def __init__(self, filep, path, addrs, ssl, enabled, name=None,
aliases=None, modmacro=False, ancestor=None, node=None):
def __init__(
self, filepath: str, path: str, addrs: Set["Addr"],
ssl: bool, enabled: bool, name: Optional[str] = None,
aliases: Optional[Set[str]] = None, modmacro: bool = False,
ancestor: Optional["VirtualHost"] = None, node=None):
"""Initialize a VH."""
self.filep = filep
self.filep = filepath
self.path = path
self.addrs = addrs
self.name = name
@@ -133,9 +141,9 @@ class VirtualHost:
self.enabled = enabled
self.modmacro = modmacro
self.ancestor = ancestor
self.node = node
self.node: interfaces.BlockNode = node
def get_names(self):
def get_names(self) -> Set[str]:
"""Return a set of all names."""
all_names: Set[str] = set()
all_names.update(self.aliases)
@@ -147,37 +155,26 @@ class VirtualHost:
def __str__(self):
return (
"File: {filename}\n"
"Vhost path: {vhpath}\n"
"Addresses: {addrs}\n"
"Name: {name}\n"
"Aliases: {aliases}\n"
"TLS Enabled: {tls}\n"
"Site Enabled: {active}\n"
"mod_macro Vhost: {modmacro}".format(
filename=self.filep,
vhpath=self.path,
addrs=", ".join(str(addr) for addr in self.addrs),
name=self.name if self.name is not None else "",
aliases=", ".join(name for name in self.aliases),
tls="Yes" if self.ssl else "No",
active="Yes" if self.enabled else "No",
modmacro="Yes" if self.modmacro else "No"))
f"File: {self.filep}\n"
f"Vhost path: {self.path}\n"
f"Addresses: {', '.join(str(addr) for addr in self.addrs)}\n"
f"Name: {self.name if self.name is not None else ''}\n"
f"Aliases: {', '.join(name for name in self.aliases)}\n"
f"TLS Enabled: {'Yes' if self.ssl else 'No'}\n"
f"Site Enabled: {'Yes' if self.enabled else 'No'}\n"
f"mod_macro Vhost: {'Yes' if self.modmacro else 'No'}"
)
def display_repr(self):
def display_repr(self) -> str:
"""Return a representation of VHost to be used in dialog"""
return (
"File: {filename}\n"
"Addresses: {addrs}\n"
"Names: {names}\n"
"HTTPS: {https}\n".format(
filename=self.filep,
addrs=", ".join(str(addr) for addr in self.addrs),
names=", ".join(self.get_names()),
https="Yes" if self.ssl else "No"))
f"File: {self.filep}\n"
f"Addresses: {', '.join(str(addr) for addr in self.addrs)}\n"
f"Names: {', '.join(self.get_names())}\n"
f"HTTPS: {'Yes' if self.ssl else 'No'}\n"
)
def __eq__(self, other):
def __eq__(self, other: Any) -> bool:
if isinstance(other, self.__class__):
return (self.filep == other.filep and self.path == other.path and
self.addrs == other.addrs and
@@ -193,7 +190,7 @@ class VirtualHost:
tuple(self.addrs), tuple(self.get_names()),
self.ssl, self.enabled, self.modmacro))
def conflicts(self, addrs):
def conflicts(self, addrs: Iterable[Addr]) -> bool:
"""See if vhost conflicts with any of the addrs.
This determines whether or not these addresses would/could overwrite
@@ -212,7 +209,7 @@ class VirtualHost:
return True
return False
def same_server(self, vhost, generic=False):
def same_server(self, vhost: "VirtualHost", generic: bool = False) -> bool:
"""Determines if the vhost is the same 'server'.
Used in redirection - indicates whether or not the two virtual hosts

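A short sketch of the address-specificity ranking documented in _rank_specific_addr above: "_default_" ranks lowest, a wildcard ranks in the middle, and a fully qualified address ranks highest; _addr_less_specific compares those ranks. The functions below are illustrative stand-ins operating on plain strings.

def rank_specific_addr(addr: str) -> int:
    # 2 - fully qualified, 1 - wildcard, 0 - _default_
    if addr == "_default_":
        return 0
    if addr == "*":
        return 1
    return 2

def addr_less_specific(this_addr: str, other_addr: str) -> bool:
    # Mirrors Addr._addr_less_specific: is other_addr more specific than this_addr?
    return rank_specific_addr(other_addr) > rank_specific_addr(this_addr)

print(addr_less_specific("*", "192.0.2.10"))         # True: the wildcard is less specific
print(addr_less_specific("192.0.2.10", "_default_"))  # False
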
View File

@@ -1,12 +1,8 @@
""" Distribution specific override class for Arch Linux """
import zope.interface
from certbot import interfaces
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
@zope.interface.provider(interfaces.IPluginFactory)
class ArchConfigurator(configurator.ApacheConfigurator):
"""Arch Linux specific ApacheConfigurator override class"""

View File

@@ -1,12 +1,10 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import logging
from typing import Any
from typing import cast
from typing import List
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.errors import MisconfigurationError
from certbot_apache._internal import apache_util
@@ -17,7 +15,6 @@ from certbot_apache._internal.configurator import OsOptions
logger = logging.getLogger(__name__)
@zope.interface.provider(interfaces.IPluginFactory)
class CentOSConfigurator(configurator.ApacheConfigurator):
"""CentOS specific ApacheConfigurator override class"""
@@ -34,7 +31,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
challenge_location="/etc/httpd/conf.d",
)
def config_test(self):
def config_test(self) -> None:
"""
Override config_test to mitigate configtest error in vanilla installation
of mod_ssl in Fedora. The error is caused by non-existent self-signed
@@ -53,9 +50,9 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
else:
raise
def _try_restart_fedora(self):
def _try_restart_fedora(self) -> None:
"""
Tries to restart httpd using systemctl to generate the self signed keypair.
Tries to restart httpd using systemctl to generate the self signed key pair.
"""
try:
@@ -66,7 +63,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
# Finish with actual config check to see if systemctl restart helped
super().config_test()
def _prepare_options(self):
def _prepare_options(self) -> None:
"""
Override the options dictionary initialization in order to support
alternative restart cmd used in CentOS.
@@ -76,13 +73,12 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
raise ValueError("OS option restart_cmd_alt must be set for CentOS.")
self.options.restart_cmd_alt[0] = self.options.ctl
def get_parser(self):
def get_parser(self) -> "CentOSParser":
"""Initializes the ApacheParser"""
return CentOSParser(
self.options.server_root, self.options.vhost_root,
self.version, configurator=self)
self.options.server_root, self, self.options.vhost_root, self.version)
def _deploy_cert(self, *args, **kwargs): # pylint: disable=arguments-differ
def _deploy_cert(self, *args: Any, **kwargs: Any): # pylint: disable=arguments-differ
"""
Override _deploy_cert in order to ensure that the Apache configuration
has "LoadModule ssl_module..." before parsing the VirtualHost configuration
@@ -92,7 +88,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
if self.version < (2, 4, 0):
self._deploy_loadmodule_ssl_if_needed()
def _deploy_loadmodule_ssl_if_needed(self):
def _deploy_loadmodule_ssl_if_needed(self) -> None:
"""
Add "LoadModule ssl_module <pre-existing path>" to main httpd.conf if
it doesn't exist there already.
@@ -114,14 +110,13 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
"use, and run Certbot again.")
raise MisconfigurationError(msg)
else:
loadmod_args = path_args
loadmod_args = [arg for arg in path_args if arg]
centos_parser: CentOSParser = cast(CentOSParser, self.parser)
if centos_parser.not_modssl_ifmodule(noarg_path):
if centos_parser.loc["default"] in noarg_path:
# LoadModule already in the main configuration file
if ("ifmodule/" in noarg_path.lower() or
"ifmodule[1]" in noarg_path.lower()):
if "ifmodule/" in noarg_path.lower() or "ifmodule[1]" in noarg_path.lower():
# It's the first or only IfModule in the file
return
# Populate the list of known !mod_ssl.c IfModules
@@ -152,8 +147,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
self.parser.aug.remove(loadmod_path)
# Create a new IfModule !mod_ssl.c if not already found on path
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c",
beginning=True)[:-1]
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c", beginning=True)[:-1]
if ssl_ifmod not in correct_ifmods:
self.parser.add_dir(ssl_ifmod, "LoadModule", loadmod_args)
correct_ifmods.append(ssl_ifmod)
@@ -163,24 +157,24 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
class CentOSParser(parser.ApacheParser):
"""CentOS specific ApacheParser override class"""
def __init__(self, *args, **kwargs):
def __init__(self, *args: Any, **kwargs: Any) -> None:
# CentOS specific configuration file for Apache
self.sysconfig_filep = "/etc/sysconfig/httpd"
self.sysconfig_filep: str = "/etc/sysconfig/httpd"
super().__init__(*args, **kwargs)
def update_runtime_variables(self):
def update_runtime_variables(self) -> None:
""" Override for update_runtime_variables for custom parsing """
# Opportunistic, works if SELinux not enforced
super().update_runtime_variables()
self.parse_sysconfig_var()
def parse_sysconfig_var(self):
def parse_sysconfig_var(self) -> None:
""" Parses Apache CLI options from CentOS configuration file """
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
for k in defines:
self.variables[k] = defines[k]
for k, v in defines.items():
self.variables[k] = v
def not_modssl_ifmodule(self, path):
def not_modssl_ifmodule(self, path: str) -> bool:
"""Checks if the provided Augeas path has argument !mod_ssl"""
if "ifmodule" not in path.lower():

View File
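The parse_sysconfig_var rewrite in the CentOS override above iterates over key/value pairs directly instead of re-indexing the dict for every key (the same change appears in the Fedora and Gentoo overrides below). A minimal self-contained sketch of that pattern; the helper name and data here are invented for illustration, while in the diff the pairs come from apache_util.parse_define_file:

from typing import Dict


def merge_defines(variables: Dict[str, str], defines: Dict[str, str]) -> None:
    # Same loop shape as parse_sysconfig_var above: one pass over the pairs,
    # no per-key lookup. variables.update(defines) would be equivalent here;
    # the explicit loop just keeps parity with the original code.
    for name, value in defines.items():
        variables[name] = value


env_vars: Dict[str, str] = {}
merge_defines(env_vars, {"OPTIONS": "-DSSL", "LANG": "C"})
print(env_vars)  # {'OPTIONS': '-DSSL', 'LANG': 'C'}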

@@ -1,12 +1,8 @@
""" Distribution specific override class for macOS """
import zope.interface
from certbot import interfaces
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
@zope.interface.provider(interfaces.IPluginFactory)
class DarwinConfigurator(configurator.ApacheConfigurator):
"""macOS specific ApacheConfigurator override class"""

View File

@@ -1,21 +1,18 @@
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
import logging
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
from certbot_apache._internal.obj import VirtualHost
logger = logging.getLogger(__name__)
@zope.interface.provider(interfaces.IPluginFactory)
class DebianConfigurator(configurator.ApacheConfigurator):
"""Debian specific ApacheConfigurator override class"""
@@ -26,7 +23,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
handle_sites=True,
)
def enable_site(self, vhost):
def enable_site(self, vhost: VirtualHost) -> None:
"""Enables an available site, Apache reload required.
.. note:: Does not make sure that the site correctly works or that all
@@ -71,7 +68,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
self.save_notes += "Enabled site %s\n" % vhost.filep
return None
def enable_mod(self, mod_name, temp=False):
def enable_mod(self, mod_name: str, temp: bool = False) -> None:
"""Enables module in Apache.
Both enables and reloads Apache so module is active.
@@ -117,16 +114,16 @@ class DebianConfigurator(configurator.ApacheConfigurator):
# Reload is not necessary as DUMP_RUN_CFG uses latest config.
self.parser.update_runtime_variables()
def _enable_mod_debian(self, mod_name, temp):
def _enable_mod_debian(self, mod_name: str, temp: bool) -> None:
"""Assumes mods-available, mods-enabled layout."""
# Generate reversal command.
# Try to be safe here... check that we can probably reverse before
# applying enmod command
if not util.exe_exists(self.options.dismod):
if (self.options.dismod is None or self.options.enmod is None
or not util.exe_exists(self.options.dismod)):
raise errors.MisconfigurationError(
"Unable to find a2dismod, please make sure a2enmod and "
"a2dismod are configured correctly for certbot.")
self.reverter.register_undo_command(
temp, [self.options.dismod, "-f", mod_name])
self.reverter.register_undo_command(temp, [self.options.dismod, "-f", mod_name])
util.run_script([self.options.enmod, mod_name])

View File
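The new guard in _enable_mod_debian above checks the Optional OS options for None before using them, which both raises a clearer error and lets a type checker narrow them from Optional[str] to str. A minimal sketch of that narrowing pattern; the function, its names, and the exception type are hypothetical stand-ins, not certbot APIs:

from typing import List, Optional


def build_dismod_command(dismod: Optional[str], mod_name: str) -> List[str]:
    # dismod plays the role of an OS option that may legitimately be None.
    # After this check, mypy treats dismod as str for the rest of the function.
    if dismod is None:
        raise RuntimeError("a2dismod path is not configured")
    return [dismod, "-f", mod_name]


print(build_dismod_command("/usr/sbin/a2dismod", "ssl"))
# ['/usr/sbin/a2dismod', '-f', 'ssl']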

@@ -1,8 +1,5 @@
""" Distribution specific override class for Fedora 29+ """
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
@@ -10,7 +7,6 @@ from certbot_apache._internal import parser
from certbot_apache._internal.configurator import OsOptions
@zope.interface.provider(interfaces.IPluginFactory)
class FedoraConfigurator(configurator.ApacheConfigurator):
"""Fedora 29+ specific ApacheConfigurator override class"""
@@ -27,7 +23,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
challenge_location="/etc/httpd/conf.d",
)
def config_test(self):
def config_test(self) -> None:
"""
Override config_test to mitigate configtest error in vanilla installation
of mod_ssl in Fedora. The error is caused by non-existent self-signed
@@ -39,15 +35,14 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
except errors.MisconfigurationError:
self._try_restart_fedora()
def get_parser(self):
def get_parser(self) -> "FedoraParser":
"""Initializes the ApacheParser"""
return FedoraParser(
self.options.server_root, self.options.vhost_root,
self.version, configurator=self)
self.options.server_root, self, self.options.vhost_root, self.version)
def _try_restart_fedora(self):
def _try_restart_fedora(self) -> None:
"""
Tries to restart httpd using systemctl to generate the self signed keypair.
Tries to restart httpd using systemctl to generate the self signed key pair.
"""
try:
util.run_script(['systemctl', 'restart', 'httpd'])
@@ -57,7 +52,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
# Finish with actual config check to see if systemctl restart helped
super().config_test()
def _prepare_options(self):
def _prepare_options(self) -> None:
"""
Override the options dictionary initialization to keep using apachectl
instead of httpd and so take advantages of this new bash script in newer versions
@@ -73,19 +68,19 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
class FedoraParser(parser.ApacheParser):
"""Fedora 29+ specific ApacheParser override class"""
def __init__(self, *args, **kwargs):
def __init__(self, *args, **kwargs) -> None:
# Fedora 29+ specific configuration file for Apache
self.sysconfig_filep = "/etc/sysconfig/httpd"
super().__init__(*args, **kwargs)
def update_runtime_variables(self):
def update_runtime_variables(self) -> None:
""" Override for update_runtime_variables for custom parsing """
# Opportunistic, works if SELinux not enforced
super().update_runtime_variables()
self._parse_sysconfig_var()
def _parse_sysconfig_var(self):
def _parse_sysconfig_var(self) -> None:
""" Parses Apache CLI options from Fedora configuration file """
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
for k in defines:
self.variables[k] = defines[k]
for k, v in defines.items():
self.variables[k] = v

View File
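Both the CentOS and Fedora overrides above wrap the inherited config test in a retry: if the first run fails, restart the service once (so mod_ssl can generate its default self-signed key pair) and check again. A toy, self-contained sketch of that control flow, with fake callables standing in for apachectl/systemctl and a generic exception in place of MisconfigurationError:

from typing import Callable


def check_with_one_restart(config_test: Callable[[], None],
                           restart: Callable[[], None]) -> None:
    # Mirror of the override logic above: a single retry after one restart.
    try:
        config_test()
    except RuntimeError:
        restart()
        config_test()


# Toy demo: the first check fails, the "restart" fixes the precondition,
# and the second check passes.
state = {"restarted": False}


def fake_config_test() -> None:
    if not state["restarted"]:
        raise RuntimeError("missing self-signed key pair")


def fake_restart() -> None:
    state["restarted"] = True


check_with_one_restart(fake_config_test, fake_restart)
print("config test passed after restart:", state["restarted"])  # True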

@@ -1,14 +1,12 @@
""" Distribution specific override class for Gentoo Linux """
import zope.interface
from typing import Any
from certbot import interfaces
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
from certbot_apache._internal.configurator import OsOptions
@zope.interface.provider(interfaces.IPluginFactory)
class GentooConfigurator(configurator.ApacheConfigurator):
"""Gentoo specific ApacheConfigurator override class"""
@@ -20,7 +18,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
challenge_location="/etc/apache2/vhosts.d",
)
def _prepare_options(self):
def _prepare_options(self) -> None:
"""
Override the options dictionary initialization in order to support
alternative restart cmd used in Gentoo.
@@ -30,33 +28,32 @@ class GentooConfigurator(configurator.ApacheConfigurator):
raise ValueError("OS option restart_cmd_alt must be set for Gentoo.")
self.options.restart_cmd_alt[0] = self.options.ctl
def get_parser(self):
def get_parser(self) -> "GentooParser":
"""Initializes the ApacheParser"""
return GentooParser(
self.options.server_root, self.options.vhost_root,
self.version, configurator=self)
self.options.server_root, self, self.options.vhost_root, self.version)
class GentooParser(parser.ApacheParser):
"""Gentoo specific ApacheParser override class"""
def __init__(self, *args, **kwargs):
def __init__(self, *args: Any, **kwargs: Any) -> None:
# Gentoo specific configuration file for Apache2
self.apacheconfig_filep = "/etc/conf.d/apache2"
super().__init__(*args, **kwargs)
def update_runtime_variables(self):
def update_runtime_variables(self) -> None:
""" Override for update_runtime_variables for custom parsing """
self.parse_sysconfig_var()
self.update_modules()
def parse_sysconfig_var(self):
def parse_sysconfig_var(self) -> None:
""" Parses Apache CLI options from Gentoo configuration file """
defines = apache_util.parse_define_file(self.apacheconfig_filep,
"APACHE2_OPTS")
for k in defines:
self.variables[k] = defines[k]
for k, v in defines.items():
self.variables[k] = v
def update_modules(self):
def update_modules(self) -> None:
"""Get loaded modules from httpd process, and add them to DOM"""
mod_cmd = [self.configurator.options.ctl, "modules"]
matches = apache_util.parse_from_subprocess(mod_cmd, r"(.*)_module")

View File
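update_modules in the Gentoo override above feeds the output of the configured ctl "modules" command through a capture regex. A small runnable illustration of what r"(.*)_module" extracts from typical output; the sample text is invented, and the leading whitespace in the captures is stripped by the callers, as the parser code later in this diff does with mod.strip():

import re

sample_output = """\
Loaded Modules:
 core_module (static)
 ssl_module (shared)
 rewrite_module (shared)
"""

# Same capture pattern as the mod_cmd parsing above.
matches = re.findall(r"(.*)_module", sample_output)
print(matches)                        # [' core', ' ssl', ' rewrite']
print([m.strip() for m in matches])   # ['core', 'ssl', 'rewrite']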

@@ -1,12 +1,8 @@
""" Distribution specific override class for OpenSUSE """
import zope.interface
from certbot import interfaces
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
@zope.interface.provider(interfaces.IPluginFactory)
class OpenSUSEConfigurator(configurator.ApacheConfigurator):
"""OpenSUSE specific ApacheConfigurator override class"""

View File

@@ -0,0 +1,19 @@
""" Distribution specific override class for Void Linux """
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
class VoidConfigurator(configurator.ApacheConfigurator):
"""Void Linux specific ApacheConfigurator override class"""
OS_DEFAULTS = OsOptions(
server_root="/etc/apache",
vhost_root="/etc/apache/extra",
vhost_files="*.conf",
logs_root="/var/log/httpd",
ctl="apachectl",
version_cmd=['apachectl', '-v'],
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
challenge_location="/etc/apache/extra",
)

View File

@@ -3,19 +3,30 @@ import copy
import fnmatch
import logging
import re
from typing import Collection
from typing import Dict
from typing import Iterable
from typing import List
from typing import Mapping
from typing import Optional
from typing import Pattern
from typing import Set
from typing import TYPE_CHECKING
from typing import Tuple
from typing import Union
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import constants
if TYPE_CHECKING:
from certbot_apache._internal.configurator import ApacheConfigurator # pragma: no cover
try:
from augeas import Augeas
except ImportError: # pragma: no cover
Augeas = None # type: ignore
Augeas = None
logger = logging.getLogger(__name__)
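The new TYPE_CHECKING block above lets parser.py name ApacheConfigurator in annotations without importing configurator.py at runtime, which would otherwise risk a circular import. A generic, self-contained sketch of the idiom; heavy_module and HeavyClass are placeholders, not certbot names:

from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only evaluated by static type checkers, never at runtime, so it cannot
    # create an import cycle. The module name is a placeholder.
    from heavy_module import HeavyClass


def describe(obj: "HeavyClass") -> str:
    # The quoted annotation is resolved lazily, so this module imports and
    # runs even though heavy_module is never loaded; annotations are not
    # enforced at runtime.
    return type(obj).__name__


print(describe(object()))  # 'object'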
@@ -32,11 +43,11 @@ class ApacheParser:
default - user config file, name - NameVirtualHost,
"""
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars = {"*", "?", "\\", "[", "]"}
arg_var_interpreter: Pattern = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars: Set[str] = {"*", "?", "\\", "[", "]"}
def __init__(self, root, vhostroot=None, version=(2, 4),
configurator=None):
def __init__(self, root: str, configurator: "ApacheConfigurator",
vhostroot: str, version: Tuple[int, ...] = (2, 4)) -> None:
# Note: Order is important here.
# Needed for calling save() with reverter functionality that resides in
@@ -45,7 +56,7 @@ class ApacheParser:
self.configurator = configurator
# Initialize augeas
self.aug = init_augeas()
self.aug: Augeas = init_augeas()
if not self.check_aug_version():
raise errors.NotSupportedError(
@@ -58,8 +69,8 @@ class ApacheParser:
self.variables: Dict[str, str] = {}
# Find configuration root and make sure augeas can parse it.
self.root = os.path.abspath(root)
self.loc = {"root": self._find_config_root()}
self.root: str = os.path.abspath(root)
self.loc: Dict[str, str] = {"root": self._find_config_root()}
self.parse_file(self.loc["root"])
if version >= (2, 4):
@@ -88,7 +99,7 @@ class ApacheParser:
if self.find_dir("Define", exclude=False):
raise errors.PluginError("Error parsing runtime variables")
def check_parsing_errors(self, lens):
def check_parsing_errors(self, lens: str) -> None:
"""Verify Augeas can parse all of the lens files.
:param str lens: lens to check for errors
@@ -114,7 +125,7 @@ class ApacheParser:
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
def check_aug_version(self):
def check_aug_version(self) -> bool:
""" Checks that we have recent enough version of libaugeas.
If augeas version is recent enough, it will support case insensitive
regexp matching"""
@@ -129,7 +140,7 @@ class ApacheParser:
self.aug.remove("/test/path")
return matches
def unsaved_files(self):
def unsaved_files(self) -> Set[str]:
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
@@ -168,7 +179,7 @@ class ApacheParser:
save_files.add(self.aug.get(path)[6:])
return save_files
def ensure_augeas_state(self):
def ensure_augeas_state(self) -> None:
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
@@ -178,7 +189,7 @@ class ApacheParser:
self.configurator.save_notes += "(autosave)"
self.configurator.save()
def save(self, save_files):
def save(self, save_files: Iterable[str]) -> None:
"""Saves all changes to the configuration files.
save() is called from ApacheConfigurator to handle the parser specific
@@ -197,7 +208,7 @@ class ApacheParser:
self.aug.remove("/files/"+sf)
self.aug.load()
def _log_save_errors(self, ex_errs):
def _log_save_errors(self, ex_errs: List[str]) -> None:
"""Log errors due to bad Augeas save.
:param list ex_errs: Existing errors before save
@@ -211,7 +222,7 @@ class ApacheParser:
# Only new errors caused by recent save
if err not in ex_errs), self.configurator.save_notes)
def add_include(self, main_config, inc_path):
def add_include(self, main_config: str, inc_path: str) -> None:
"""Add Include for a new configuration file if one does not exist
:param str main_config: file path to main Apache config file
@@ -230,21 +241,21 @@ class ApacheParser:
new_file = os.path.basename(inc_path)
self.existing_paths.setdefault(new_dir, []).append(new_file)
def add_mod(self, mod_name):
def add_mod(self, mod_name: str) -> None:
"""Shortcut for updating parser modules."""
if mod_name + "_module" not in self.modules:
self.modules[mod_name + "_module"] = None
if "mod_" + mod_name + ".c" not in self.modules:
self.modules["mod_" + mod_name + ".c"] = None
def reset_modules(self):
def reset_modules(self) -> None:
"""Reset the loaded modules list. This is called from cleanup to clear
temporarily loaded modules."""
self.modules = {}
self.update_modules()
self.parse_modules()
def parse_modules(self):
def parse_modules(self) -> None:
"""Iterates on the configuration until no new modules are loaded.
..todo:: This should be attempted to be done with a binary to avoid
@@ -272,19 +283,18 @@ class ApacheParser:
match_name[6:])
self.modules.update(mods)
def update_runtime_variables(self):
def update_runtime_variables(self) -> None:
"""Update Includes, Defines and Includes from httpd config dump data"""
self.update_defines()
self.update_includes()
self.update_modules()
def update_defines(self):
def update_defines(self) -> None:
"""Updates the dictionary of known variables in the configuration"""
self.variables = apache_util.parse_defines(self.configurator.options.ctl)
def update_includes(self):
def update_includes(self) -> None:
"""Get includes from httpd process, and add them to DOM if needed"""
# Find_dir iterates over configuration for Include and IncludeOptional
@@ -298,28 +308,28 @@ class ApacheParser:
if not self.parsed_in_current(i):
self.parse_file(i)
def update_modules(self):
def update_modules(self) -> None:
"""Get loaded modules from httpd process, and add them to DOM"""
matches = apache_util.parse_modules(self.configurator.options.ctl)
for mod in matches:
self.add_mod(mod.strip())
def filter_args_num(self, matches, args):
def filter_args_num(self, matches: str, args: int) -> List[str]:
"""Filter out directives with specific number of arguments.
This function makes the assumption that all related arguments are given
in order. Thus /files/apache/directive[5]/arg[2] must come immediately
after /files/apache/directive[5]/arg[1]. Runs in 1 linear pass.
:param string matches: Matches of all directives with arg nodes
:param str matches: Matches of all directives with arg nodes
:param int args: Number of args you would like to filter
:returns: List of directives that contain # of arguments.
(arg is stripped off)
"""
filtered = []
filtered: List[str] = []
if args == 1:
for i, match in enumerate(matches):
if match.endswith("/arg"):
@@ -336,7 +346,7 @@ class ApacheParser:
return filtered
def add_dir_to_ifmodssl(self, aug_conf_path, directive, args):
def add_dir_to_ifmodssl(self, aug_conf_path: str, directive: str, args: List[str]) -> None:
"""Adds directive and value to IfMod ssl block.
Adds given directive and value along configuration path within
@@ -362,7 +372,7 @@ class ApacheParser:
for i, arg in enumerate(args):
self.aug.set("%s/arg[%d]" % (nvh_path, i + 1), arg)
def get_ifmod(self, aug_conf_path, mod, beginning=False):
def get_ifmod(self, aug_conf_path: str, mod: str, beginning: bool = False) -> str:
"""Returns the path to <IfMod mod> and creates one if it doesn't exist.
:param str aug_conf_path: Augeas configuration path
@@ -384,7 +394,7 @@ class ApacheParser:
# Strip off "arg" at end of first ifmod path
return if_mods[0].rpartition("arg")[0]
def create_ifmod(self, aug_conf_path, mod, beginning=False):
def create_ifmod(self, aug_conf_path: str, mod: str, beginning: bool = False) -> str:
"""Creates a new <IfMod mod> and returns its path.
:param str aug_conf_path: Augeas configuration path
@@ -411,7 +421,9 @@ class ApacheParser:
self.aug.set(c_path_arg, mod)
return retpath
def add_dir(self, aug_conf_path, directive, args):
def add_dir(
self, aug_conf_path: str, directive: Optional[str], args: Union[List[str], str]
) -> None:
"""Appends directive to the end fo the file given by aug_conf_path.
.. note:: Not added to AugeasConfigurator because it may depend
@@ -431,7 +443,8 @@ class ApacheParser:
else:
self.aug.set(aug_conf_path + "/directive[last()]/arg", args)
def add_dir_beginning(self, aug_conf_path, dirname, args):
def add_dir_beginning(self, aug_conf_path: str, dirname: str,
args: Union[List[str], str]) -> None:
"""Adds the directive to the beginning of defined aug_conf_path.
:param str aug_conf_path: Augeas configuration path to add directive
@@ -440,7 +453,11 @@ class ApacheParser:
:type args: list or str
"""
first_dir = aug_conf_path + "/directive[1]"
self.aug.insert(first_dir, "directive", True)
if self.aug.get(first_dir):
self.aug.insert(first_dir, "directive", True)
else:
self.aug.set(first_dir, "directive")
self.aug.set(first_dir, dirname)
if isinstance(args, list):
for i, value in enumerate(args, 1):
@@ -448,7 +465,7 @@ class ApacheParser:
else:
self.aug.set(first_dir + "/arg", args)
def add_comment(self, aug_conf_path, comment):
def add_comment(self, aug_conf_path: str, comment: str) -> None:
"""Adds the comment to the augeas path
:param str aug_conf_path: Augeas configuration path to add directive
@@ -457,7 +474,7 @@ class ApacheParser:
"""
self.aug.set(aug_conf_path + "/#comment[last() + 1]", comment)
def find_comments(self, arg, start=None):
def find_comments(self, arg: str, start: Optional[str] = None) -> List[str]:
"""Finds a comment with specified content from the provided DOM path
:param str arg: Comment content to search
@@ -479,7 +496,8 @@ class ApacheParser:
results.append(comment)
return results
def find_dir(self, directive, arg=None, start=None, exclude=True):
def find_dir(self, directive: str, arg: Optional[str] = None,
start: Optional[str] = None, exclude: bool = True) -> List[str]:
"""Finds directive in the configuration.
Recursively searches through config files to find directives
@@ -507,6 +525,8 @@ class ApacheParser:
:param bool exclude: Whether or not to exclude directives based on
variables and enabled modules
:rtype list
"""
# Cannot place member variable in the definition of the function so...
if not start:
@@ -555,7 +575,7 @@ class ApacheParser:
return ordered_matches
def get_all_args(self, match):
def get_all_args(self, match: str) -> List[Optional[str]]:
"""
Tries to fetch all arguments for a directive. See get_arg.
@@ -565,11 +585,11 @@ class ApacheParser:
"""
if match[-1] != "/":
match = match+"/"
match = match + "/"
allargs = self.aug.match(match + '*')
return [self.get_arg(arg) for arg in allargs]
def get_arg(self, match):
def get_arg(self, match: Optional[str]) -> Optional[str]:
"""Uses augeas.get to get argument value and interprets result.
This also converts all variables and parameters appropriately.
@@ -584,6 +604,7 @@ class ApacheParser:
# e.g. strip now, not later
if not value:
return None
value = value.strip("'\"")
variables = ApacheParser.arg_var_interpreter.findall(value)
@@ -597,13 +618,13 @@ class ApacheParser:
return value
def get_root_augpath(self):
def get_root_augpath(self) -> str:
"""
Returns the Augeas path of root configuration.
"""
return get_aug_path(self.loc["root"])
def exclude_dirs(self, matches):
def exclude_dirs(self, matches: Iterable[str]) -> List[str]:
"""Exclude directives that are not loaded into the configuration."""
filters = [("ifmodule", self.modules.keys()), ("ifdefine", self.variables)]
@@ -617,7 +638,7 @@ class ApacheParser:
valid_matches.append(match)
return valid_matches
def _pass_filter(self, match, filter_):
def _pass_filter(self, match: str, filter_: Tuple[str, Collection[str]]) -> bool:
"""Determine if directive passes a filter.
:param str match: Augeas path
@@ -646,7 +667,7 @@ class ApacheParser:
return True
def standard_path_from_server_root(self, arg):
def standard_path_from_server_root(self, arg: str) -> str:
"""Ensure paths are consistent and absolute
:param str arg: Argument of directive
@@ -665,7 +686,7 @@ class ApacheParser:
arg = os.path.normpath(arg)
return arg
def _get_include_path(self, arg):
def _get_include_path(self, arg: Optional[str]) -> Optional[str]:
"""Converts an Apache Include directive into Augeas path.
Converts an Apache Include directive argument into an Augeas
@@ -685,6 +706,8 @@ class ApacheParser:
# if matchObj.group() != arg:
# logger.error("Error: Invalid regexp characters in %s", arg)
# return []
if arg is None:
return None # pragma: no cover
arg = self.standard_path_from_server_root(arg)
# Attempts to add a transform to the file if one does not already exist
@@ -709,7 +732,7 @@ class ApacheParser:
return get_aug_path(arg)
def fnmatch_to_re(self, clean_fn_match):
def fnmatch_to_re(self, clean_fn_match: str) -> str:
"""Method converts Apache's basic fnmatch to regular expression.
Assumption - Configs are assumed to be well-formed and only writable by
@@ -726,7 +749,7 @@ class ApacheParser:
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
return fnmatch.translate(clean_fn_match)[4:-3] # pragma: no cover
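The [4:-3] slice in the return above strips the (?s:...)\Z wrapper that fnmatch.translate has produced since Python 3.6 (as the comment in the hunk notes), leaving only the pattern body. A quick check of that behaviour, assuming the interpreter range this package declares (3.6-3.10):

import fnmatch
import re

raw = fnmatch.translate("*.conf")   # "(?s:.*\\.conf)\\Z" on Python 3.6-3.10
body = raw[4:-3]                    # ".*\\.conf"

print(raw, "->", body)
print(bool(re.fullmatch(body, "ssl.conf")))  # True
print(bool(re.fullmatch(body, "ssl.load")))  # False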
def parse_file(self, filepath):
def parse_file(self, filepath: str) -> None:
"""Parse file with Augeas
Checks to see if file_path is parsed by Augeas
@@ -753,7 +776,7 @@ class ApacheParser:
self._add_httpd_transform(filepath)
self.aug.load()
def parsed_in_current(self, filep):
def parsed_in_current(self, filep: Optional[str]) -> bool:
"""Checks if the file path is parsed by current Augeas parser config
ie. returns True if the file is found on a path that's found in live
Augeas configuration.
@@ -763,9 +786,11 @@ class ApacheParser:
:returns: True if file is parsed in existing configuration tree
:rtype: bool
"""
if not filep:
return False # pragma: no cover
return self._parsed_by_parser_paths(filep, self.parser_paths)
def parsed_in_original(self, filep):
def parsed_in_original(self, filep: Optional[str]) -> bool:
"""Checks if the file path is parsed by existing Apache config.
ie. returns True if the file is found on a path that matches Include or
IncludeOptional statement in the Apache configuration.
@@ -775,9 +800,11 @@ class ApacheParser:
:returns: True if file is parsed in existing configuration tree
:rtype: bool
"""
if not filep:
return False # pragma: no cover
return self._parsed_by_parser_paths(filep, self.existing_paths)
def _parsed_by_parser_paths(self, filep, paths):
def _parsed_by_parser_paths(self, filep: str, paths: Mapping[str, List[str]]) -> bool:
"""Helper function that searches through provided paths and returns
True if file path is found in the set"""
for directory in paths:
@@ -786,7 +813,7 @@ class ApacheParser:
return True
return False
def _check_path_actions(self, filepath):
def _check_path_actions(self, filepath: str) -> Tuple[bool, bool]:
"""Determine actions to take with a new augeas path
This helper function will return a tuple that defines
@@ -811,7 +838,7 @@ class ApacheParser:
remove_old = False
return use_new, remove_old
def _remove_httpd_transform(self, filepath):
def _remove_httpd_transform(self, filepath: str) -> None:
"""Remove path from Augeas transform
:param str filepath: filepath to remove
@@ -826,7 +853,7 @@ class ApacheParser:
self.aug.remove(remove_inc[0])
self.parser_paths.pop(remove_dirname)
def _add_httpd_transform(self, incl):
def _add_httpd_transform(self, incl: str) -> None:
"""Add a transform to Augeas.
This function will correctly add a transform to augeas
@@ -836,7 +863,7 @@ class ApacheParser:
:param str incl: filepath to include for transform
"""
last_include = self.aug.match("/augeas/load/Httpd/incl [last()]")
last_include: str = self.aug.match("/augeas/load/Httpd/incl [last()]")
if last_include:
# Insert a new node immediately after the last incl
self.aug.insert(last_include[0], "incl", False)
@@ -854,7 +881,7 @@ class ApacheParser:
self.parser_paths[os.path.dirname(incl)] = [
os.path.basename(incl)]
def standardize_excl(self):
def standardize_excl(self) -> None:
"""Standardize the excl arguments for the Httpd lens in Augeas.
Note: Hack!
@@ -886,16 +913,16 @@ class ApacheParser:
self.aug.load()
def _set_locations(self):
def _set_locations(self) -> Dict[str, str]:
"""Set default location for directives.
Locations are given as file_paths
.. todo:: Make sure that files are included
"""
default = self.loc["root"]
default: str = self.loc["root"]
temp = os.path.join(self.root, "ports.conf")
temp: str = os.path.join(self.root, "ports.conf")
if os.path.isfile(temp):
listen = temp
name = temp
@@ -905,7 +932,7 @@ class ApacheParser:
return {"default": default, "listen": listen, "name": name}
def _find_config_root(self):
def _find_config_root(self) -> str:
"""Find the Apache Configuration Root file."""
location = ["apache2.conf", "httpd.conf", "conf/httpd.conf"]
for name in location:
@@ -914,7 +941,7 @@ class ApacheParser:
raise errors.NoInstallationError("Could not find configuration root")
def case_i(string):
def case_i(string: str) -> str:
"""Returns case insensitive regex.
Returns a sloppy, but necessary version of a case insensitive regex.
@@ -930,7 +957,7 @@ def case_i(string):
if c.isalpha() else c for c in re.escape(string))
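case_i builds a per-letter character class rather than relying on re.IGNORECASE. The start of its body is clipped in the hunk above, so the sketch below reconstructs it from the visible fragment; treat the exact join expression as illustrative:

import re


def case_i(string: str) -> str:
    # One [Xx] class per alphabetic character; everything else is escaped
    # and passed through unchanged.
    return "".join(
        "[" + c.upper() + c.lower() + "]" if c.isalpha() else c
        for c in re.escape(string)
    )


print(case_i("Listen"))                                 # [Ll][Ii][Ss][Tt][Ee][Nn]
print(bool(re.search(case_i("Listen"), "LISTEN 443")))  # True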
def get_aug_path(file_path):
def get_aug_path(file_path: str) -> str:
"""Return augeas path for full filepath.
:param str file_path: Full filepath

View File

@@ -1,7 +1,12 @@
"""ParserNode utils"""
from typing import Dict
from typing import Any
from typing import List
from typing import Optional
from typing import Tuple
def validate_kwargs(kwargs, required_names):
def validate_kwargs(kwargs: Dict[str, Any], required_names: List[str]) -> Dict[str, Any]:
"""
Ensures that the kwargs dict has all the expected values. This function modifies
the kwargs dictionary, and hence the returned dictionary should be used instead
@@ -11,7 +16,7 @@ def validate_kwargs(kwargs, required_names):
:param list required_names: List of required parameter names.
"""
validated_kwargs = {}
validated_kwargs: Dict[str, Any] = {}
for name in required_names:
try:
validated_kwargs[name] = kwargs.pop(name)
@@ -25,7 +30,7 @@ def validate_kwargs(kwargs, required_names):
return validated_kwargs
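validate_kwargs above pops each required name out of the kwargs dict, which is why callers must use the returned dict rather than the one they passed in. A self-contained sketch of roughly that contract; the exception type, messages, and the handling of leftover keys are illustrative assumptions, not necessarily certbot's exact behaviour:

from typing import Any, Dict, List


def validate_kwargs(kwargs: Dict[str, Any], required_names: List[str]) -> Dict[str, Any]:
    validated: Dict[str, Any] = {}
    for name in required_names:
        try:
            # pop() mutates the caller's dict, hence "use the returned dict".
            validated[name] = kwargs.pop(name)
        except KeyError:
            raise TypeError(f"missing required keyword argument: {name}")
    if kwargs:
        raise TypeError(f"unknown keyword argument(s): {sorted(kwargs)}")
    return validated


print(validate_kwargs({"ancestor": None, "dirty": False}, ["ancestor", "dirty"]))
# {'ancestor': None, 'dirty': False}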
def parsernode_kwargs(kwargs):
def parsernode_kwargs(kwargs: Dict[str, Any]) -> Tuple[Any, ...]:
"""
Validates keyword arguments for ParserNode. This function modifies the kwargs
dictionary, and hence the returned dictionary should be used instead in the
@@ -55,7 +60,7 @@ def parsernode_kwargs(kwargs):
return kwargs["ancestor"], kwargs["dirty"], kwargs["filepath"], kwargs["metadata"]
def commentnode_kwargs(kwargs):
def commentnode_kwargs(kwargs: Dict[str, Any]) -> Tuple[Optional[str], Dict[str, str]]:
"""
Validates keyword arguments for CommentNode and sets the default values for
optional kwargs. This function modifies the kwargs dictionary, and hence the
@@ -90,7 +95,7 @@ def commentnode_kwargs(kwargs):
return comment, kwargs
def directivenode_kwargs(kwargs):
def directivenode_kwargs(kwargs: Dict[str, Any]) -> Tuple[Any, Any, Any, Dict]:
"""
Validates keyword arguments for DirectiveNode and BlockNode and sets the
default values for optional kwargs. This function modifies the kwargs

View File

@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup
version = '1.17.0.dev0'
version = '1.23.0.dev0'
install_requires = [
# We specify the minimum acme and certbot version as the current plugin
@@ -11,8 +11,6 @@ install_requires = [
f'certbot>={version}',
'python-augeas',
'setuptools>=39.0.1',
'zope.component',
'zope.interface',
]
dev_extras = [
@@ -40,6 +38,7 @@ setup(
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -25,24 +25,29 @@ def _get_augeasnode_mock(filepath):
metadata=metadata)
return augeasnode_mock
class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-methods
"""Test AugeasParserNode using available test configurations"""
def setUp(self): # pylint: disable=arguments-differ
super().setUp()
with mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.get_parsernode_root") as mock_parsernode:
with mock.patch(
"certbot_apache._internal.configurator.ApacheConfigurator.get_parsernode_root"
) as mock_parsernode:
mock_parsernode.side_effect = _get_augeasnode_mock(
os.path.join(self.config_path, "apache2.conf"))
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir, use_parsernode=True)
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
use_parsernode=True,
)
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
def test_save(self):
with mock.patch('certbot_apache._internal.parser.ApacheParser.save') as mock_save:
self.config.parser_root.save("A save message")
self.assertTrue(mock_save.called)
self.assertIs(mock_save.called, True)
self.assertEqual(mock_save.call_args[0][0], "A save message")
def test_unsaved_files(self):
@@ -67,7 +72,8 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
"/Anything": "Anything",
}
for test in testcases:
self.assertEqual(block._aug_get_name(test), testcases[test]) # pylint: disable=protected-access
# pylint: disable=protected-access
self.assertEqual(block._aug_get_name(test), testcases[test])
def test_find_blocks(self):
blocks = self.config.parser_root.find_blocks("VirtualHost", exclude=False)
@@ -81,7 +87,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
def test_find_directive_found(self):
directives = self.config.parser_root.find_directives("Listen")
self.assertEqual(len(directives), 1)
self.assertTrue(directives[0].filepath.endswith("/apache2/ports.conf"))
self.assertIs(directives[0].filepath.endswith("/apache2/ports.conf"), True)
self.assertEqual(directives[0].parameters, (u'80',))
def test_find_directive_notfound(self):
@@ -96,29 +102,29 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
servername = vh.find_directives("servername")
self.assertEqual(servername[0].parameters[0], "certbot.demo")
found = True
self.assertTrue(found)
self.assertIs(found, True)
def test_find_comments(self):
rootcomment = self.config.parser_root.find_comments(
"This is the main Apache server configuration file. "
)
self.assertEqual(len(rootcomment), 1)
self.assertTrue(rootcomment[0].filepath.endswith(
self.assertIs(rootcomment[0].filepath.endswith(
"debian_apache_2_4/multiple_vhosts/apache2/apache2.conf"
))
), True)
def test_set_parameters(self):
servernames = self.config.parser_root.find_directives("servername")
names: List[str] = []
for servername in servernames:
names += servername.parameters
self.assertFalse("going_to_set_this" in names)
self.assertNotIn("going_to_set_this", names)
servernames[0].set_parameters(["something", "going_to_set_this"])
servernames = self.config.parser_root.find_directives("servername")
names = []
for servername in servernames:
names += servername.parameters
self.assertTrue("going_to_set_this" in names)
self.assertIn("going_to_set_this", names)
def test_set_parameters_atinit(self):
from certbot_apache._internal.augeasparser import AugeasDirectiveNode
@@ -131,7 +137,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
ancestor=assertions.PASS,
metadata=servernames[0].metadata
)
self.assertTrue(mock_set.called)
self.assertIs(mock_set.called, True)
self.assertEqual(
mock_set.call_args_list[0][0][0],
["test", "setting", "these"]
@@ -151,7 +157,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertEqual(len(servername.parameters), 3)
servername.set_parameters(["thisshouldnotexistpreviously"])
found = True
self.assertTrue(found)
self.assertIs(found, True)
# Verify params
servernames = self.config.parser_root.find_directives("servername")
@@ -161,7 +167,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertEqual(len(servername.parameters), 1)
servername.set_parameters(["thisshouldnotexistpreviously"])
found = True
self.assertTrue(found)
self.assertIs(found, True)
def test_add_child_comment(self):
newc = self.config.parser_root.add_child_comment("The content")
@@ -201,7 +207,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
rpath,
self.config.parser_root.metadata["augeaspath"]
)
self.assertTrue(directive.startswith("NewBlock"))
self.assertIs(directive.startswith("NewBlock"), True)
def test_add_child_block_beginning(self):
self.config.parser_root.add_child_block(
@@ -212,7 +218,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
root_path = self.config.parser_root.metadata["augeaspath"]
# Get first child
first = parser.aug.match("{}/*[1]".format(root_path))
self.assertTrue(first[0].endswith("Beginning"))
self.assertIs(first[0].endswith("Beginning"), True)
def test_add_child_block_append(self):
self.config.parser_root.add_child_block(
@@ -222,7 +228,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
root_path = self.config.parser_root.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLast"))
self.assertIs(last[0].endswith("VeryLast"), True)
def test_add_child_block_append_alt(self):
self.config.parser_root.add_child_block(
@@ -233,7 +239,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
root_path = self.config.parser_root.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLastAlt"))
self.assertIs(last[0].endswith("VeryLastAlt"), True)
def test_add_child_block_middle(self):
self.config.parser_root.add_child_block(
@@ -244,7 +250,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
root_path = self.config.parser_root.metadata["augeaspath"]
# Augeas indices start at 1 :(
middle = parser.aug.match("{}/*[6]".format(root_path))
self.assertTrue(middle[0].endswith("Middle"))
self.assertIs(middle[0].endswith("Middle"), True)
def test_add_child_block_existing_name(self):
parser = self.config.parser_root.parser
@@ -257,7 +263,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
)
new_block = parser.aug.match("{}/VirtualHost[2]".format(root_path))
self.assertEqual(len(new_block), 1)
self.assertTrue(vh.metadata["augeaspath"].endswith("VirtualHost[2]"))
self.assertIs(vh.metadata["augeaspath"].endswith("VirtualHost[2]"), True)
def test_node_init_error_bad_augeaspath(self):
from certbot_apache._internal.augeasparser import AugeasBlockNode
@@ -302,7 +308,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertEqual(len(dirs), 1)
self.assertEqual(dirs[0].parameters, ("with", "parameters"))
# The new directive was added to the very first line of the config
self.assertTrue(dirs[0].metadata["augeaspath"].endswith("[1]"))
self.assertIs(dirs[0].metadata["augeaspath"].endswith("[1]"), True)
def test_add_child_directive_exception(self):
self.assertRaises(
@@ -328,8 +334,8 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
ancs = vh.find_ancestors("Macro")
self.assertEqual(len(ancs), 0)
nonmacro_test = True
self.assertTrue(macro_test)
self.assertTrue(nonmacro_test)
self.assertIs(macro_test, True)
self.assertIs(nonmacro_test, True)
def test_find_ancestors_bad_path(self):
self.config.parser_root.metadata["augeaspath"] = ""

View File
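The assertion changes throughout this test module replace assertTrue(x)/assertFalse(x) with assertIs(x, True)/assertIs(x, False) and the membership helpers, so a merely truthy return value can no longer satisfy a test that expects the boolean itself. A toy test illustrating the difference:

import unittest


class AssertionStrictnessDemo(unittest.TestCase):
    def test_truthy_is_not_the_true_singleton(self):
        result = ["non-empty"]            # truthy, but not the object True
        self.assertTrue(result)           # passes: only checks truthiness
        with self.assertRaises(AssertionError):
            self.assertIs(result, True)   # fails: identity with True required


if __name__ == "__main__":
    unittest.main()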

@@ -47,7 +47,7 @@ class AutoHSTSTest(util.ApacheTest):
self.config.parser.modules.pop("headers_module", None)
self.config.parser.modules.pop("mod_header.c", None)
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
self.assertTrue(mock_enable.called)
self.assertIs(mock_enable.called, True)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
def test_autohsts_deploy_already_exists(self, _restart):
@@ -74,7 +74,7 @@ class AutoHSTSTest(util.ApacheTest):
# Verify increased value
self.assertEqual(self.get_autohsts_value(self.vh_truth[7].path),
inc_val)
self.assertTrue(mock_prepare.called)
self.assertIs(mock_prepare.called, True)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator._autohsts_increase")
@@ -88,7 +88,7 @@ class AutoHSTSTest(util.ApacheTest):
self.config.update_autohsts(mock.MagicMock())
# Freq not patched, so value shouldn't increase
self.assertFalse(mock_increase.called)
self.assertIs(mock_increase.called, False)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@@ -135,13 +135,13 @@ class AutoHSTSTest(util.ApacheTest):
# Time mock is used to make sure that the execution does not
# continue when no autohsts entries exist in pluginstorage
self.config.update_autohsts(mock.MagicMock())
self.assertFalse(mock_time.called)
self.assertIs(mock_time.called, False)
def test_autohsts_make_permanent_noop(self):
self.config.storage.put = mock.MagicMock()
self.config.deploy_autohsts(mock.MagicMock())
# Make sure that the execution does not continue when no entries in store
self.assertFalse(self.config.storage.put.called)
self.assertIs(self.config.storage.put.called, False)
@mock.patch("certbot_apache._internal.display_ops.select_vhost")
def test_autohsts_no_ssl_vhost(self, mock_select):
@@ -150,15 +150,13 @@ class AutoHSTSTest(util.ApacheTest):
self.assertRaises(errors.PluginError,
self.config.enable_autohsts,
mock.MagicMock(), "invalid.example.com")
self.assertTrue(
"Certbot was not able to find SSL" in mock_log.call_args[0][0])
self.assertIn("Certbot was not able to find SSL", mock_log.call_args[0][0])
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.add_vhost_id")
def test_autohsts_dont_enhance_twice(self, mock_id, _restart):
mock_id.return_value = "1234567"
self.config.enable_autohsts(mock.MagicMock(),
["ocspvhost.com", "ocspvhost.com"])
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com", "ocspvhost.com"])
self.assertEqual(mock_id.call_count, 1)
def test_autohsts_remove_orphaned(self):
@@ -168,7 +166,7 @@ class AutoHSTSTest(util.ApacheTest):
self.config._autohsts_save_state()
self.config.update_autohsts(mock.MagicMock())
self.assertFalse("orphan_id" in self.config._autohsts)
self.assertNotIn("orphan_id", self.config._autohsts)
# Make sure it's removed from the pluginstorage file as well
self.config._autohsts = None
self.config._autohsts_fetch_state()
@@ -181,9 +179,8 @@ class AutoHSTSTest(util.ApacheTest):
self.config._autohsts_save_state()
with mock.patch("certbot_apache._internal.configurator.logger.error") as mock_log:
self.config.deploy_autohsts(mock.MagicMock())
self.assertTrue(mock_log.called)
self.assertTrue(
"VirtualHost with id orphan_id was not" in mock_log.call_args[0][0])
self.assertIs(mock_log.called, True)
self.assertIn("VirtualHost with id orphan_id was not", mock_log.call_args[0][0])
if __name__ == "__main__":

View File
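This module's switch from assertTrue("x" in y) to assertIn("x", y) checks the same condition but produces a far more useful failure message. A minimal runnable comparison:

import unittest


class MembershipMessageDemo(unittest.TestCase):
    def test_failure_messages(self):
        log_line = "Certbot could not find SSL vhost"
        # Both assertions verify the same thing, but on failure assertIn
        # reports the needle and the haystack, while assertTrue only says
        # "False is not true".
        self.assertTrue("SSL" in log_line)
        self.assertIn("SSL", log_line)


if __name__ == "__main__":
    unittest.main()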

@@ -48,8 +48,7 @@ class CentOS6Tests(util.ApacheTest):
self.temp_dir, "centos6_apache/apache")
def test_get_parser(self):
self.assertTrue(isinstance(self.config.parser,
override_centos.CentOSParser))
self.assertIsInstance(self.config.parser, override_centos.CentOSParser)
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
@@ -72,9 +71,9 @@ class CentOS6Tests(util.ApacheTest):
"LoadModule", "ssl_module", exclude=False)
self.assertEqual(len(ssl_loadmods), 1)
# Make sure the LoadModule ssl_module is in ssl.conf (default)
self.assertTrue("ssl.conf" in ssl_loadmods[0])
self.assertIn("ssl.conf", ssl_loadmods[0])
# ...and that it's not inside of <IfModule>
self.assertFalse("IfModule" in ssl_loadmods[0])
self.assertNotIn("IfModule", ssl_loadmods[0])
# Get the example vhost
self.config.assoc["test.example.com"] = self.vh_truth[0]
@@ -95,7 +94,7 @@ class CentOS6Tests(util.ApacheTest):
# ...and both of them should be wrapped in <IfModule !mod_ssl.c>
# lm[:-17] strips off /directive/arg[1] from the path.
ifmod_args = self.config.parser.get_all_args(lm[:-17])
self.assertTrue("!mod_ssl.c" in ifmod_args)
self.assertIn("!mod_ssl.c", ifmod_args)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_multiple(self, unused_mock_notify):
@@ -107,7 +106,7 @@ class CentOS6Tests(util.ApacheTest):
pre_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
# LoadModules are not within IfModule blocks
self.assertFalse(any("ifmodule" in m.lower() for m in pre_loadmods))
self.assertIs(any("ifmodule" in m.lower() for m in pre_loadmods), False)
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
@@ -116,7 +115,9 @@ class CentOS6Tests(util.ApacheTest):
"LoadModule", "ssl_module", exclude=False)
for mod in post_loadmods:
self.assertTrue(self.config.parser.not_modssl_ifmodule(mod)) #pylint: disable=no-member
with self.subTest(mod=mod):
# pylint: disable=no-member
self.assertIs(self.config.parser.not_modssl_ifmodule(mod), True)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_rootconf_exists(self, unused_mock_notify):
@@ -207,20 +208,20 @@ class CentOS6Tests(util.ApacheTest):
post_loadmods = self.config.parser.find_dir("LoadModule",
"ssl_module",
exclude=False)
self.assertFalse(post_loadmods)
self.assertEqual(post_loadmods, [])
def test_no_ifmod_search_false(self):
#pylint: disable=no-member
self.assertFalse(self.config.parser.not_modssl_ifmodule(
self.assertIs(self.config.parser.not_modssl_ifmodule(
"/path/does/not/include/ifmod"
))
self.assertFalse(self.config.parser.not_modssl_ifmodule(
), False)
self.assertIs(self.config.parser.not_modssl_ifmodule(
""
))
self.assertFalse(self.config.parser.not_modssl_ifmodule(
), False)
self.assertIs(self.config.parser.not_modssl_ifmodule(
"/path/includes/IfModule/but/no/arguments"
))
), False)
if __name__ == "__main__":

View File
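The loop in test_loadmod_multiple above now wraps each iteration in self.subTest, so every failing module path is reported individually instead of the loop aborting at the first failure. A minimal runnable illustration of the pattern with invented paths:

import unittest


class SubTestDemo(unittest.TestCase):
    def test_each_path_reported_separately(self):
        paths = ["ifmodule[1]/loadmodule", "ifmodule[2]/loadmodule", "loadmodule"]
        for path in paths:
            with self.subTest(path=path):
                # A failure here is recorded for this path and the loop
                # continues with the remaining ones.
                self.assertIn("loadmodule", path)


if __name__ == "__main__":
    unittest.main()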

@@ -34,6 +34,7 @@ def get_vh_truth(temp_dir, config_name):
]
return vh_truth
class FedoraRestartTest(util.ApacheTest):
"""Tests for Fedora specific self-signed certificate override"""
@@ -140,8 +141,8 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
self.assertEqual(mock_get.call_count, 3)
self.assertEqual(len(self.config.parser.modules), 4)
self.assertEqual(len(self.config.parser.variables), 2)
self.assertTrue("TEST2" in self.config.parser.variables)
self.assertTrue("mod_another.c" in self.config.parser.modules)
self.assertIn("TEST2", self.config.parser.variables)
self.assertIn("mod_another.c", self.config.parser.modules)
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
@@ -172,11 +173,11 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
mock_osi.return_value = ("centos", "7")
self.config.parser.update_runtime_variables()
self.assertTrue("mock_define" in self.config.parser.variables)
self.assertTrue("mock_define_too" in self.config.parser.variables)
self.assertTrue("mock_value" in self.config.parser.variables)
self.assertIn("mock_define", self.config.parser.variables)
self.assertIn("mock_define_too", self.config.parser.variables)
self.assertIn("mock_value", self.config.parser.variables)
self.assertEqual("TRUE", self.config.parser.variables["mock_value"])
self.assertTrue("MOCK_NOSEP" in self.config.parser.variables)
self.assertIn("MOCK_NOSEP", self.config.parser.variables)
self.assertEqual("NOSEP_VAL", self.config.parser.variables["NOSEP_TWO"])
@mock.patch("certbot_apache._internal.configurator.util.run_script")

View File

@@ -11,8 +11,7 @@ class ComplexParserTest(util.ParserTest):
"""Apache Parser Test."""
def setUp(self): # pylint: disable=arguments-differ
super().setUp(
"complex_parsing", "complex_parsing")
super().setUp("complex_parsing", "complex_parsing")
self.setup_variables()
# This needs to happen after due to setup_variables not being run
@@ -78,12 +77,12 @@ class ComplexParserTest(util.ParserTest):
def test_load_modules(self):
"""If only first is found, there is bad variable parsing."""
self.assertTrue("status_module" in self.parser.modules)
self.assertTrue("mod_status.c" in self.parser.modules)
self.assertIn("status_module", self.parser.modules)
self.assertIn("mod_status.c", self.parser.modules)
# This is in an IfDefine
self.assertTrue("ssl_module" in self.parser.modules)
self.assertTrue("mod_ssl.c" in self.parser.modules)
self.assertIn("ssl_module", self.parser.modules)
self.assertIn("mod_ssl.c", self.parser.modules)
def verify_fnmatch(self, arg, hit=True):
"""Test if Include was correctly parsed."""

View File

@@ -14,15 +14,13 @@ import util
class ConfiguratorReverterTest(util.ApacheTest):
"""Test for ApacheConfigurator reverter methods"""
def setUp(self): # pylint: disable=arguments-differ
super().setUp()
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
self.vh_truth = util.get_vh_truth(self.temp_dir, "debian_apache_2_4/multiple_vhosts")
def tearDown(self):
shutil.rmtree(self.config_dir)
@@ -30,17 +28,13 @@ class ConfiguratorReverterTest(util.ApacheTest):
shutil.rmtree(self.temp_dir)
def test_bad_save_checkpoint(self):
self.config.reverter.add_to_checkpoint = mock.Mock(
side_effect=errors.ReverterError)
self.config.parser.add_dir(
self.vh_truth[0].path, "Test", "bad_save_ckpt")
self.config.reverter.add_to_checkpoint = mock.Mock(side_effect=errors.ReverterError)
self.config.parser.add_dir(self.vh_truth[0].path, "Test", "bad_save_ckpt")
self.assertRaises(errors.PluginError, self.config.save)
def test_bad_save_finalize_checkpoint(self):
self.config.reverter.finalize_checkpoint = mock.Mock(
side_effect=errors.ReverterError)
self.config.parser.add_dir(
self.vh_truth[0].path, "Test", "bad_save_ckpt")
self.config.reverter.finalize_checkpoint = mock.Mock(side_effect=errors.ReverterError)
self.config.parser.add_dir(self.vh_truth[0].path, "Test", "bad_save_ckpt")
self.assertRaises(errors.PluginError, self.config.save, "Title")
def test_finalize_save(self):
@@ -72,8 +66,7 @@ class ConfiguratorReverterTest(util.ApacheTest):
self.assertEqual(mock_load.call_count, 1)
def test_rollback_error(self):
self.config.reverter.rollback_checkpoints = mock.Mock(
side_effect=errors.ReverterError)
self.config.reverter.rollback_checkpoints = mock.Mock(side_effect=errors.ReverterError)
self.assertRaises(errors.PluginError, self.config.rollback_checkpoints)
def test_recovery_routine_reload(self):

View File

@@ -83,8 +83,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.prepare()
except errors.PluginError as err:
err_msg = str(err)
self.assertTrue("lock" in err_msg)
self.assertTrue(self.config.conf("server-root") in err_msg)
self.assertIn("lock", err_msg)
self.assertIn(self.config.conf("server-root"), err_msg)
else: # pragma: no cover
self.fail("Exception wasn't raised!")
@@ -116,7 +116,8 @@ class MultipleVhostsTest(util.ApacheTest):
# Make sure that all (and only) the expected values exist
self.assertEqual(len(mock_add.call_args_list), len(found))
for e in exp:
self.assertTrue(e in found)
with self.subTest(e=e):
self.assertIn(e, found)
del os.environ["CERTBOT_DOCS"]
@@ -130,13 +131,12 @@ class MultipleVhostsTest(util.ApacheTest):
from certbot_apache._internal.configurator import ApacheConfigurator
parameters = set(ApacheConfigurator.OS_DEFAULTS.__dict__.keys())
for cls in OVERRIDE_CLASSES.values():
self.assertTrue(parameters.issubset(set(cls.OS_DEFAULTS.__dict__.keys())))
self.assertIs(parameters.issubset(set(cls.OS_DEFAULTS.__dict__.keys())), True)
def test_constant(self):
self.assertTrue("debian_apache_2_4/multiple_vhosts/apache" in
self.config.options.server_root)
self.assertIn("debian_apache_2_4/multiple_vhosts/apache", self.config.options.server_root)
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_get_all_names(self, mock_getutility):
mock_utility = mock_getutility()
mock_utility.notification = mock.MagicMock(return_value=True)
@@ -145,7 +145,7 @@ class MultipleVhostsTest(util.ApacheTest):
"nonsym.link", "vhost.in.rootconf", "www.certbot.demo",
"duplicate.example.com"})
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
@mock.patch("certbot_apache._internal.configurator.socket.gethostbyaddr")
def test_get_all_names_addrs(self, mock_gethost, mock_getutility):
mock_gethost.side_effect = [("google.com", "", ""), socket.error]
@@ -162,9 +162,9 @@ class MultipleVhostsTest(util.ApacheTest):
names = self.config.get_all_names()
self.assertEqual(len(names), 9)
self.assertTrue("zombo.com" in names)
self.assertTrue("google.com" in names)
self.assertTrue("certbot.demo" in names)
self.assertIn("zombo.com", names)
self.assertIn("google.com", names)
self.assertIn("certbot.demo", names)
def test_get_bad_path(self):
self.assertEqual(apache_util.get_file_path(None), None)
@@ -188,16 +188,14 @@ class MultipleVhostsTest(util.ApacheTest):
True, False)
# pylint: disable=protected-access
self.config._add_servernames(ssl_vh1)
self.assertTrue(
self.config._add_servername_alias("oy_vey", ssl_vh1) is None)
self.assertIsNone(self.config._add_servername_alias("oy_vey", ssl_vh1))
def test_add_servernames_alias(self):
self.config.parser.add_dir(
self.vh_truth[2].path, "ServerAlias", ["*.le.co"])
# pylint: disable=protected-access
self.config._add_servernames(self.vh_truth[2])
self.assertEqual(
self.vh_truth[2].get_names(), {"*.le.co", "ip-172-30-0-17"})
self.assertEqual(self.vh_truth[2].get_names(), {"*.le.co", "ip-172-30-0-17"})
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
@@ -246,8 +244,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.vh_truth[0].get_names(), chosen_vhost.get_names())
# Make sure we go from HTTP -> HTTPS
self.assertFalse(self.vh_truth[0].ssl)
self.assertTrue(chosen_vhost.ssl)
self.assertIs(self.vh_truth[0].ssl, False)
self.assertIs(chosen_vhost.ssl, True)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator._find_best_vhost")
@mock.patch("certbot_apache._internal.parser.ApacheParser.add_dir")
@@ -256,7 +254,7 @@ class MultipleVhostsTest(util.ApacheTest):
ret_vh.enabled = False
mock_find.return_value = self.vh_truth[8]
self.config.choose_vhost("whatever.com")
self.assertTrue(mock_add.called)
self.assertIs(mock_add.called, True)
@mock.patch("certbot_apache._internal.display_ops.select_vhost")
def test_choose_vhost_select_vhost_with_temp(self, mock_select):
@@ -291,23 +289,17 @@ class MultipleVhostsTest(util.ApacheTest):
def test_findbest_continues_on_short_domain(self):
# pylint: disable=protected-access
chosen_vhost = self.config._find_best_vhost("purple.com")
self.assertEqual(None, chosen_vhost)
self.assertIsNone(self.config._find_best_vhost("purple.com"))
def test_findbest_continues_on_long_domain(self):
# pylint: disable=protected-access
chosen_vhost = self.config._find_best_vhost("green.red.purple.com")
self.assertEqual(None, chosen_vhost)
self.assertIsNone(self.config._find_best_vhost("green.red.purple.com"))
def test_find_best_vhost(self):
# pylint: disable=protected-access
self.assertEqual(
self.vh_truth[3], self.config._find_best_vhost("certbot.demo"))
self.assertEqual(
self.vh_truth[0],
self.config._find_best_vhost("encryption-example.demo"))
self.assertEqual(
self.config._find_best_vhost("does-not-exist.com"), None)
self.assertEqual(self.vh_truth[3], self.config._find_best_vhost("certbot.demo"))
self.assertEqual(self.vh_truth[0], self.config._find_best_vhost("encryption-example.demo"))
self.assertEqual(self.config._find_best_vhost("does-not-exist.com"), None)
def test_find_best_vhost_variety(self):
# pylint: disable=protected-access
@@ -345,11 +337,11 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.assertFalse(ssl_vhost.enabled)
self.assertIs(ssl_vhost.enabled, False)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertTrue(ssl_vhost.enabled)
self.assertIs(ssl_vhost.enabled, True)
def test_no_duplicate_include(self):
def mock_find_dir(directive, argument, _):
@@ -366,7 +358,7 @@ class MultipleVhostsTest(util.ApacheTest):
if a[0][1] == "Include" and a[0][2] == self.config.mod_ssl_conf:
tried_to_add = True
# Include should be added, find_dir is not patched, and returns falsy
self.assertTrue(tried_to_add)
self.assertIs(tried_to_add, True)
self.config.parser.find_dir = mock_find_dir
mock_add.reset_mock()
@@ -395,20 +387,16 @@ class MultipleVhostsTest(util.ApacheTest):
f_args.append(self.config.parser.get_arg(d))
return f_args
# Verify that the dummy directives do not exist
self.assertFalse(
"insert_cert_file_path" in find_args(vhostpath,
"SSLCertificateFile"))
self.assertFalse(
"insert_key_file_path" in find_args(vhostpath,
"SSLCertificateKeyFile"))
self.assertNotIn(
"insert_cert_file_path", find_args(vhostpath, "SSLCertificateFile"))
self.assertNotIn(
"insert_key_file_path", find_args(vhostpath, "SSLCertificateKeyFile"))
orig_add_dummy(vhostpath)
# Verify that the dummy directives exist
self.assertTrue(
"insert_cert_file_path" in find_args(vhostpath,
"SSLCertificateFile"))
self.assertTrue(
"insert_key_file_path" in find_args(vhostpath,
"SSLCertificateKeyFile"))
self.assertIn(
"insert_cert_file_path", find_args(vhostpath, "SSLCertificateFile"))
self.assertIn(
"insert_key_file_path", find_args(vhostpath, "SSLCertificateKeyFile"))
# pylint: disable=protected-access
self.config._add_dummy_ssl_directives = mock_add_dummy_ssl
@@ -420,8 +408,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.save()
# Verify ssl_module was enabled.
self.assertTrue(self.vh_truth[1].enabled)
self.assertTrue("ssl_module" in self.config.parser.modules)
self.assertIs(self.vh_truth[1].enabled, True)
self.assertIn("ssl_module", self.config.parser.modules)
loc_cert = self.config.parser.find_dir(
"sslcertificatefile", "example/cert.pem", self.vh_truth[1].path)
@@ -457,17 +445,15 @@ class MultipleVhostsTest(util.ApacheTest):
def test_is_name_vhost(self):
addr = obj.Addr.fromstring("*:80")
self.assertTrue(self.config.is_name_vhost(addr))
self.assertIs(self.config.is_name_vhost(addr), True)
self.config.version = (2, 2)
self.assertFalse(self.config.is_name_vhost(addr))
self.assertEqual(self.config.is_name_vhost(addr), [])
def test_add_name_vhost(self):
self.config.add_name_vhost(obj.Addr.fromstring("*:443"))
self.config.add_name_vhost(obj.Addr.fromstring("*:80"))
self.assertTrue(self.config.parser.find_dir(
"NameVirtualHost", "*:443", exclude=False))
self.assertTrue(self.config.parser.find_dir(
"NameVirtualHost", "*:80"))
self.assertTrue(self.config.parser.find_dir("NameVirtualHost", "*:443", exclude=False))
self.assertTrue(self.config.parser.find_dir("NameVirtualHost", "*:80"))
def test_add_listen_80(self):
mock_find = mock.Mock()
@@ -476,8 +462,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.find_dir = mock_find
self.config.parser.add_dir = mock_add_dir
self.config.ensure_listen("80")
self.assertTrue(mock_add_dir.called)
self.assertTrue(mock_find.called)
self.assertIs(mock_add_dir.called, True)
self.assertIs(mock_find.called, True)
self.assertEqual(mock_add_dir.call_args[0][1], "Listen")
self.assertEqual(mock_add_dir.call_args[0][2], "80")
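Editor's note: for the mock-based checks, .called is a plain bool on unittest.mock objects, so assertIs(mock.called, True) is safe here; mock's own assert_called_once_with expresses the same intent with a richer failure message. A small sketch, independent of the configurator fixtures above:

import unittest
from unittest import mock

class MockCalledDemo(unittest.TestCase):
    def test_called_flag(self):
        add_dir = mock.Mock()
        self.assertIs(add_dir.called, False)      # .called starts out as the bool False
        add_dir(mock.sentinel.path, "Listen", "80")
        self.assertIs(add_dir.called, True)
        add_dir.assert_called_once_with(mock.sentinel.path, "Listen", "80")

if __name__ == "__main__":
    unittest.main()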
@@ -502,13 +488,13 @@ class MultipleVhostsTest(util.ApacheTest):
# Test
self.config.ensure_listen("8080")
self.assertEqual(mock_add_dir.call_count, 3)
self.assertTrue(mock_add_dir.called)
self.assertIs(mock_add_dir.called, True)
self.assertEqual(mock_add_dir.call_args[0][1], "Listen")
call_found = False
for mock_call in mock_add_dir.mock_calls:
if mock_call[1][2] == ['1.2.3.4:8080']:
call_found = True
self.assertTrue(call_found)
self.assertIs(call_found, True)
@mock.patch("certbot_apache._internal.parser.ApacheParser.reset_modules")
def test_prepare_server_https(self, mock_reset):
@@ -631,8 +617,8 @@ class MultipleVhostsTest(util.ApacheTest):
def test_make_vhost_ssl_nonsymlink(self):
ssl_vhost_slink = self.config.make_vhost_ssl(self.vh_truth[8])
self.assertTrue(ssl_vhost_slink.ssl)
self.assertTrue(ssl_vhost_slink.enabled)
self.assertIs(ssl_vhost_slink.ssl, True)
self.assertIs(ssl_vhost_slink.enabled, True)
self.assertEqual(ssl_vhost_slink.name, "nonsym.link")
def test_make_vhost_ssl_nonexistent_vhost_path(self):
@@ -653,8 +639,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(len(ssl_vhost.addrs), 1)
self.assertEqual({obj.Addr.fromstring("*:443")}, ssl_vhost.addrs)
self.assertEqual(ssl_vhost.name, "encryption-example.demo")
self.assertTrue(ssl_vhost.ssl)
self.assertFalse(ssl_vhost.enabled)
self.assertIs(ssl_vhost.ssl, True)
self.assertIs(ssl_vhost.enabled, False)
self.assertEqual(self.config.is_name_vhost(self.vh_truth[0]),
self.config.is_name_vhost(ssl_vhost))
@@ -733,15 +719,14 @@ class MultipleVhostsTest(util.ApacheTest):
def test_get_ssl_vhost_path(self):
# pylint: disable=protected-access
self.assertTrue(
self.config._get_ssl_vhost_path("example_path").endswith(".conf"))
self.assertIs(self.config._get_ssl_vhost_path("example_path").endswith(".conf"), True)
def test_add_name_vhost_if_necessary(self):
# pylint: disable=protected-access
self.config.add_name_vhost = mock.Mock()
self.config.version = (2, 2)
self.config._add_name_vhost_if_necessary(self.vh_truth[0])
self.assertTrue(self.config.add_name_vhost.called)
self.assertIs(self.config.add_name_vhost.called, True)
new_addrs = set()
for addr in self.vh_truth[0].addrs:
@@ -780,9 +765,9 @@ class MultipleVhostsTest(util.ApacheTest):
for i, achall in enumerate(achalls):
self.config.cleanup([achall])
if i == len(achalls) - 1:
self.assertTrue(mock_restart.called)
self.assertIs(mock_restart.called, True)
else:
self.assertFalse(mock_restart.called)
self.assertIs(mock_restart.called, False)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
@@ -795,10 +780,10 @@ class MultipleVhostsTest(util.ApacheTest):
self.config._chall_out.add(achall) # pylint: disable=protected-access
self.config.cleanup([achalls[-1]])
self.assertFalse(mock_restart.called)
self.assertIs(mock_restart.called, False)
self.config.cleanup(achalls)
self.assertTrue(mock_restart.called)
self.assertIs(mock_restart.called, True)
@mock.patch("certbot.util.run_script")
def test_get_version(self, mock_script):
@@ -847,18 +832,18 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertTrue(self.config.more_info())
def test_get_chall_pref(self):
self.assertTrue(isinstance(self.config.get_chall_pref(""), list))
self.assertIsInstance(self.config.get_chall_pref(""), list)
def test_install_ssl_options_conf(self):
path = os.path.join(self.work_dir, "test_it")
other_path = os.path.join(self.work_dir, "other_test_it")
self.config.install_ssl_options_conf(path, other_path)
self.assertTrue(os.path.isfile(path))
self.assertTrue(os.path.isfile(other_path))
self.assertIs(os.path.isfile(path), True)
self.assertIs(os.path.isfile(other_path), True)
# TEST ENHANCEMENTS
def test_supported_enhancements(self):
self.assertTrue(isinstance(self.config.supported_enhancements(), list))
self.assertIsInstance(self.config.supported_enhancements(), list)
def test_find_http_vhost_without_ancestor(self):
# pylint: disable=protected-access
@@ -897,16 +882,16 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertRaises(errors.PluginError, self.config.enhance,
"certbot.demo", "redirect")
# Check that correct logger.warning was printed
self.assertTrue("not able to find" in mock_log.call_args[0][0])
self.assertTrue("\"redirect\"" in mock_log.call_args[0][0])
self.assertIn("not able to find", mock_log.call_args[0][0])
self.assertIn("\"redirect\"", mock_log.call_args[0][0])
mock_log.reset_mock()
self.assertRaises(errors.PluginError, self.config.enhance,
"certbot.demo", "ensure-http-header", "Test")
# Check that correct logger.warning was printed
self.assertTrue("not able to find" in mock_log.call_args[0][0])
self.assertTrue("Test" in mock_log.call_args[0][0])
self.assertIn("not able to find", mock_log.call_args[0][0])
self.assertIn("Test", mock_log.call_args[0][0])
@mock.patch("certbot.util.exe_exists")
def test_ocsp_stapling(self, mock_exe):
@@ -984,7 +969,7 @@ class MultipleVhostsTest(util.ApacheTest):
# pylint: disable=protected-access
http_vh = self.config._get_http_vhost(ssl_vh)
self.assertFalse(http_vh.ssl)
self.assertIs(http_vh.ssl, False)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@@ -1039,7 +1024,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.enhance("certbot.demo", "ensure-http-header",
"Upgrade-Insecure-Requests")
self.assertTrue("headers_module" in self.config.parser.modules)
self.assertIn("headers_module", self.config.parser.modules)
# Get the ssl vhost for certbot.demo
ssl_vhost = self.config.assoc["certbot.demo"]
@@ -1091,8 +1076,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(len(rw_rule), 3)
# [:-3] to remove the vhost index number
self.assertTrue(rw_engine[0].startswith(self.vh_truth[3].path[:-3]))
self.assertTrue(rw_rule[0].startswith(self.vh_truth[3].path[:-3]))
self.assertIs(rw_engine[0].startswith(self.vh_truth[3].path[:-3]), True)
self.assertIs(rw_rule[0].startswith(self.vh_truth[3].path[:-3]), True)
def test_rewrite_rule_exists(self):
# Skip the enable mod
@@ -1101,7 +1086,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.add_dir(
self.vh_truth[3].path, "RewriteRule", ["Unknown"])
# pylint: disable=protected-access
self.assertTrue(self.config._is_rewrite_exists(self.vh_truth[3]))
self.assertIs(self.config._is_rewrite_exists(self.vh_truth[3]), True)
def test_rewrite_engine_exists(self):
# Skip the enable mod
@@ -1141,10 +1126,10 @@ class MultipleVhostsTest(util.ApacheTest):
# three args to rw_rule + 1 arg for the pre existing rewrite
self.assertEqual(len(rw_rule), 5)
# [:-3] to remove the vhost index number
self.assertTrue(rw_engine[0].startswith(self.vh_truth[3].path[:-3]))
self.assertTrue(rw_rule[0].startswith(self.vh_truth[3].path[:-3]))
self.assertIs(rw_engine[0].startswith(self.vh_truth[3].path[:-3]), True)
self.assertIs(rw_rule[0].startswith(self.vh_truth[3].path[:-3]), True)
self.assertTrue("rewrite_module" in self.config.parser.modules)
self.assertIn("rewrite_module", self.config.parser.modules)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@@ -1202,7 +1187,7 @@ class MultipleVhostsTest(util.ApacheTest):
"ApacheConfigurator._verify_no_certbot_redirect")
with mock.patch(verify_no_redirect) as mock_verify:
self.config.enhance("green.blue.purple.com", "redirect")
self.assertFalse(mock_verify.called)
self.assertIs(mock_verify.called, False)
def test_redirect_from_previous_run(self):
# Skip the enable mod
@@ -1243,16 +1228,16 @@ class MultipleVhostsTest(util.ApacheTest):
def test_sift_rewrite_rule(self):
# pylint: disable=protected-access
small_quoted_target = "RewriteRule ^ \"http://\""
self.assertFalse(self.config._sift_rewrite_rule(small_quoted_target))
self.assertIs(self.config._sift_rewrite_rule(small_quoted_target), False)
https_target = "RewriteRule ^ https://satoshi"
self.assertTrue(self.config._sift_rewrite_rule(https_target))
self.assertIs(self.config._sift_rewrite_rule(https_target), True)
normal_target = "RewriteRule ^/(.*) http://www.a.com:1234/$1 [L,R]"
self.assertFalse(self.config._sift_rewrite_rule(normal_target))
self.assertIs(self.config._sift_rewrite_rule(normal_target), False)
not_rewriterule = "NotRewriteRule ^ ..."
self.assertFalse(self.config._sift_rewrite_rule(not_rewriterule))
self.assertIs(self.config._sift_rewrite_rule(not_rewriterule), False)
def get_key_and_achalls(self):
"""Return testing achallenges."""
@@ -1281,15 +1266,13 @@ class MultipleVhostsTest(util.ApacheTest):
vhost = self.vh_truth[0]
vhost.enabled = False
vhost.filep = inc_path
self.assertFalse(self.config.parser.find_dir("Include", inc_path))
self.assertFalse(
os.path.dirname(inc_path) in self.config.parser.existing_paths)
self.assertEqual(self.config.parser.find_dir("Include", inc_path), [])
self.assertNotIn(os.path.dirname(inc_path), self.config.parser.existing_paths)
self.config.enable_site(vhost)
self.assertTrue(self.config.parser.find_dir("Include", inc_path))
self.assertTrue(
os.path.dirname(inc_path) in self.config.parser.existing_paths)
self.assertTrue(
os.path.basename(inc_path) in self.config.parser.existing_paths[
self.assertGreaterEqual(len(self.config.parser.find_dir("Include", inc_path)), 1)
self.assertIn(os.path.dirname(inc_path), self.config.parser.existing_paths)
self.assertIn(
os.path.basename(inc_path), self.config.parser.existing_paths[
os.path.dirname(inc_path)])
@mock.patch('certbot_apache._internal.configurator.display_util.notify')
@@ -1312,7 +1295,7 @@ class MultipleVhostsTest(util.ApacheTest):
"example/cert.pem", "example/key.pem",
"example/cert_chain.pem")
# Test that we actually called add_include
self.assertTrue(mock_add.called)
self.assertIs(mock_add.called, True)
shutil.rmtree(tmp_path)
def test_deploy_cert_no_mod_ssl(self):
@@ -1331,7 +1314,7 @@ class MultipleVhostsTest(util.ApacheTest):
ret_vh.enabled = True
self.config.enable_site(ret_vh)
# Make sure that we return early
self.assertFalse(mock_parsed.called)
self.assertIs(mock_parsed.called, False)
def test_enable_mod_unsupported(self):
self.assertRaises(errors.MisconfigurationError,
@@ -1352,7 +1335,7 @@ class MultipleVhostsTest(util.ApacheTest):
# And the actual returned values
self.assertEqual(len(vhs), 1)
self.assertEqual(vhs[0].name, "certbot.demo")
self.assertTrue(vhs[0].ssl)
self.assertIs(vhs[0].ssl, True)
self.assertNotEqual(vhs[0], self.vh_truth[3])
@@ -1364,7 +1347,7 @@ class MultipleVhostsTest(util.ApacheTest):
mock_select_vhs.return_value = [self.vh_truth[1]]
vhs = self.config._choose_vhosts_wildcard("*.certbot.demo",
create_ssl=False)
self.assertFalse(mock_makessl.called)
self.assertIs(mock_makessl.called, False)
self.assertEqual(vhs[0], self.vh_truth[1])
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator._vhosts_for_wildcard")
@@ -1381,14 +1364,13 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(mock_select_vhs.call_args[0][0][0], self.vh_truth[7])
self.assertEqual(len(mock_select_vhs.call_args_list), 1)
# Ensure that make_vhost_ssl was not called, vhost.ssl == true
self.assertFalse(mock_makessl.called)
self.assertIs(mock_makessl.called, False)
# And the actual returned values
self.assertEqual(len(vhs), 1)
self.assertTrue(vhs[0].ssl)
self.assertIs(vhs[0].ssl, True)
self.assertEqual(vhs[0], self.vh_truth[7])
@mock.patch('certbot_apache._internal.configurator.display_util.notify')
def test_deploy_cert_wildcard(self, unused_mock_notify):
# pylint: disable=protected-access
@@ -1399,7 +1381,7 @@ class MultipleVhostsTest(util.ApacheTest):
with mock.patch(mock_d) as mock_dep:
self.config.deploy_cert("*.wildcard.example.org", "/tmp/path",
"/tmp/path", "/tmp/path", "/tmp/path")
self.assertTrue(mock_dep.called)
self.assertIs(mock_dep.called, True)
self.assertEqual(len(mock_dep.call_args_list), 1)
self.assertEqual(self.vh_truth[7], mock_dep.call_args_list[0][0][0])
@@ -1421,7 +1403,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.config._wildcard_vhosts["*.certbot.demo"] = [self.vh_truth[3]]
self.config.enhance("*.certbot.demo", "ensure-http-header",
"Upgrade-Insecure-Requests")
self.assertFalse(mock_choose.called)
self.assertIs(mock_choose.called, False)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator._choose_vhosts_wildcard")
def test_enhance_wildcard_no_install(self, mock_choose):
@@ -1431,7 +1413,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.modules["headers_module"] = None
self.config.enhance("*.certbot.demo", "ensure-http-header",
"Upgrade-Insecure-Requests")
self.assertTrue(mock_choose.called)
self.assertIs(mock_choose.called, True)
def test_add_vhost_id(self):
for vh in [self.vh_truth[0], self.vh_truth[1], self.vh_truth[2]]:
@@ -1510,7 +1492,8 @@ class AugeasVhostsTest(util.ApacheTest):
names = (
"an.example.net", "another.example.net", "an.other.example.net")
for name in names:
self.assertFalse(name in self.config.choose_vhost(name).aliases)
with self.subTest(name=name):
self.assertNotIn(name, self.config.choose_vhost(name).aliases)
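Editor's note: wrapping the per-name assertion in self.subTest changes how failures surface. Without it, the first failing name aborts the loop and hides the rest; with it, unittest records every failing name separately while the loop keeps running. A minimal sketch of the pattern:

import unittest

class SubTestDemo(unittest.TestCase):
    def test_each_name_reported_separately(self):
        names = ("an.example.net", "another.example.net", "an.other.example.net")
        for name in names:
            with self.subTest(name=name):
                # A failure here is recorded and the loop continues,
                # so the test output lists every offending name at once.
                self.assertNotIn(" ", name)

if __name__ == "__main__":
    unittest.main()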
@mock.patch("certbot_apache._internal.obj.VirtualHost.conflicts")
def test_choose_vhost_without_matching_wildcard(self, mock_conflicts):
@@ -1518,7 +1501,7 @@ class AugeasVhostsTest(util.ApacheTest):
mock_path = "certbot_apache._internal.display_ops.select_vhost"
with mock.patch(mock_path, lambda _, vhosts: vhosts[0]):
for name in ("a.example.net", "other.example.net"):
self.assertTrue(name in self.config.choose_vhost(name).aliases)
self.assertIn(name, self.config.choose_vhost(name).aliases)
@mock.patch("certbot_apache._internal.obj.VirtualHost.conflicts")
def test_choose_vhost_wildcard_not_found(self, mock_conflicts):
@@ -1551,6 +1534,7 @@ class AugeasVhostsTest(util.ApacheTest):
self.assertRaises(errors.PluginError, self.config.make_vhost_ssl,
broken_vhost)
class MultiVhostsTest(util.ApacheTest):
"""Test configuration with multiple virtualhosts in a single file."""
# pylint: disable=protected-access
@@ -1559,9 +1543,7 @@ class MultiVhostsTest(util.ApacheTest):
td = "debian_apache_2_4/multi_vhosts"
cr = "debian_apache_2_4/multi_vhosts/apache2"
vr = "debian_apache_2_4/multi_vhosts/apache2/sites-available"
super().setUp(test_dir=td,
config_root=cr,
vhost_root=vr)
super().setUp(test_dir=td, config_root=cr, vhost_root=vr)
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path,
@@ -1582,9 +1564,8 @@ class MultiVhostsTest(util.ApacheTest):
self.assertEqual(len(ssl_vhost.addrs), 1)
self.assertEqual({obj.Addr.fromstring("*:443")}, ssl_vhost.addrs)
self.assertEqual(ssl_vhost.name, "banana.vomit.com")
self.assertTrue(ssl_vhost.ssl)
self.assertFalse(ssl_vhost.enabled)
self.assertIs(ssl_vhost.ssl, True)
self.assertIs(ssl_vhost.enabled, False)
self.assertEqual(self.config.is_name_vhost(self.vh_truth[1]),
self.config.is_name_vhost(ssl_vhost))
@@ -1616,8 +1597,7 @@ class MultiVhostsTest(util.ApacheTest):
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[4])
self.assertTrue(self.config.parser.find_dir(
"RewriteEngine", "on", ssl_vhost.path, False))
self.assertTrue(self.config.parser.find_dir("RewriteEngine", "on", ssl_vhost.path, False))
with open(ssl_vhost.filep) as the_file:
conf_text = the_file.read()
@@ -1625,8 +1605,8 @@ class MultiVhostsTest(util.ApacheTest):
"\"https://new.example.com/docs/$1\" [R,L]")
uncommented_rewrite_rule = ("RewriteRule \"^/docs/(.+)\" "
"\"http://new.example.com/docs/$1\" [R,L]")
self.assertTrue(commented_rewrite_rule in conf_text)
self.assertTrue(uncommented_rewrite_rule in conf_text)
self.assertIn(commented_rewrite_rule, conf_text)
self.assertIn(uncommented_rewrite_rule, conf_text)
self.assertEqual(mock_notify.call_count, 1)
self.assertIn("Some rewrite rules", mock_notify.call_args[0][0])
@@ -1650,12 +1630,12 @@ class MultiVhostsTest(util.ApacheTest):
"https://%{SERVER_NAME}%{REQUEST_URI} "
"[L,NE,R=permanent]")
self.assertTrue(not_commented_cond1 in conf_line_set)
self.assertTrue(not_commented_rewrite_rule in conf_line_set)
self.assertIn(not_commented_cond1, conf_line_set)
self.assertIn(not_commented_rewrite_rule, conf_line_set)
self.assertTrue(commented_cond1 in conf_line_set)
self.assertTrue(commented_cond2 in conf_line_set)
self.assertTrue(commented_rewrite_rule in conf_line_set)
self.assertIn(commented_cond1, conf_line_set)
self.assertIn(commented_cond2, conf_line_set)
self.assertIn(commented_rewrite_rule, conf_line_set)
self.assertEqual(mock_notify.call_count, 1)
self.assertIn("Some rewrite rules", mock_notify.call_args[0][0])
@@ -1677,7 +1657,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
return crypto_util.sha256sum(self.config.pick_apache_config())
def _assert_current_file(self):
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
self.assertIs(os.path.isfile(self.config.mod_ssl_conf), True)
self.assertEqual(crypto_util.sha256sum(self.config.mod_ssl_conf),
self._current_ssl_options_hash())
@@ -1685,7 +1665,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
# prepare should have placed a file there
self._assert_current_file()
os.remove(self.config.mod_ssl_conf)
self.assertFalse(os.path.isfile(self.config.mod_ssl_conf))
self.assertIs(os.path.isfile(self.config.mod_ssl_conf), False)
self._call()
self._assert_current_file()
@@ -1707,8 +1687,8 @@ class InstallSslOptionsConfTest(util.ApacheTest):
mod_ssl_conf.write("a new line for the wrong hash\n")
with mock.patch("certbot.plugins.common.logger") as mock_logger:
self._call()
self.assertFalse(mock_logger.warning.called)
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
self.assertIs(mock_logger.warning.called, False)
self.assertIs(os.path.isfile(self.config.mod_ssl_conf), True)
self.assertEqual(crypto_util.sha256sum(
self.config.pick_apache_config()),
self._current_ssl_options_hash())
@@ -1731,7 +1711,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
# only print warning once
with mock.patch("certbot.plugins.common.logger") as mock_logger:
self._call()
self.assertFalse(mock_logger.warning.called)
self.assertIs(mock_logger.warning.called, False)
def test_ssl_config_files_hash_in_all_hashes(self):
"""
@@ -1747,12 +1727,14 @@ class InstallSslOptionsConfTest(util.ApacheTest):
"certbot_apache", os.path.join("_internal", "tls_configs"))
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
if name.endswith('options-ssl-apache.conf')]
self.assertTrue(all_files)
self.assertGreaterEqual(len(all_files), 1)
for one_file in all_files:
file_hash = crypto_util.sha256sum(one_file)
self.assertTrue(file_hash in ALL_SSL_OPTIONS_HASHES,
"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 "
"hash of {0} when it is updated.".format(one_file))
self.assertIn(
file_hash, ALL_SSL_OPTIONS_HASHES,
f"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 "
f"hash of {one_file} when it is updated."
)
def test_openssl_version(self):
self.config._openssl_version = None
@@ -1786,14 +1768,14 @@ class InstallSslOptionsConfTest(util.ApacheTest):
def test_current_version(self):
self.config.version = (2, 4, 10)
self.config._openssl_version = '1.0.2m'
self.assertTrue('old' in self.config.pick_apache_config())
self.assertIn('old', self.config.pick_apache_config())
self.config.version = (2, 4, 11)
self.config._openssl_version = '1.0.2m'
self.assertTrue('current' in self.config.pick_apache_config())
self.assertIn('current', self.config.pick_apache_config())
self.config._openssl_version = '1.0.2a'
self.assertTrue('old' in self.config.pick_apache_config())
self.assertIn('old', self.config.pick_apache_config())
def test_openssl_version_warns(self):
self.config._openssl_version = '1.0.2a'
@@ -1802,14 +1784,14 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.config._openssl_version = None
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find ssl_module" in mock_log.call_args[0][0])
self.assertIn("Could not find ssl_module", mock_log.call_args[0][0])
# When no ssl_module is present at all
self.config._openssl_version = None
self.assertTrue("ssl_module" not in self.config.parser.modules)
self.assertNotIn("ssl_module", self.config.parser.modules)
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find ssl_module" in mock_log.call_args[0][0])
self.assertIn("Could not find ssl_module", mock_log.call_args[0][0])
# When ssl_module is statically linked but --apache-bin not provided
self.config._openssl_version = None
@@ -1817,13 +1799,13 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.config.parser.modules['ssl_module'] = None
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("ssl_module is statically linked but" in mock_log.call_args[0][0])
self.assertIn("ssl_module is statically linked but", mock_log.call_args[0][0])
self.config.parser.modules['ssl_module'] = "/fake/path"
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
# Check that correct logger.warning was printed
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Unable to read" in mock_log.call_args[0][0])
self.assertIn("Unable to read", mock_log.call_args[0][0])
contents_missing_openssl = b"these contents won't match the regex"
with mock.patch("certbot_apache._internal.configurator."
@@ -1832,7 +1814,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
# Check that correct logger.warning was printed
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find OpenSSL" in mock_log.call_args[0][0])
self.assertIn("Could not find OpenSSL", mock_log.call_args[0][0])
def test_open_module_file(self):
mock_open = mock.mock_open(read_data="testing 12 3")

View File

@@ -9,6 +9,7 @@ except ImportError: # pragma: no cover
from certbot import errors
from certbot.compat import os
from certbot.tests import util as certbot_util
from certbot_apache._internal import apache_util
from certbot_apache._internal import obj
import util
@@ -44,8 +45,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
def test_enable_mod_unsupported_dirs(self):
shutil.rmtree(os.path.join(self.config.parser.root, "mods-enabled"))
self.assertRaises(
errors.NotSupportedError, self.config.enable_mod, "ssl")
self.assertRaises(errors.NotSupportedError, self.config.enable_mod, "ssl")
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@@ -57,28 +57,29 @@ class MultipleVhostsTestDebian(util.ApacheTest):
mock_exe_exists.return_value = True
self.config.enable_mod("ssl")
self.assertTrue("ssl_module" in self.config.parser.modules)
self.assertTrue("mod_ssl.c" in self.config.parser.modules)
self.assertIn("ssl_module", self.config.parser.modules)
self.assertIn("mod_ssl.c", self.config.parser.modules)
self.assertTrue(mock_run_script.called)
self.assertIs(mock_run_script.called, True)
def test_deploy_cert_enable_new_vhost(self):
# Create
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertTrue(ssl_vhost.enabled)
# Make sure that we don't error out if symlink already exists
ssl_vhost.enabled = False
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertTrue(ssl_vhost.enabled)
self.assertIs(ssl_vhost.enabled, False)
with certbot_util.patch_display_util():
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertIs(ssl_vhost.enabled, True)
# Make sure that we don't error out if symlink already exists
ssl_vhost.enabled = False
self.assertIs(ssl_vhost.enabled, False)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertIs(ssl_vhost.enabled, True)
def test_enable_site_failure(self):
self.config.parser.root = "/tmp/nonexistent"
@@ -101,14 +102,15 @@ class MultipleVhostsTestDebian(util.ApacheTest):
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
with certbot_util.patch_display_util():
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.config.save()
# Verify ssl_module was enabled.
self.assertTrue(self.vh_truth[1].enabled)
self.assertTrue("ssl_module" in self.config.parser.modules)
self.assertIs(self.vh_truth[1].enabled, True)
self.assertIn("ssl_module", self.config.parser.modules)
loc_cert = self.config.parser.find_dir(
"sslcertificatefile", "example/fullchain.pem",
@@ -167,7 +169,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
# This will create an ssl vhost for certbot.demo
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "staple-ocsp")
self.assertTrue("socache_shmcb_module" in self.config.parser.modules)
self.assertIn("socache_shmcb_module", self.config.parser.modules)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@@ -180,7 +182,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "ensure-http-header",
"Strict-Transport-Security")
self.assertTrue("headers_module" in self.config.parser.modules)
self.assertIn("headers_module", self.config.parser.modules)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@@ -191,10 +193,10 @@ class MultipleVhostsTestDebian(util.ApacheTest):
# This will create an ssl vhost for certbot.demo
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "redirect")
self.assertTrue("rewrite_module" in self.config.parser.modules)
self.assertIn("rewrite_module", self.config.parser.modules)
def test_enable_site_already_enabled(self):
self.assertTrue(self.vh_truth[1].enabled)
self.assertIs(self.vh_truth[1].enabled, True)
self.config.enable_site(self.vh_truth[1])
def test_enable_site_call_parent(self):
@@ -204,7 +206,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
vh = self.vh_truth[0]
vh.enabled = False
self.config.enable_site(vh)
self.assertTrue(e_s.called)
self.assertIs(e_s.called, True)
@mock.patch("certbot.util.exe_exists")
def test_enable_mod_no_disable(self, mock_exe_exists):

View File

@@ -3,8 +3,8 @@ import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.display import util as display_util
@@ -23,9 +23,9 @@ class SelectVhostMultiTest(unittest.TestCase):
self.base_dir, "debian_apache_2_4/multiple_vhosts")
def test_select_no_input(self):
self.assertFalse(select_vhost_multiple([]))
self.assertEqual(len(select_vhost_multiple([])), 0)
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_select_correct(self, mock_util):
mock_util().checklist.return_value = (
display_util.OK, [self.vhosts[3].display_repr(),
@@ -33,15 +33,16 @@ class SelectVhostMultiTest(unittest.TestCase):
vhs = select_vhost_multiple([self.vhosts[3],
self.vhosts[2],
self.vhosts[1]])
self.assertTrue(self.vhosts[2] in vhs)
self.assertTrue(self.vhosts[3] in vhs)
self.assertFalse(self.vhosts[1] in vhs)
self.assertIn(self.vhosts[2], vhs)
self.assertIn(self.vhosts[3], vhs)
self.assertNotIn(self.vhosts[1], vhs)
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_select_cancel(self, mock_util):
mock_util().checklist.return_value = (display_util.CANCEL, "whatever")
vhs = select_vhost_multiple([self.vhosts[2], self.vhosts[3]])
self.assertFalse(vhs)
self.assertEqual(vhs, [])
class SelectVhostTest(unittest.TestCase):
"""Tests for certbot_apache._internal.display_ops.select_vhost."""
@@ -56,41 +57,40 @@ class SelectVhostTest(unittest.TestCase):
from certbot_apache._internal.display_ops import select_vhost
return select_vhost("example.com", vhosts)
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_successful_choice(self, mock_util):
mock_util().menu.return_value = (display_util.OK, 3)
self.assertEqual(self.vhosts[3], self._call(self.vhosts))
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_noninteractive(self, mock_util):
mock_util().menu.side_effect = errors.MissingCommandlineFlag("no vhost default")
try:
self._call(self.vhosts)
except errors.MissingCommandlineFlag as e:
self.assertTrue("vhost ambiguity" in str(e))
self.assertIn("vhost ambiguity", str(e))
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_more_info_cancel(self, mock_util):
mock_util().menu.side_effect = [
(display_util.CANCEL, -1),
]
self.assertEqual(None, self._call(self.vhosts))
self.assertIsNone(self._call(self.vhosts))
def test_no_vhosts(self):
self.assertEqual(self._call([]), None)
self.assertIsNone(self._call([]))
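Editor's note: these decorators switch from certbot_util.patch_get_utility() to certbot_util.patch_display_util(). As used above, the injected mock_util is a factory whose call returns the mocked display object, so the existing mock_util().menu.return_value setup keeps working. A usage-pattern sketch only, assuming patch_display_util behaves as these tests show; the demo class is not part of the Certbot suite and runs only where the certbot test helpers are importable:

import unittest

from certbot.display import util as display_util
from certbot.tests import util as certbot_util

class PatchDisplayUtilDemo(unittest.TestCase):
    @certbot_util.patch_display_util()
    def test_menu_choice_is_scripted(self, mock_util):
        # Same shape as the tests above: mock_util() yields the mocked display
        # object, whose menu() call is scripted to pick the first entry.
        mock_util().menu.return_value = (display_util.OK, 0)
        self.assertEqual(mock_util().menu("prompt", ["a", "b"]), (display_util.OK, 0))

if __name__ == "__main__":
    unittest.main()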
@mock.patch("certbot_apache._internal.display_ops.display_util")
@certbot_util.patch_get_utility()
@mock.patch("certbot_apache._internal.display_ops.logger")
def test_small_display(self, mock_logger, mock_util, mock_display_util):
def test_small_display(self, mock_logger, mock_display_util):
mock_display_util.WIDTH = 20
mock_util().menu.return_value = (display_util.OK, 0)
mock_display_util.menu.return_value = (display_util.OK, 0)
self._call(self.vhosts)
self.assertEqual(mock_logger.debug.call_count, 1)
@certbot_util.patch_get_utility()
@certbot_util.patch_display_util()
def test_multiple_names(self, mock_util):
mock_util().menu.return_value = (display_util.OK, 5)

View File

@@ -53,20 +53,20 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
primary=self.block.secondary,
secondary=self.block.primary)
# Switched around
self.assertTrue(cnode.primary is self.comment.secondary)
self.assertTrue(cnode.secondary is self.comment.primary)
self.assertTrue(dnode.primary is self.directive.secondary)
self.assertTrue(dnode.secondary is self.directive.primary)
self.assertTrue(bnode.primary is self.block.secondary)
self.assertTrue(bnode.secondary is self.block.primary)
self.assertEqual(cnode.primary, self.comment.secondary)
self.assertEqual(cnode.secondary, self.comment.primary)
self.assertEqual(dnode.primary, self.directive.secondary)
self.assertEqual(dnode.secondary, self.directive.primary)
self.assertEqual(bnode.primary, self.block.secondary)
self.assertEqual(bnode.secondary, self.block.primary)
def test_set_params(self):
params = ("first", "second")
self.directive.primary.set_parameters = mock.Mock()
self.directive.secondary.set_parameters = mock.Mock()
self.directive.set_parameters(params)
self.assertTrue(self.directive.primary.set_parameters.called)
self.assertTrue(self.directive.secondary.set_parameters.called)
self.assertIs(self.directive.primary.set_parameters.called, True)
self.assertIs(self.directive.secondary.set_parameters.called, True)
def test_set_parameters(self):
pparams = mock.MagicMock()
@@ -76,8 +76,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.directive.primary.set_parameters = pparams
self.directive.secondary.set_parameters = sparams
self.directive.set_parameters(("param", "seq"))
self.assertTrue(pparams.called)
self.assertTrue(sparams.called)
self.assertIs(pparams.called, True)
self.assertIs(sparams.called, True)
def test_delete_child(self):
pdel = mock.MagicMock()
@@ -85,8 +85,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.delete_child = pdel
self.block.secondary.delete_child = sdel
self.block.delete_child(self.comment)
self.assertTrue(pdel.called)
self.assertTrue(sdel.called)
self.assertIs(pdel.called, True)
self.assertIs(sdel.called, True)
def test_unsaved_files(self):
puns = mock.MagicMock()
@@ -96,8 +96,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.unsaved_files = puns
self.block.secondary.unsaved_files = suns
self.block.unsaved_files()
self.assertTrue(puns.called)
self.assertTrue(suns.called)
self.assertIs(puns.called, True)
self.assertIs(suns.called, True)
def test_getattr_equality(self):
self.directive.primary.variableexception = "value"
@@ -140,8 +140,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.add_child_block = mock_first
self.block.secondary.add_child_block = mock_second
self.block.add_child_block("Block")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
self.assertIs(mock_first.called, True)
self.assertIs(mock_second.called, True)
def test_add_child_directive(self):
mock_first = mock.MagicMock(return_value=self.directive.primary)
@@ -149,8 +149,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.add_child_directive = mock_first
self.block.secondary.add_child_directive = mock_second
self.block.add_child_directive("Directive")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
self.assertIs(mock_first.called, True)
self.assertIs(mock_second.called, True)
def test_add_child_comment(self):
mock_first = mock.MagicMock(return_value=self.comment.primary)
@@ -158,8 +158,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.add_child_comment = mock_first
self.block.secondary.add_child_comment = mock_second
self.block.add_child_comment("Comment")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
self.assertIs(mock_first.called, True)
self.assertIs(mock_second.called, True)
def test_find_comments(self):
pri_comments = [augeasparser.AugeasCommentNode(comment="some comment",
@@ -183,9 +183,9 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
# Check that every comment response is represented in the list of
# DualParserNode instances.
for p in p_dcoms:
self.assertTrue(p in p_coms)
self.assertIn(p, p_coms)
for s in s_dcoms:
self.assertTrue(s in s_coms)
self.assertIn(s, s_coms)
def test_find_blocks_first_passing(self):
youshallnotpass = [augeasparser.AugeasBlockNode(name="notpassing",
@@ -207,8 +207,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(block.primary, block.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassDirective(block.primary))
self.assertFalse(assertions.isPassDirective(block.secondary))
self.assertIs(assertions.isPassDirective(block.primary), True)
self.assertIs(assertions.isPassDirective(block.secondary), False)
def test_find_blocks_second_passing(self):
youshallnotpass = [augeasparser.AugeasBlockNode(name="notpassing",
@@ -230,8 +230,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(block.primary, block.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassDirective(block.primary))
self.assertTrue(assertions.isPassDirective(block.secondary))
self.assertIs(assertions.isPassDirective(block.primary), False)
self.assertIs(assertions.isPassDirective(block.secondary), True)
def test_find_dirs_first_passing(self):
notpassing = [augeasparser.AugeasDirectiveNode(name="notpassing",
@@ -253,8 +253,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(directive.primary, directive.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassDirective(directive.primary))
self.assertFalse(assertions.isPassDirective(directive.secondary))
self.assertIs(assertions.isPassDirective(directive.primary), True)
self.assertIs(assertions.isPassDirective(directive.secondary), False)
def test_find_dirs_second_passing(self):
notpassing = [augeasparser.AugeasDirectiveNode(name="notpassing",
@@ -276,8 +276,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(directive.primary, directive.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassDirective(directive.primary))
self.assertTrue(assertions.isPassDirective(directive.secondary))
self.assertIs(assertions.isPassDirective(directive.primary), False)
self.assertIs(assertions.isPassDirective(directive.secondary), True)
def test_find_coms_first_passing(self):
notpassing = [augeasparser.AugeasCommentNode(comment="notpassing",
@@ -299,8 +299,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(comment.primary, comment.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassComment(comment.primary))
self.assertFalse(assertions.isPassComment(comment.secondary))
self.assertIs(assertions.isPassComment(comment.primary), True)
self.assertIs(assertions.isPassComment(comment.secondary), False)
def test_find_coms_second_passing(self):
notpassing = [augeasparser.AugeasCommentNode(comment="notpassing",
@@ -322,8 +322,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
assertions.assertEqual(comment.primary, comment.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassComment(comment.primary))
self.assertTrue(assertions.isPassComment(comment.secondary))
self.assertIs(assertions.isPassComment(comment.primary), False)
self.assertIs(assertions.isPassComment(comment.secondary), True)
def test_find_blocks_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasBlockNode(name="notpassing",
@@ -341,8 +341,9 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
blocks = self.block.find_blocks("anything")
for block in blocks:
self.assertEqual(block.primary, block.secondary)
self.assertTrue(block.primary is not block.secondary)
with self.subTest(block=block):
self.assertEqual(block.primary, block.secondary)
self.assertIsNot(block.primary, block.secondary)
def test_find_dirs_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasDirectiveNode(name="notpassing",
@@ -360,8 +361,9 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
directives = self.block.find_directives("anything")
for directive in directives:
self.assertEqual(directive.primary, directive.secondary)
self.assertTrue(directive.primary is not directive.secondary)
with self.subTest(directive=directive):
self.assertEqual(directive.primary, directive.secondary)
self.assertIsNot(directive.primary, directive.secondary)
def test_find_comments_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasCommentNode(comment="notpassing",
@@ -379,8 +381,9 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
comments = self.block.find_comments("anything")
for comment in comments:
self.assertEqual(comment.primary, comment.secondary)
self.assertTrue(comment.primary is not comment.secondary)
with self.subTest(comment=comment):
self.assertEqual(comment.primary, comment.secondary)
self.assertIsNot(comment.primary, comment.secondary)
def test_find_blocks_no_pass_notequal(self):
notpassing1 = [augeasparser.AugeasBlockNode(name="notpassing",
@@ -424,8 +427,8 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.parsed_paths = mock_p
self.block.secondary.parsed_paths = mock_s
self.block.parsed_paths()
self.assertTrue(mock_p.called)
self.assertTrue(mock_s.called)
self.assertIs(mock_p.called, True)
self.assertIs(mock_s.called, True)
def test_parsed_paths_error(self):
mock_p = mock.MagicMock(return_value=['/path/file.conf'])
@@ -441,5 +444,5 @@ class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-
self.block.primary.find_ancestors = primarymock
self.block.secondary.find_ancestors = secondarymock
self.block.find_ancestors("anything")
self.assertTrue(primarymock.called)
self.assertTrue(secondarymock.called)
self.assertIs(primarymock.called, True)
self.assertIs(secondarymock.called, True)

View File

@@ -41,7 +41,7 @@ class EntryPointTest(unittest.TestCase):
with mock.patch("certbot.util.get_os_info") as mock_info:
mock_info.return_value = ("nonexistent", "irrelevant")
with mock.patch("certbot.util.get_systemd_os_like") as mock_like:
mock_like.return_value = ["unknonwn"]
mock_like.return_value = ["unknown"]
self.assertEqual(entrypoint.get_configurator(),
configurator.ApacheConfigurator)

View File

@@ -134,8 +134,8 @@ class MultipleVhostsTestFedora(util.ApacheTest):
self.assertEqual(mock_get.call_count, 3)
self.assertEqual(len(self.config.parser.modules), 4)
self.assertEqual(len(self.config.parser.variables), 2)
self.assertTrue("TEST2" in self.config.parser.variables)
self.assertTrue("mod_another.c" in self.config.parser.modules)
self.assertIn("TEST2", self.config.parser.variables)
self.assertIn("mod_another.c", self.config.parser.modules)
@mock.patch("certbot_apache._internal.configurator.util.run_script")
def test_get_version(self, mock_run_script):
@@ -172,11 +172,11 @@ class MultipleVhostsTestFedora(util.ApacheTest):
mock_osi.return_value = ("fedora", "29")
self.config.parser.update_runtime_variables()
self.assertTrue("mock_define" in self.config.parser.variables)
self.assertTrue("mock_define_too" in self.config.parser.variables)
self.assertTrue("mock_value" in self.config.parser.variables)
self.assertIn("mock_define", self.config.parser.variables)
self.assertIn("mock_define_too", self.config.parser.variables)
self.assertIn("mock_value", self.config.parser.variables)
self.assertEqual("TRUE", self.config.parser.variables["mock_value"])
self.assertTrue("MOCK_NOSEP" in self.config.parser.variables)
self.assertIn("MOCK_NOSEP", self.config.parser.variables)
self.assertEqual("NOSEP_VAL", self.config.parser.variables["NOSEP_TWO"])
@mock.patch("certbot_apache._internal.configurator.util.run_script")

View File

@@ -63,8 +63,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
self.temp_dir, "gentoo_apache/apache")
def test_get_parser(self):
self.assertTrue(isinstance(self.config.parser,
override_gentoo.GentooParser))
self.assertIsInstance(self.config.parser, override_gentoo.GentooParser)
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
@@ -91,7 +90,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
with mock.patch("certbot_apache._internal.override_gentoo.GentooParser.update_modules"):
self.config.parser.update_runtime_variables()
for define in defines:
self.assertTrue(define in self.config.parser.variables)
self.assertIn(define, self.config.parser.variables)
@mock.patch("certbot_apache._internal.apache_util.parse_from_subprocess")
def test_no_binary_configdump(self, mock_subprocess):
@@ -101,11 +100,11 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
with mock.patch("certbot_apache._internal.override_gentoo.GentooParser.update_modules"):
self.config.parser.update_runtime_variables()
self.config.parser.reset_modules()
self.assertFalse(mock_subprocess.called)
self.assertIs(mock_subprocess.called, False)
self.config.parser.update_runtime_variables()
self.config.parser.reset_modules()
self.assertTrue(mock_subprocess.called)
self.assertIs(mock_subprocess.called, True)
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
@@ -129,7 +128,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
self.assertEqual(mock_get.call_count, 1)
self.assertEqual(len(self.config.parser.modules), 4)
self.assertTrue("mod_another.c" in self.config.parser.modules)
self.assertIn("mod_another.c", self.config.parser.modules)
@mock.patch("certbot_apache._internal.configurator.util.run_script")
def test_alt_restart_works(self, mock_run_script):

View File

@@ -51,7 +51,7 @@ class ApacheHttp01Test(util.ApacheTest):
self.http = ApacheHttp01(self.config)
def test_empty_perform(self):
self.assertFalse(self.http.perform())
self.assertEqual(len(self.http.perform()), 0)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_2(self, mock_enmod):
@@ -77,7 +77,7 @@ class ApacheHttp01Test(util.ApacheTest):
self.http.prepare_http01_modules()
self.assertTrue(mock_enmod.called)
self.assertIs(mock_enmod.called, True)
calls = mock_enmod.call_args_list
other_calls = []
for call in calls:
@@ -125,6 +125,18 @@ class ApacheHttp01Test(util.ApacheTest):
domain="duplicate.example.com", account_key=self.account_key)]
self.common_perform_test(achalls, vhosts)
def test_configure_name_and_blank(self):
domain = "certbot.demo"
vhosts = [v for v in self.config.vhosts if v.name == domain or v.name is None]
achalls = [
achallenges.KeyAuthorizationAnnotatedChallenge(
challb=acme_util.chall_to_challb(
challenges.HTTP01(token=((b'a' * 16))),
"pending"),
domain=domain, account_key=self.account_key),
]
self.common_perform_test(achalls, vhosts)
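Editor's note: the new test_configure_name_and_blank exercises challenge placement when the matching vhosts include one whose ServerName is absent (name is None) alongside the one named certbot.demo; the selection it relies on is just the list comprehension above. Restated standalone with stand-in objects (SimpleNamespace here is illustrative, not a Certbot type):

from types import SimpleNamespace

vhosts = [
    SimpleNamespace(name="certbot.demo"),
    SimpleNamespace(name=None),             # a vhost with no ServerName set
    SimpleNamespace(name="other.example"),
]
domain = "certbot.demo"
selected = [v for v in vhosts if v.name == domain or v.name is None]
assert [v.name for v in selected] == ["certbot.demo", None]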
def test_no_vhost(self):
for achall in self.achalls:
self.http.add_chall(achall)
@@ -174,7 +186,7 @@ class ApacheHttp01Test(util.ApacheTest):
def common_perform_test(self, achalls, vhosts):
"""Tests perform with the given achalls."""
challenge_dir = self.http.challenge_dir
self.assertFalse(os.path.exists(challenge_dir))
self.assertIs(os.path.exists(challenge_dir), False)
for achall in achalls:
self.http.add_chall(achall)
@@ -182,8 +194,8 @@ class ApacheHttp01Test(util.ApacheTest):
achall.response(self.account_key) for achall in achalls]
self.assertEqual(self.http.perform(), expected_response)
self.assertTrue(os.path.isdir(self.http.challenge_dir))
self.assertTrue(filesystem.has_min_permissions(self.http.challenge_dir, 0o755))
self.assertIs(os.path.isdir(self.http.challenge_dir), True)
self.assertIs(filesystem.has_min_permissions(self.http.challenge_dir, 0o755), True)
self._test_challenge_conf()
for achall in achalls:
@@ -199,7 +211,7 @@ class ApacheHttp01Test(util.ApacheTest):
vhost.path)
self.assertEqual(len(matches), 1)
self.assertTrue(os.path.exists(challenge_dir))
self.assertIs(os.path.exists(challenge_dir), True)
@mock.patch("certbot_apache._internal.http_01.filesystem.makedirs")
def test_failed_makedirs(self, mock_makedirs):
@@ -214,20 +226,20 @@ class ApacheHttp01Test(util.ApacheTest):
with open(self.http.challenge_conf_post) as f:
post_conf_contents = f.read()
self.assertTrue("RewriteEngine on" in pre_conf_contents)
self.assertTrue("RewriteRule" in pre_conf_contents)
self.assertIn("RewriteEngine on", pre_conf_contents)
self.assertIn("RewriteRule", pre_conf_contents)
self.assertTrue(self.http.challenge_dir in post_conf_contents)
self.assertIn(self.http.challenge_dir, post_conf_contents)
if self.config.version < (2, 4):
self.assertTrue("Allow from all" in post_conf_contents)
self.assertIn("Allow from all", post_conf_contents)
else:
self.assertTrue("Require all granted" in post_conf_contents)
self.assertIn("Require all granted", post_conf_contents)
def _test_challenge_file(self, achall):
name = os.path.join(self.http.challenge_dir, achall.chall.encode("token"))
validation = achall.validation(self.account_key)
self.assertTrue(filesystem.has_min_permissions(name, 0o644))
self.assertIs(filesystem.has_min_permissions(name, 0o644), True)
with open(name, 'rb') as f:
self.assertEqual(f.read(), validation.encode())
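Editor's note: the permission assertions above expect at least 0o755 on the challenge directory and 0o644 on each challenge file. Expressed with only the standard library, a minimum-permission check looks roughly like the sketch below; this is an illustration of the idea, not Certbot's filesystem.has_min_permissions implementation:

import os
import stat

def has_at_least(path: str, minimum_mode: int) -> bool:
    """Return True if every permission bit in minimum_mode is set on path."""
    current = stat.S_IMODE(os.stat(path).st_mode)
    return current & minimum_mode == minimum_mode

# has_at_least("/tmp", 0o755) is True on a typical system where /tmp is drwxrwxrwt.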

View File

@@ -44,15 +44,14 @@ class VirtualHostTest(unittest.TestCase):
"fp", "vhp",
{Addr.fromstring("*:443"), Addr.fromstring("1.2.3.4:443")},
False, False)
self.assertTrue(complex_vh.conflicts([self.addr1]))
self.assertTrue(complex_vh.conflicts([self.addr2]))
self.assertFalse(complex_vh.conflicts([self.addr_default]))
self.assertIs(complex_vh.conflicts([self.addr1]), True)
self.assertIs(complex_vh.conflicts([self.addr2]), True)
self.assertIs(complex_vh.conflicts([self.addr_default]), False)
self.assertTrue(self.vhost1.conflicts([self.addr2]))
self.assertFalse(self.vhost1.conflicts([self.addr_default]))
self.assertIs(self.vhost1.conflicts([self.addr2]), True)
self.assertIs(self.vhost1.conflicts([self.addr_default]), False)
self.assertFalse(self.vhost2.conflicts([self.addr1,
self.addr_default]))
self.assertIs(self.vhost2.conflicts([self.addr1, self.addr_default]), False)
def test_same_server(self):
from certbot_apache._internal.obj import VirtualHost
@@ -67,12 +66,12 @@ class VirtualHostTest(unittest.TestCase):
"fp", "vhp", {self.addr2, self.addr_default},
False, False, None)
self.assertTrue(self.vhost1.same_server(self.vhost2))
self.assertTrue(no_name1.same_server(no_name2))
self.assertIs(self.vhost1.same_server(self.vhost2), True)
self.assertIs(no_name1.same_server(no_name2), True)
self.assertFalse(self.vhost1.same_server(no_name1))
self.assertFalse(no_name1.same_server(no_name3))
self.assertFalse(no_name1.same_server(no_name4))
self.assertIs(self.vhost1.same_server(no_name1), False)
self.assertIs(no_name1.same_server(no_name3), False)
self.assertIs(no_name1.same_server(no_name4), False)
class AddrTest(unittest.TestCase):
@@ -88,9 +87,9 @@ class AddrTest(unittest.TestCase):
self.addr_default = Addr.fromstring("_default_:443")
def test_wildcard(self):
self.assertFalse(self.addr.is_wildcard())
self.assertTrue(self.addr1.is_wildcard())
self.assertTrue(self.addr2.is_wildcard())
self.assertIs(self.addr.is_wildcard(), False)
self.assertIs(self.addr1.is_wildcard(), True)
self.assertIs(self.addr2.is_wildcard(), True)
def test_get_sni_addr(self):
from certbot_apache._internal.obj import Addr
@@ -103,29 +102,29 @@ class AddrTest(unittest.TestCase):
def test_conflicts(self):
# Note: Defined IP is more important than defined port in match
self.assertTrue(self.addr.conflicts(self.addr1))
self.assertTrue(self.addr.conflicts(self.addr2))
self.assertTrue(self.addr.conflicts(self.addr_defined))
self.assertFalse(self.addr.conflicts(self.addr_default))
self.assertIs(self.addr.conflicts(self.addr1), True)
self.assertIs(self.addr.conflicts(self.addr2), True)
self.assertIs(self.addr.conflicts(self.addr_defined), True)
self.assertIs(self.addr.conflicts(self.addr_default), False)
self.assertFalse(self.addr1.conflicts(self.addr))
self.assertTrue(self.addr1.conflicts(self.addr_defined))
self.assertFalse(self.addr1.conflicts(self.addr_default))
self.assertIs(self.addr1.conflicts(self.addr), False)
self.assertIs(self.addr1.conflicts(self.addr_defined), True)
self.assertIs(self.addr1.conflicts(self.addr_default), False)
self.assertFalse(self.addr_defined.conflicts(self.addr1))
self.assertFalse(self.addr_defined.conflicts(self.addr2))
self.assertFalse(self.addr_defined.conflicts(self.addr))
self.assertFalse(self.addr_defined.conflicts(self.addr_default))
self.assertIs(self.addr_defined.conflicts(self.addr1), False)
self.assertIs(self.addr_defined.conflicts(self.addr2), False)
self.assertIs(self.addr_defined.conflicts(self.addr), False)
self.assertIs(self.addr_defined.conflicts(self.addr_default), False)
self.assertTrue(self.addr_default.conflicts(self.addr))
self.assertTrue(self.addr_default.conflicts(self.addr1))
self.assertTrue(self.addr_default.conflicts(self.addr_defined))
self.assertIs(self.addr_default.conflicts(self.addr), True)
self.assertIs(self.addr_default.conflicts(self.addr1), True)
self.assertIs(self.addr_default.conflicts(self.addr_defined), True)
# Self test
self.assertTrue(self.addr.conflicts(self.addr))
self.assertTrue(self.addr1.conflicts(self.addr1))
self.assertIs(self.addr.conflicts(self.addr), True)
self.assertIs(self.addr1.conflicts(self.addr1), True)
# This is a tricky one...
self.assertTrue(self.addr1.conflicts(self.addr2))
self.assertIs(self.addr1.conflicts(self.addr2), True)
def test_equal(self):
self.assertEqual(self.addr1, self.addr2)

View File

@@ -42,7 +42,7 @@ class BasicParserTest(util.ParserTest):
self.assertEqual(self.parser.check_aug_version(),
["something"])
self.parser.aug.match.side_effect = RuntimeError
self.assertFalse(self.parser.check_aug_version())
self.assertIs(self.parser.check_aug_version(), False)
def test_find_config_root_no_root(self):
# pylint: disable=protected-access
@@ -80,8 +80,7 @@ class BasicParserTest(util.ParserTest):
aug_default = "/files" + self.parser.loc["default"]
self.parser.add_dir(aug_default, "AddDirective", "test")
self.assertTrue(
self.parser.find_dir("AddDirective", "test", aug_default))
self.assertTrue(self.parser.find_dir("AddDirective", "test", aug_default))
self.parser.add_dir(aug_default, "AddList", ["1", "2", "3", "4"])
matches = self.parser.find_dir("AddList", None, aug_default)
@@ -94,20 +93,24 @@ class BasicParserTest(util.ParserTest):
"AddDirectiveBeginning",
"testBegin")
self.assertTrue(
self.parser.find_dir("AddDirectiveBeginning", "testBegin", aug_default))
self.assertTrue(self.parser.find_dir("AddDirectiveBeginning", "testBegin", aug_default))
self.assertEqual(
self.parser.aug.get(aug_default+"/directive[1]"),
"AddDirectiveBeginning")
self.assertEqual(self.parser.aug.get(aug_default+"/directive[1]"), "AddDirectiveBeginning")
self.parser.add_dir_beginning(aug_default, "AddList", ["1", "2", "3", "4"])
matches = self.parser.find_dir("AddList", None, aug_default)
for i, match in enumerate(matches):
self.assertEqual(self.parser.aug.get(match), str(i + 1))
for name in ("empty.conf", "no-directives.conf"):
conf = "/files" + os.path.join(self.parser.root, "sites-available", name)
self.parser.add_dir_beginning(conf, "AddDirectiveBeginning", "testBegin")
self.assertGreater(
len(self.parser.find_dir("AddDirectiveBeginning", "testBegin", conf)),
0
)
def test_empty_arg(self):
self.assertEqual(None,
self.parser.get_arg("/files/whatever/nonexistent"))
self.assertIsNone(self.parser.get_arg("/files/whatever/nonexistent"))
def test_add_dir_to_ifmodssl(self):
"""test add_dir_to_ifmodssl.
@@ -126,7 +129,7 @@ class BasicParserTest(util.ParserTest):
matches = self.parser.find_dir("FakeDirective", "123")
self.assertEqual(len(matches), 1)
self.assertTrue("IfModule" in matches[0])
self.assertIn("IfModule", matches[0])
def test_add_dir_to_ifmodssl_multiple(self):
from certbot_apache._internal.parser import get_aug_path
@@ -140,7 +143,7 @@ class BasicParserTest(util.ParserTest):
matches = self.parser.find_dir("FakeDirective")
self.assertEqual(len(matches), 3)
self.assertTrue("IfModule" in matches[0])
self.assertIn("IfModule", matches[0])
def test_get_aug_path(self):
from certbot_apache._internal.parser import get_aug_path
@@ -165,7 +168,7 @@ class BasicParserTest(util.ParserTest):
with mock.patch("certbot_apache._internal.parser.logger") as mock_logger:
self.parser.parse_modules()
# Make sure that we got None return value and logged the file
self.assertTrue(mock_logger.debug.called)
self.assertIs(mock_logger.debug.called, True)
@mock.patch("certbot_apache._internal.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
@@ -183,6 +186,8 @@ class BasicParserTest(util.ParserTest):
'Define: DUMP_RUN_CFG\n'
'Define: U_MICH\n'
'Define: TLS=443\n'
'Define: WITH_ASSIGNMENT=URL=http://example.com\n'
'Define: EMPTY=\n'
'Define: example_path=Documents/path\n'
'User: name="www-data" id=33 not_used\n'
'Group: name="www-data" id=33 not_used\n'
@@ -261,7 +266,10 @@ class BasicParserTest(util.ParserTest):
mock_cfg.side_effect = mock_get_vars
expected_vars = {"TEST": "", "U_MICH": "", "TLS": "443",
"example_path": "Documents/path"}
"example_path": "Documents/path",
"WITH_ASSIGNMENT": "URL=http://example.com",
"EMPTY": "",
}
self.parser.modules = {}
with mock.patch(
@@ -296,15 +304,6 @@ class BasicParserTest(util.ParserTest):
# path derived from root configuration Include statements
self.assertEqual(mock_parse.call_count, 1)
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_update_runtime_vars_bad_output(self, mock_cfg):
mock_cfg.return_value = "Define: TLS=443=24"
self.parser.update_runtime_variables()
mock_cfg.return_value = "Define: DUMP_RUN_CFG\nDefine: TLS=443=24"
self.assertRaises(
errors.PluginError, self.parser.update_runtime_variables)
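
The expected variables above now include Define values that themselves contain '=' (WITH_ASSIGNMENT and EMPTY), and the old bad-output test is dropped, which is consistent with splitting each Define line on the first '=' only. A hypothetical helper illustrating that rule, not the actual parser code:

from typing import Tuple

def parse_define(line: str) -> Tuple[str, str]:
    """Split 'Define: NAME[=VALUE]' on the first '=' only."""
    definition = line.split(' ', 1)[1]
    name, _, value = definition.partition('=')
    return name, value

assert parse_define('Define: TLS=443') == ('TLS', '443')
assert parse_define('Define: EMPTY=') == ('EMPTY', '')
assert parse_define('Define: DUMP_RUN_CFG') == ('DUMP_RUN_CFG', '')
assert parse_define('Define: WITH_ASSIGNMENT=URL=http://example.com') == (
    'WITH_ASSIGNMENT', 'URL=http://example.com')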
@mock.patch("certbot_apache._internal.apache_util.subprocess.run")
def test_update_runtime_vars_bad_ctl(self, mock_run):
mock_run.side_effect = OSError
@@ -327,7 +326,7 @@ class BasicParserTest(util.ParserTest):
self.parser.add_comment(get_aug_path(self.parser.loc["name"]), "123456")
comm = self.parser.find_comments("123456")
self.assertEqual(len(comm), 1)
self.assertTrue(self.parser.loc["name"] in comm[0])
self.assertIn(self.parser.loc["name"], comm[0])
class ParserInitTest(util.ApacheTest):
@@ -346,8 +345,8 @@ class ParserInitTest(util.ApacheTest):
self.config.config_test = mock.Mock()
self.assertRaises(
errors.NoInstallationError, ApacheParser,
os.path.relpath(self.config_path), "/dummy/vhostpath",
version=(2, 4, 22), configurator=self.config)
os.path.relpath(self.config_path), self.config,
"/dummy/vhostpath", version=(2, 4, 22))
def test_init_old_aug(self):
from certbot_apache._internal.parser import ApacheParser
@@ -355,8 +354,8 @@ class ParserInitTest(util.ApacheTest):
mock_c.return_value = False
self.assertRaises(
errors.NotSupportedError,
ApacheParser, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 4, 22), configurator=self.config)
ApacheParser, os.path.relpath(self.config_path), self.config,
"/dummy/vhostpath", version=(2, 4, 22))
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_unparseable(self, mock_cfg):
@@ -364,8 +363,8 @@ class ParserInitTest(util.ApacheTest):
mock_cfg.return_value = ('Define: TEST')
self.assertRaises(
errors.PluginError,
ApacheParser, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 2, 22), configurator=self.config)
ApacheParser, os.path.relpath(self.config_path), self.config,
"/dummy/vhostpath", version=(2, 2, 22))
def test_root_normalized(self):
from certbot_apache._internal.parser import ApacheParser
@@ -376,7 +375,7 @@ class ParserInitTest(util.ApacheTest):
self.temp_dir,
"debian_apache_2_4/////multiple_vhosts/../multiple_vhosts/apache2")
parser = ApacheParser(path, "/dummy/vhostpath", configurator=self.config)
parser = ApacheParser(path, self.config, "/dummy/vhostpath")
self.assertEqual(parser.root, self.config_path)
@@ -385,8 +384,7 @@ class ParserInitTest(util.ApacheTest):
with mock.patch("certbot_apache._internal.parser.ApacheParser."
"update_runtime_variables"):
parser = ApacheParser(
os.path.relpath(self.config_path),
"/dummy/vhostpath", configurator=self.config)
os.path.relpath(self.config_path), self.config, "/dummy/vhostpath")
self.assertEqual(parser.root, self.config_path)
@@ -395,8 +393,7 @@ class ParserInitTest(util.ApacheTest):
with mock.patch("certbot_apache._internal.parser.ApacheParser."
"update_runtime_variables"):
parser = ApacheParser(
self.config_path + os.path.sep,
"/dummy/vhostpath", configurator=self.config)
self.config_path + os.path.sep, self.config, "/dummy/vhostpath")
self.assertEqual(parser.root, self.config_path)
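
The ParserInitTest changes above all reflect the same signature change: the configurator is now the second positional argument of ApacheParser instead of a trailing keyword, with the vhost path following it. A hedged sketch of the new call order, with illustrative paths and version:

def build_parser(config_path: str, configurator, vhost_path: str = "/dummy/vhostpath"):
    """Construct an ApacheParser with the reordered arguments used in these tests."""
    from certbot_apache._internal.parser import ApacheParser
    # config path first, then the configurator, then the vhost path
    return ApacheParser(config_path, configurator, vhost_path, version=(2, 4, 22))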

View File


@@ -32,7 +32,7 @@ class ConfiguratorParserNodeTest(util.ApacheTest): # pylint: disable=too-many-p
self.config.USE_PARSERNODE = True
vhosts = self.config.get_virtual_hosts()
# Legacy get_virtual_hosts() do not set the node
self.assertTrue(vhosts[0].node is not None)
self.assertIsNotNone(vhosts[0].node)
def test_parsernode_get_vhosts_mismatch(self):
vhosts = self.config.get_virtual_hosts_v2()

View File


@@ -0,0 +1,5 @@
<VirtualHost *:80>
<Location />
Require all denied
</Location>
</VirtualHost>

View File


@@ -1,18 +1,16 @@
"""Common utilities for certbot_apache."""
import shutil
import sys
import unittest
import augeas
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import zope.component
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot.compat import os
from certbot.display import util as display_util
from certbot.plugins import common
from certbot.tests import util as test_util
from certbot_apache._internal import configurator
@@ -69,16 +67,13 @@ class ParserTest(ApacheTest):
vhost_root="debian_apache_2_4/multiple_vhosts/apache2/sites-available"):
super().setUp(test_dir, config_root, vhost_root)
zope.component.provideUtility(display_util.FileDisplay(sys.stdout,
False))
from certbot_apache._internal.parser import ApacheParser
self.aug = augeas.Augeas(
flags=augeas.Augeas.NONE | augeas.Augeas.NO_MODL_AUTOLOAD)
with mock.patch("certbot_apache._internal.parser.ApacheParser."
"update_runtime_variables"):
self.parser = ApacheParser(
self.config_path, self.vhost_path, configurator=self.config)
self.config_path, self.config, self.vhost_path)
def get_apache_configurator(

File diff suppressed because it is too large

View File


@@ -1,9 +1,11 @@
#!/usr/bin/env python
"""A Certbot hook for probing."""
import os
import sys
hook_script_type = os.path.basename(os.path.dirname(sys.argv[1]))
if hook_script_type == 'deploy' and ('RENEWED_DOMAINS' not in os.environ or 'RENEWED_LINEAGE' not in os.environ):
if hook_script_type == 'deploy' and ('RENEWED_DOMAINS' not in os.environ
or 'RENEWED_LINEAGE' not in os.environ):
sys.stderr.write('Environment variables not properly set!\n')
sys.exit(1)
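
The probing hook above relies on certbot exporting RENEWED_DOMAINS and RENEWED_LINEAGE to deploy hooks. A minimal deploy-hook sketch using those variables; the paths in the comments are illustrative:

#!/usr/bin/env python
"""A minimal certbot deploy hook relying on the renewal environment variables."""
import os
import sys

lineage = os.environ.get('RENEWED_LINEAGE')   # e.g. /etc/letsencrypt/live/example.com
domains = os.environ.get('RENEWED_DOMAINS')   # space-separated list of renewed names
if lineage is None or domains is None:
    sys.stderr.write('Expected to be invoked by certbot as a deploy hook\n')
    sys.exit(1)
sys.stdout.write('Renewed {0} in {1}\n'.format(domains, lineage))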

View File


@@ -1,9 +1,10 @@
"""This module contains advanced assertions for the certbot integration tests."""
import io
import os
from typing import Type
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurvePrivateKey, EllipticCurve
from cryptography.hazmat.primitives.asymmetric.rsa import RSAPrivateKey
from cryptography.hazmat.primitives.serialization import load_pem_private_key
@@ -11,7 +12,6 @@ try:
import grp
POSIX_MODE = True
except ImportError:
import win32api
import win32security
import ntsecuritycon
POSIX_MODE = False
@@ -21,11 +21,11 @@ SYSTEM_SID = 'S-1-5-18'
ADMINS_SID = 'S-1-5-32-544'
def assert_elliptic_key(key, curve):
def assert_elliptic_key(key: str, curve: Type[EllipticCurve]) -> None:
"""
Asserts that the key at the given path is an EC key using the given curve.
:param key: path to key
:param curve: name of the expected elliptic curve
:param EllipticCurve curve: name of the expected elliptic curve
"""
with open(key, 'rb') as file:
privkey1 = file.read()
@@ -36,10 +36,10 @@ def assert_elliptic_key(key, curve):
assert isinstance(key.curve, curve)
def assert_rsa_key(key):
def assert_rsa_key(key: str) -> None:
"""
Asserts that the key at the given path is an RSA key.
:param key: path to key
:param str key: path to key
"""
with open(key, 'rb') as file:
privkey1 = file.read()
@@ -48,11 +48,11 @@ def assert_rsa_key(key):
assert isinstance(key, RSAPrivateKey)
def assert_hook_execution(probe_path, probe_content):
def assert_hook_execution(probe_path: str, probe_content: str) -> None:
"""
Assert that a certbot hook has been executed
:param probe_path: path to the file that received the hook output
:param probe_content: content expected when the hook is executed
:param str probe_path: path to the file that received the hook output
:param str probe_content: content expected when the hook is executed
"""
encoding = 'utf-8' if POSIX_MODE else 'utf-16'
with io.open(probe_path, 'rt', encoding=encoding) as file:
@@ -62,22 +62,22 @@ def assert_hook_execution(probe_path, probe_content):
assert probe_content in lines
def assert_saved_renew_hook(config_dir, lineage):
def assert_saved_renew_hook(config_dir: str, lineage: str) -> None:
"""
Assert that the renew hook configuration of a lineage has been saved.
:param config_dir: location of the certbot configuration
:param lineage: lineage domain name
:param str config_dir: location of the certbot configuration
:param str lineage: lineage domain name
"""
with open(os.path.join(config_dir, 'renewal', '{0}.conf'.format(lineage))) as file_h:
assert 'renew_hook' in file_h.read()
def assert_cert_count_for_lineage(config_dir, lineage, count):
def assert_cert_count_for_lineage(config_dir: str, lineage: str, count: int) -> None:
"""
Assert the number of certificates generated for a lineage.
:param config_dir: location of the certbot configuration
:param lineage: lineage domain name
:param count: number of expected certificates
:param str config_dir: location of the certbot configuration
:param str lineage: lineage domain name
:param int count: number of expected certificates
"""
archive_dir = os.path.join(config_dir, 'archive')
lineage_dir = os.path.join(archive_dir, lineage)
@@ -85,11 +85,11 @@ def assert_cert_count_for_lineage(config_dir, lineage, count):
assert len(certs) == count
def assert_equals_group_permissions(file1, file2):
def assert_equals_group_permissions(file1: str, file2: str) -> None:
"""
Assert that two files have the same permissions for group owner.
:param file1: first file path to compare
:param file2: second file path to compare
:param str file1: first file path to compare
:param str file2: second file path to compare
"""
# On Windows there is no group, so this assertion does nothing on this platform
if POSIX_MODE:
@@ -99,11 +99,11 @@ def assert_equals_group_permissions(file1, file2):
assert mode_file1 == mode_file2
def assert_equals_world_read_permissions(file1, file2):
def assert_equals_world_read_permissions(file1: str, file2: str) -> None:
"""
Assert that two files have the same read permissions for everyone.
:param file1: first file path to compare
:param file2: second file path to compare
:param str file1: first file path to compare
:param str file2: second file path to compare
"""
if POSIX_MODE:
mode_file1 = os.stat(file1).st_mode & 0o004
@@ -134,11 +134,11 @@ def assert_equals_world_read_permissions(file1, file2):
assert mode_file1 == mode_file2
def assert_equals_group_owner(file1, file2):
def assert_equals_group_owner(file1: str, file2: str) -> None:
"""
Assert that two files have the same group owner.
:param file1: first file path to compare
:param file2: second file path to compare
:param str file1: first file path to compare
:param str file2: second file path to compare
"""
# On Windows there is no group, so this assertion does nothing on this platform
if POSIX_MODE:
@@ -148,10 +148,10 @@ def assert_equals_group_owner(file1, file2):
assert group_owner_file1 == group_owner_file2
def assert_world_no_permissions(file):
def assert_world_no_permissions(file: str) -> None:
"""
Assert that the given file is not world-readable.
:param file: path of the file to check
:param str file: path of the file to check
"""
if POSIX_MODE:
mode_file_all = os.stat(file).st_mode & 0o007
@@ -168,10 +168,10 @@ def assert_world_no_permissions(file):
assert not mode
def assert_world_read_permissions(file):
def assert_world_read_permissions(file: str) -> None:
"""
Assert that the given file is world-readable, but not world-writable or world-executable.
:param file: path of the file to check
:param str file: path of the file to check
"""
if POSIX_MODE:
mode_file_all = os.stat(file).st_mode & 0o007
@@ -188,8 +188,3 @@ def assert_world_read_permissions(file):
assert not mode & ntsecuritycon.FILE_GENERIC_WRITE
assert not mode & ntsecuritycon.FILE_GENERIC_EXECUTE
assert mode & ntsecuritycon.FILE_GENERIC_READ == ntsecuritycon.FILE_GENERIC_READ
def _get_current_user():
account_name = win32api.GetUserNameEx(win32api.NameSamCompatible)
return win32security.LookupAccountName(None, account_name)[0]
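
With the helpers above now typed, a usage sketch looks like the following; the config directory and lineage name are illustrative assumptions, not taken from the diff:

from cryptography.hazmat.primitives.asymmetric.ec import SECP384R1

from certbot_integration_tests.certbot_tests.assertions import assert_cert_count_for_lineage
from certbot_integration_tests.certbot_tests.assertions import assert_elliptic_key

config_dir = '/tmp/certbot-it/config'                       # illustrative path
assert_cert_count_for_lineage(config_dir, 'example.com', 1)
assert_elliptic_key(config_dir + '/archive/example.com/privkey1.pem', SECP384R1)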

View File


@@ -3,21 +3,25 @@ import os
import shutil
import sys
import tempfile
from typing import Iterable
from typing import Tuple
import pytest
from certbot_integration_tests.utils import certbot_call
class IntegrationTestsContext:
"""General fixture describing a certbot integration tests context"""
def __init__(self, request):
def __init__(self, request: pytest.FixtureRequest) -> None:
self.request = request
if hasattr(request.config, 'workerinput'): # Worker node
self.worker_id = request.config.workerinput['workerid']
acme_xdist = request.config.workerinput['acme_xdist']
self.worker_id = request.config.workerinput['workerid'] # type: ignore[attr-defined]
acme_xdist = request.config.workerinput['acme_xdist'] # type: ignore[attr-defined]
else: # Primary node
self.worker_id = 'primary'
acme_xdist = request.config.acme_xdist
acme_xdist = request.config.acme_xdist # type: ignore[attr-defined]
self.acme_server = acme_xdist['acme_server']
self.directory_url = acme_xdist['directory_url']
@@ -52,16 +56,17 @@ class IntegrationTestsContext:
'"'
).format(sys.executable, self.challtestsrv_port)
def cleanup(self):
def cleanup(self) -> None:
"""Cleanup the integration test context."""
shutil.rmtree(self.workspace)
def certbot(self, args, force_renew=True):
def certbot(self, args: Iterable[str], force_renew: bool = True) -> Tuple[str, str]:
"""
Execute certbot with given args, not renewing certificates by default.
:param args: args to pass to certbot
:param force_renew: set to False to not renew by default
:param bool force_renew: set to False to not renew by default
:return: stdout and stderr from certbot execution
:rtype: Tuple of `str`
"""
command = ['--authenticator', 'standalone', '--installer', 'null']
command.extend(args)
@@ -69,14 +74,15 @@ class IntegrationTestsContext:
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
self.config_dir, self.workspace, force_renew=force_renew)
def get_domain(self, subdomain='le'):
def get_domain(self, subdomain: str = 'le') -> str:
"""
Generate a certificate domain name suitable for distributed certbot integration tests.
This is a requirement to let the distribution know how to redirect the challenge check
from the ACME server to the relevant pytest-xdist worker. This resolution is done by
appending the pytest worker id to the subdomain, using this pattern:
{subdomain}.{worker_id}.wtf
:param subdomain: the subdomain to use in the generated domain (default 'le')
:param str subdomain: the subdomain to use in the generated domain (default 'le')
:return: the well-formed domain suitable for redirection on
:rtype: str
"""
return '{0}.{1}.wtf'.format(subdomain, self.worker_id)
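
A short usage sketch of the typed context: get_domain() namespaces the requested subdomain per pytest-xdist worker and certbot() returns the captured stdout and stderr. The test body below is illustrative only:

from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext

def test_example(context: IntegrationTestsContext) -> None:
    domain = context.get_domain('demo')                    # e.g. demo.primary.wtf
    stdout, stderr = context.certbot(['certonly', '-d', domain])
    assert isinstance(stdout, str) and isinstance(stderr, str)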

View File


@@ -1,5 +1,4 @@
"""Module executing integration tests against certbot core."""
import os
from os.path import exists
from os.path import join
@@ -7,14 +6,18 @@ import re
import shutil
import subprocess
import time
from typing import Iterable
from typing import Generator
from typing import Type
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurve
from cryptography.hazmat.primitives.asymmetric.ec import SECP256R1
from cryptography.hazmat.primitives.asymmetric.ec import SECP384R1
from cryptography.hazmat.primitives.asymmetric.ec import SECP521R1
from cryptography.x509 import NameOID
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext
from certbot_integration_tests.certbot_tests.assertions import assert_cert_count_for_lineage
from certbot_integration_tests.certbot_tests.assertions import assert_elliptic_key
from certbot_integration_tests.certbot_tests.assertions import assert_equals_group_owner
@@ -30,17 +33,17 @@ from certbot_integration_tests.utils import misc
@pytest.fixture(name='context')
def test_context(request):
# pylint: disable=missing-function-docstring
def test_context(request: pytest.FixtureRequest) -> Generator[IntegrationTestsContext, None, None]:
"""Fixture providing the integration test context."""
# Fixture request is a built-in pytest fixture describing current test request.
integration_test_context = certbot_context.IntegrationTestsContext(request)
integration_test_context = IntegrationTestsContext(request)
try:
yield integration_test_context
finally:
integration_test_context.cleanup()
def test_basic_commands(context):
def test_basic_commands(context: IntegrationTestsContext) -> None:
"""Test simple commands on Certbot CLI."""
# TMPDIR env variable is set to workspace for the certbot subprocess.
# So tempdir module will create any temporary files/dirs in workspace,
@@ -58,7 +61,7 @@ def test_basic_commands(context):
assert initial_count_tmpfiles == new_count_tmpfiles
def test_hook_dirs_creation(context):
def test_hook_dirs_creation(context: IntegrationTestsContext) -> None:
"""Test all hooks directory are created during Certbot startup."""
context.certbot(['register'])
@@ -66,7 +69,7 @@ def test_hook_dirs_creation(context):
assert os.path.isdir(hook_dir)
def test_registration_override(context):
def test_registration_override(context: IntegrationTestsContext) -> None:
"""Test correct register/unregister, and registration override."""
context.certbot(['register'])
context.certbot(['unregister'])
@@ -76,14 +79,14 @@ def test_registration_override(context):
context.certbot(['update_account', '--email', 'ex1@domain.org,ex2@domain.org'])
def test_prepare_plugins(context):
def test_prepare_plugins(context: IntegrationTestsContext) -> None:
"""Test that plugins are correctly instantiated and displayed."""
stdout, _ = context.certbot(['plugins', '--init', '--prepare'])
assert 'webroot' in stdout
def test_http_01(context):
def test_http_01(context: IntegrationTestsContext) -> None:
"""Test the HTTP-01 challenge using standalone plugin."""
# We start a server listening on the port for the
# TLS-SNI challenge to prevent regressions in #3601.
@@ -101,7 +104,7 @@ def test_http_01(context):
assert_saved_renew_hook(context.config_dir, certname)
def test_manual_http_auth(context):
def test_manual_http_auth(context: IntegrationTestsContext) -> None:
"""Test the HTTP-01 challenge using manual plugin."""
with misc.create_http_server(context.http_01_port) as webroot,\
misc.manual_http_hooks(webroot, context.http_01_port) as scripts:
@@ -122,7 +125,7 @@ def test_manual_http_auth(context):
assert_saved_renew_hook(context.config_dir, certname)
def test_manual_dns_auth(context):
def test_manual_dns_auth(context: IntegrationTestsContext) -> None:
"""Test the DNS-01 challenge using manual plugin."""
certname = context.get_domain('dns')
context.certbot([
@@ -144,14 +147,14 @@ def test_manual_dns_auth(context):
assert_cert_count_for_lineage(context.config_dir, certname, 2)
def test_certonly(context):
def test_certonly(context: IntegrationTestsContext) -> None:
"""Test the certonly verb on certbot."""
context.certbot(['certonly', '--cert-name', 'newname', '-d', context.get_domain('newname')])
assert_cert_count_for_lineage(context.config_dir, 'newname', 1)
def test_certonly_webroot(context):
def test_certonly_webroot(context: IntegrationTestsContext) -> None:
"""Test the certonly verb with webroot plugin"""
with misc.create_http_server(context.http_01_port) as webroot:
certname = context.get_domain('webroot')
@@ -160,7 +163,7 @@ def test_certonly_webroot(context):
assert_cert_count_for_lineage(context.config_dir, certname, 1)
def test_auth_and_install_with_csr(context):
def test_auth_and_install_with_csr(context: IntegrationTestsContext) -> None:
"""Test certificate issuance and install using an existing CSR."""
certname = context.get_domain('le3')
key_path = join(context.workspace, 'key.pem')
@@ -187,7 +190,7 @@ def test_auth_and_install_with_csr(context):
])
def test_renew_files_permissions(context):
def test_renew_files_permissions(context: IntegrationTestsContext) -> None:
"""Test proper certificate file permissions upon renewal"""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -207,7 +210,7 @@ def test_renew_files_permissions(context):
assert_equals_group_permissions(privkey1, privkey2)
def test_renew_with_hook_scripts(context):
def test_renew_with_hook_scripts(context: IntegrationTestsContext) -> None:
"""Test certificate renewal with script hooks."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -221,7 +224,7 @@ def test_renew_with_hook_scripts(context):
assert_hook_execution(context.hook_probe, 'deploy')
def test_renew_files_propagate_permissions(context):
def test_renew_files_propagate_permissions(context: IntegrationTestsContext) -> None:
"""Test proper certificate renewal with custom permissions propagated on private key."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -263,7 +266,7 @@ def test_renew_files_propagate_permissions(context):
assert_world_no_permissions(privkey2)
def test_graceful_renew_it_is_not_time(context):
def test_graceful_renew_it_is_not_time(context: IntegrationTestsContext) -> None:
"""Test graceful renew is not done when it is not due time."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -278,7 +281,7 @@ def test_graceful_renew_it_is_not_time(context):
assert_hook_execution(context.hook_probe, 'deploy')
def test_graceful_renew_it_is_time(context):
def test_graceful_renew_it_is_time(context: IntegrationTestsContext) -> None:
"""Test graceful renew is done when it is due time."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -298,7 +301,7 @@ def test_graceful_renew_it_is_time(context):
assert_hook_execution(context.hook_probe, 'deploy')
def test_renew_with_changed_private_key_complexity(context):
def test_renew_with_changed_private_key_complexity(context: IntegrationTestsContext) -> None:
"""Test proper renew with updated private key complexity."""
certname = context.get_domain('renew')
context.certbot(['-d', certname, '--rsa-key-size', '4096'])
@@ -320,7 +323,7 @@ def test_renew_with_changed_private_key_complexity(context):
assert os.stat(key3).st_size < 1800 # 2048 bits keys takes less than 1800 bytes
def test_renew_ignoring_directory_hooks(context):
def test_renew_ignoring_directory_hooks(context: IntegrationTestsContext) -> None:
"""Test hooks are ignored during renewal with relevant CLI flag."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -335,7 +338,7 @@ def test_renew_ignoring_directory_hooks(context):
assert_hook_execution(context.hook_probe, 'deploy')
def test_renew_empty_hook_scripts(context):
def test_renew_empty_hook_scripts(context: IntegrationTestsContext) -> None:
"""Test proper renew with empty hook scripts."""
certname = context.get_domain('renew')
context.certbot(['-d', certname])
@@ -346,13 +349,14 @@ def test_renew_empty_hook_scripts(context):
for hook_dir in misc.list_renewal_hooks_dirs(context.config_dir):
shutil.rmtree(hook_dir)
os.makedirs(join(hook_dir, 'dir'))
open(join(hook_dir, 'file'), 'w').close()
with open(join(hook_dir, 'file'), 'w'):
pass
context.certbot(['renew'])
assert_cert_count_for_lineage(context.config_dir, certname, 2)
def test_renew_hook_override(context):
def test_renew_hook_override(context: IntegrationTestsContext) -> None:
"""Test correct hook override on renew."""
certname = context.get_domain('override')
context.certbot([
@@ -368,7 +372,8 @@ def test_renew_hook_override(context):
assert_hook_execution(context.hook_probe, 'deploy')
# Now we override all previous hooks during next renew.
open(context.hook_probe, 'w').close()
with open(context.hook_probe, 'w'):
pass
context.certbot([
'renew', '--cert-name', certname,
'--pre-hook', misc.echo('pre_override', context.hook_probe),
@@ -387,7 +392,8 @@ def test_renew_hook_override(context):
assert_hook_execution(context.hook_probe, 'deploy')
# Expect that this renew will reuse new hooks registered in the previous renew.
open(context.hook_probe, 'w').close()
with open(context.hook_probe, 'w'):
pass
context.certbot(['renew', '--cert-name', certname])
assert_hook_execution(context.hook_probe, 'pre_override')
@@ -395,7 +401,7 @@ def test_renew_hook_override(context):
assert_hook_execution(context.hook_probe, 'deploy_override')
def test_invalid_domain_with_dns_challenge(context):
def test_invalid_domain_with_dns_challenge(context: IntegrationTestsContext) -> None:
"""Test certificate issuance failure with DNS-01 challenge."""
# Manual dns auth hooks from misc are designed to fail if the domain contains 'fail-*'.
domains = ','.join([context.get_domain('dns1'), context.get_domain('fail-dns1')])
@@ -412,7 +418,7 @@ def test_invalid_domain_with_dns_challenge(context):
assert context.get_domain('fail-dns1') not in stdout
def test_reuse_key(context):
def test_reuse_key(context: IntegrationTestsContext) -> None:
"""Test various scenarios where a key is reused."""
certname = context.get_domain('reusekey')
context.certbot(['--domains', certname, '--reuse-key'])
@@ -430,6 +436,21 @@ def test_reuse_key(context):
privkey3 = file.read()
assert privkey2 != privkey3
context.certbot(['--cert-name', certname, '--domains', certname,
'--reuse-key','--force-renewal'])
context.certbot(['renew', '--cert-name', certname, '--no-reuse-key', '--force-renewal'])
context.certbot(['renew', '--cert-name', certname, '--force-renewal'])
with open(join(context.config_dir, 'archive/{0}/privkey4.pem').format(certname), 'r') as file:
privkey4 = file.read()
with open(join(context.config_dir, 'archive/{0}/privkey5.pem').format(certname), 'r') as file:
privkey5 = file.read()
with open(join(context.config_dir, 'archive/{0}/privkey6.pem').format(certname), 'r') as file:
privkey6 = file.read()
assert privkey3 == privkey4
assert privkey4 != privkey5
assert privkey5 != privkey6
with open(join(context.config_dir, 'archive/{0}/cert1.pem').format(certname), 'r') as file:
cert1 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert2.pem').format(certname), 'r') as file:
@@ -440,12 +461,12 @@ def test_reuse_key(context):
assert len({cert1, cert2, cert3}) == 3
def test_incorrect_key_type(context):
def test_incorrect_key_type(context: IntegrationTestsContext) -> None:
with pytest.raises(subprocess.CalledProcessError):
context.certbot(['--key-type="failwhale"'])
def test_ecdsa(context):
def test_ecdsa(context: IntegrationTestsContext) -> None:
"""Test issuance for ECDSA CSR based request (legacy supported mode)."""
key_path = join(context.workspace, 'privkey-p384.pem')
csr_path = join(context.workspace, 'csr-p384.der')
@@ -466,7 +487,7 @@ def test_ecdsa(context):
assert 'ASN1 OID: secp384r1' in certificate
def test_default_key_type(context):
def test_default_key_type(context: IntegrationTestsContext) -> None:
"""Test default key type is RSA"""
certname = context.get_domain('renew')
context.certbot([
@@ -477,7 +498,7 @@ def test_default_key_type(context):
assert_rsa_key(filename)
def test_default_curve_type(context):
def test_default_curve_type(context: IntegrationTestsContext) -> None:
"""test that the curve used when not specifying any is secp256r1"""
certname = context.get_domain('renew')
context.certbot([
@@ -491,9 +512,10 @@ def test_default_curve_type(context):
# Curve name, Curve class, ACME servers to skip
('secp256r1', SECP256R1, []),
('secp384r1', SECP384R1, []),
('secp521r1', SECP521R1, ['boulder-v1', 'boulder-v2'])]
('secp521r1', SECP521R1, ['boulder-v2'])]
)
def test_ecdsa_curves(context, curve, curve_cls, skip_servers):
def test_ecdsa_curves(context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
skip_servers: Iterable[str]) -> None:
"""Test issuance for each supported ECDSA curve"""
if context.acme_server in skip_servers:
pytest.skip('ACME server {} does not support ECDSA curve {}'
@@ -509,7 +531,7 @@ def test_ecdsa_curves(context, curve, curve_cls, skip_servers):
assert_elliptic_key(key, curve_cls)
def test_renew_with_ec_keys(context):
def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:
"""Test proper renew with updated private key complexity."""
certname = context.get_domain('renew')
context.certbot([
@@ -549,7 +571,7 @@ def test_renew_with_ec_keys(context):
assert_rsa_key(key3)
def test_ocsp_must_staple(context):
def test_ocsp_must_staple(context: IntegrationTestsContext) -> None:
"""Test that OCSP Must-Staple is correctly set in the generated certificate."""
if context.acme_server == 'pebble':
pytest.skip('Pebble does not support OCSP Must-Staple.')
@@ -562,7 +584,7 @@ def test_ocsp_must_staple(context):
assert 'status_request' in certificate or '1.3.6.1.5.5.7.1.24' in certificate
def test_revoke_simple(context):
def test_revoke_simple(context: IntegrationTestsContext) -> None:
"""Test various scenarios that revokes a certificate."""
# Default action after revoke is to delete the certificate.
certname = context.get_domain()
@@ -593,7 +615,7 @@ def test_revoke_simple(context):
context.certbot(['revoke', '--cert-path', cert_path, '--key-path', key_path])
def test_revoke_and_unregister(context):
def test_revoke_and_unregister(context: IntegrationTestsContext) -> None:
"""Test revoke with a reason then unregister."""
cert1 = context.get_domain('le1')
cert2 = context.get_domain('le2')
@@ -621,7 +643,7 @@ def test_revoke_and_unregister(context):
assert cert3 in stdout
def test_revoke_mutual_exclusive_flags(context):
def test_revoke_mutual_exclusive_flags(context: IntegrationTestsContext) -> None:
"""Test --cert-path and --cert-name cannot be used during revoke."""
cert = context.get_domain('le1')
context.certbot(['-d', cert])
@@ -633,7 +655,7 @@ def test_revoke_mutual_exclusive_flags(context):
assert 'Exactly one of --cert-path or --cert-name must be specified' in error.value.stderr
def test_revoke_multiple_lineages(context):
def test_revoke_multiple_lineages(context: IntegrationTestsContext) -> None:
"""Test revoke does not delete certs if multiple lineages share the same dir."""
cert1 = context.get_domain('le1')
context.certbot(['-d', cert1])
@@ -665,11 +687,8 @@ def test_revoke_multiple_lineages(context):
assert 'Not deleting revoked certificates due to overlapping archive dirs' in f.read()
def test_wildcard_certificates(context):
def test_wildcard_certificates(context: IntegrationTestsContext) -> None:
"""Test wildcard certificate issuance."""
if context.acme_server == 'boulder-v1':
pytest.skip('Wildcard certificates are not supported on ACME v1')
certname = context.get_domain('wild')
context.certbot([
@@ -682,7 +701,7 @@ def test_wildcard_certificates(context):
assert exists(join(context.config_dir, 'live', certname, 'fullchain.pem'))
def test_ocsp_status_stale(context):
def test_ocsp_status_stale(context: IntegrationTestsContext) -> None:
"""Test retrieval of OCSP statuses for staled config"""
sample_data_path = misc.load_sample_data_path(context.workspace)
stdout, _ = context.certbot(['certificates', '--config-dir', sample_data_path])
@@ -693,7 +712,7 @@ def test_ocsp_status_stale(context):
.format(stdout.count('EXPIRED')))
def test_ocsp_status_live(context):
def test_ocsp_status_live(context: IntegrationTestsContext) -> None:
"""Test retrieval of OCSP statuses for live config"""
cert = context.get_domain('ocsp-check')
@@ -715,7 +734,7 @@ def test_ocsp_status_live(context):
assert stdout.count('REVOKED') == 1, 'Expected {0} to be REVOKED'.format(cert)
def test_ocsp_renew(context):
def test_ocsp_renew(context: IntegrationTestsContext) -> None:
"""Test that revoked certificates are renewed."""
# Obtain a certificate
certname = context.get_domain('ocsp-renew')
@@ -732,7 +751,7 @@ def test_ocsp_renew(context):
assert_cert_count_for_lineage(context.config_dir, certname, 2)
def test_dry_run_deactivate_authzs(context):
def test_dry_run_deactivate_authzs(context: IntegrationTestsContext) -> None:
"""Test that Certbot deactivates authorizations when performing a dry run"""
name = context.get_domain('dry-run-authz-deactivation')
@@ -750,7 +769,7 @@ def test_dry_run_deactivate_authzs(context):
assert log_line in f.read(), 'Second order should have been recreated due to authz reuse'
def test_preferred_chain(context):
def test_preferred_chain(context: IntegrationTestsContext) -> None:
"""Test that --preferred-chain results in the correct chain.pem being produced"""
try:
issuers = misc.get_acme_issuers(context)

View File


@@ -1,3 +1,4 @@
# type: ignore
"""
General conftest for pytest execution of all integration tests lying
in the certbot_integration tests package.
@@ -20,9 +21,9 @@ def pytest_addoption(parser):
:param parser: current pytest parser that will be used on the CLI
"""
parser.addoption('--acme-server', default='pebble',
choices=['boulder-v1', 'boulder-v2', 'pebble'],
help='select the ACME server to use (boulder-v1, boulder-v2, '
'pebble), defaulting to pebble')
choices=['boulder-v2', 'pebble'],
help='select the ACME server to use (boulder-v2, pebble), '
'defaulting to pebble')
parser.addoption('--dns-server', default='challtestsrv',
choices=['bind', 'challtestsrv'],
help='select the DNS server to use (bind, challtestsrv), '

View File


@@ -1,6 +1,10 @@
"""Module to handle the context of nginx integration tests."""
import os
import subprocess
from typing import Iterable
from typing import Tuple
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.nginx_tests import nginx_config as config
@@ -10,7 +14,7 @@ from certbot_integration_tests.utils import misc
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
"""General fixture describing a certbot-nginx integration tests context"""
def __init__(self, request):
def __init__(self, request: pytest.FixtureRequest) -> None:
super().__init__(request)
self.nginx_root = os.path.join(self.workspace, 'nginx')
@@ -22,16 +26,16 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
file_handler.write('Hello World!')
self.nginx_config_path = os.path.join(self.nginx_root, 'nginx.conf')
self.nginx_config = None
self.nginx_config: str
default_server = request.param['default_server']
default_server = request.param['default_server'] # type: ignore[attr-defined]
self.process = self._start_nginx(default_server)
def cleanup(self):
def cleanup(self) -> None:
self._stop_nginx()
super().cleanup()
def certbot_test_nginx(self, args):
def certbot_test_nginx(self, args: Iterable[str]) -> Tuple[str, str]:
"""
Main command to execute certbot using the nginx plugin.
:param list args: list of arguments to pass to nginx
@@ -44,7 +48,7 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
command, self.directory_url, self.http_01_port, self.tls_alpn_01_port,
self.config_dir, self.workspace, force_renew=True)
def _start_nginx(self, default_server):
def _start_nginx(self, default_server: bool) -> subprocess.Popen:
self.nginx_config = config.construct_nginx_config(
self.nginx_root, self.webroot, self.http_01_port, self.tls_alpn_01_port,
self.other_port, default_server, wtf_prefix=self.worker_id)
@@ -58,7 +62,7 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
misc.check_until_timeout('http://localhost:{0}'.format(self.http_01_port))
return process
def _stop_nginx(self):
def _stop_nginx(self) -> None:
assert self.process.poll() is None
self.process.terminate()
self.process.wait()

View File


@@ -1,12 +1,14 @@
# -*- coding: utf-8 -*-
"""General purpose nginx test configuration generator."""
import getpass
from typing import Optional
import pkg_resources
def construct_nginx_config(nginx_root, nginx_webroot, http_port, https_port, other_port,
default_server, key_path=None, cert_path=None, wtf_prefix='le'):
def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int, https_port: int,
other_port: int, default_server: bool, key_path: Optional[str] = None,
cert_path: Optional[str] = None, wtf_prefix: str = 'le') -> str:
"""
This method returns a full nginx configuration suitable for integration tests.
:param str nginx_root: nginx root configuration path

View File


@@ -1,17 +1,18 @@
"""Module executing integration tests against certbot with nginx plugin."""
import os
import ssl
from typing import Generator
from typing import List
import pytest
from certbot_integration_tests.nginx_tests import context as nginx_context
from certbot_integration_tests.nginx_tests.context import IntegrationTestsContext
@pytest.fixture(name='context')
def test_context(request):
def test_context(request: pytest.FixtureRequest) -> Generator[IntegrationTestsContext, None, None]:
# Fixture request is a built-in pytest fixture describing current test request.
integration_test_context = nginx_context.IntegrationTestsContext(request)
integration_test_context = IntegrationTestsContext(request)
try:
yield integration_test_context
finally:
@@ -33,7 +34,7 @@ def test_context(request):
], {'default_server': False}),
], indirect=['context'])
def test_certificate_deployment(certname_pattern: str, params: List[str],
context: nginx_context.IntegrationTestsContext) -> None:
context: IntegrationTestsContext) -> None:
"""
Test various scenarios to deploy a certificate to nginx using certbot.
"""

View File


@@ -2,9 +2,12 @@
from contextlib import contextmanager
import tempfile
from typing import Generator
from typing import Iterable
from typing import Tuple
from pkg_resources import resource_filename
from pytest import skip
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.utils import certbot_call
@@ -12,17 +15,17 @@ from certbot_integration_tests.utils import certbot_call
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
"""Integration test context for certbot-dns-rfc2136"""
def __init__(self, request):
def __init__(self, request: pytest.FixtureRequest) -> None:
super().__init__(request)
self.request = request
if hasattr(request.config, 'workerinput'): # Worker node
self._dns_xdist = request.config.workerinput['dns_xdist']
self._dns_xdist = request.config.workerinput['dns_xdist'] # type: ignore[attr-defined]
else: # Primary node
self._dns_xdist = request.config.dns_xdist
self._dns_xdist = request.config.dns_xdist # type: ignore[attr-defined]
def certbot_test_rfc2136(self, args):
def certbot_test_rfc2136(self, args: Iterable[str]) -> Tuple[str, str]:
"""
Main command to execute certbot using the RFC2136 DNS authenticator.
:param list args: list of arguments to pass to Certbot
@@ -34,7 +37,7 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
self.config_dir, self.workspace, force_renew=True)
@contextmanager
def rfc2136_credentials(self, label='default'):
def rfc2136_credentials(self, label: str = 'default') -> Generator[str, None, None]:
"""
Produces the contents of a certbot-dns-rfc2136 credentials file.
:param str label: which RFC2136 credential to use
@@ -57,8 +60,8 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
fp.flush()
yield fp.name
def skip_if_no_bind9_server(self):
def skip_if_no_bind9_server(self) -> None:
"""Skips the test if there was no RFC2136-capable DNS server configured
in the test environment"""
if not self._dns_xdist:
skip('No RFC2136-capable DNS server is configured')
pytest.skip('No RFC2136-capable DNS server is configured')

View File


@@ -1,14 +1,16 @@
"""Module executing integration tests against Certbot with the RFC2136 DNS authenticator."""
from typing import Generator
import pytest
from certbot_integration_tests.rfc2136_tests import context as rfc2136_context
from certbot_integration_tests.rfc2136_tests.context import IntegrationTestsContext
@pytest.fixture(name="context")
def pytest_context(request):
def test_context(request: pytest.FixtureRequest) -> Generator[IntegrationTestsContext, None, None]:
# pylint: disable=missing-function-docstring
# Fixture request is a built-in pytest fixture describing current test request.
integration_test_context = rfc2136_context.IntegrationTestsContext(request)
integration_test_context = IntegrationTestsContext(request)
try:
yield integration_test_context
finally:
@@ -16,7 +18,7 @@ def pytest_context(request):
@pytest.mark.parametrize('domain', [('example.com'), ('sub.example.com')])
def test_get_certificate(domain, context):
def test_get_certificate(domain: str, context: IntegrationTestsContext) -> None:
context.skip_if_no_bind9_server()
with context.rfc2136_credentials() as creds:

View File


@@ -11,7 +11,15 @@ import subprocess
import sys
import tempfile
import time
from types import TracebackType
from typing import Any
from typing import cast
from typing import Dict
from typing import List
from typing import Mapping
from typing import Optional
from typing import Sequence
from typing import Type
import requests
@@ -34,11 +42,12 @@ class ACMEServer:
ACMEServer is also a context manager, and so can be used to ensure ACME server is
started/stopped upon context enter/exit.
"""
def __init__(self, acme_server, nodes, http_proxy=True, stdout=False,
dns_server=None, http_01_port=DEFAULT_HTTP_01_PORT):
def __init__(self, acme_server: str, nodes: Sequence[str], http_proxy: bool = True,
stdout: bool = False, dns_server: Optional[str] = None,
http_01_port: int = DEFAULT_HTTP_01_PORT) -> None:
"""
Create an ACMEServer instance.
:param str acme_server: the type of acme server used (boulder-v1, boulder-v2 or pebble)
:param str acme_server: the type of acme server used (boulder-v2 or pebble)
:param list nodes: list of node names that will be setup by pytest xdist
:param bool http_proxy: if False do not start the HTTP proxy
:param bool stdout: if True stream all subprocesses stdout to standard stdout
@@ -52,7 +61,7 @@ class ACMEServer:
self._proxy = http_proxy
self._workspace = tempfile.mkdtemp()
self._processes: List[subprocess.Popen] = []
self._stdout = sys.stdout if stdout else open(os.devnull, 'w')
self._stdout = sys.stdout if stdout else open(os.devnull, 'w') # pylint: disable=consider-using-with
self._dns_server = dns_server
self._http_01_port = http_01_port
if http_01_port != DEFAULT_HTTP_01_PORT:
@@ -60,7 +69,7 @@ class ACMEServer:
raise ValueError('setting http_01_port is not currently supported '
'with boulder or the HTTP proxy')
def start(self):
def start(self) -> None:
"""Start the test stack"""
try:
if self._proxy:
@@ -73,7 +82,7 @@ class ACMEServer:
self.stop()
raise e
def stop(self):
def stop(self) -> None:
"""Stop the test stack, and clean its resources"""
print('=> Tear down the test infrastructure...')
try:
@@ -104,14 +113,15 @@ class ACMEServer:
self._stdout.close()
print('=> Test infrastructure stopped and cleaned up.')
def __enter__(self):
def __enter__(self) -> Dict[str, Any]:
self.start()
return self.acme_xdist
def __exit__(self, exc_type, exc_val, exc_tb):
def __exit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException],
traceback: Optional[TracebackType]) -> None:
self.stop()
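
Since __enter__ now returns the acme_xdist dictionary, ACMEServer can be driven as a context manager; a hedged sketch for a single local Pebble instance (the module path, node name, and proxy flag are assumptions):

from certbot_integration_tests.utils.acme_server import ACMEServer

with ACMEServer('pebble', ['primary'], http_proxy=False) as acme_xdist:
    # acme_xdist carries the directory URL and per-node port assignments
    print(acme_xdist['acme_server'], acme_xdist['directory_url'])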
def _construct_acme_xdist(self, acme_server, nodes):
def _construct_acme_xdist(self, acme_server: str, nodes: Sequence[str]) -> None:
"""Generate and return the acme_xdist dict"""
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
@@ -120,8 +130,7 @@ class ACMEServer:
if acme_server == 'pebble':
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
else: # boulder
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL \
if acme_server == 'boulder-v2' else BOULDER_V1_DIRECTORY_URL
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL
acme_xdist['http_port'] = {
node: port for (node, port) in # pylint: disable=unnecessary-comprehension
@@ -138,7 +147,7 @@ class ACMEServer:
self.acme_xdist = acme_xdist
def _prepare_pebble_server(self):
def _prepare_pebble_server(self) -> None:
"""Configure and launch the Pebble server"""
print('=> Starting pebble instance deployment...')
pebble_artifacts_rv = pebble_artifacts.fetch(self._workspace, self._http_01_port)
@@ -174,11 +183,11 @@ class ACMEServer:
# Wait for the ACME CA server to be up.
print('=> Waiting for pebble instance to respond...')
misc.check_until_timeout(self.acme_xdist['directory_url'])
misc.check_until_timeout(self.acme_xdist['directory_url']) # type: ignore[arg-type]
print('=> Finished pebble instance deployment.')
def _prepare_boulder_server(self):
def _prepare_boulder_server(self) -> None:
"""Configure and launch the Boulder server"""
print('=> Starting boulder instance deployment...')
instance_path = join(self._workspace, 'boulder')
@@ -207,7 +216,8 @@ class ACMEServer:
# Wait for the ACME CA server to be up.
print('=> Waiting for boulder instance to respond...')
misc.check_until_timeout(self.acme_xdist['directory_url'], attempts=300)
misc.check_until_timeout(
self.acme_xdist['directory_url'], attempts=300) # type: ignore[arg-type]
if not self._dns_server:
# Configure challtestsrv to answer any A record request with ip of the docker host.
@@ -226,16 +236,19 @@ class ACMEServer:
print('=> Finished boulder instance deployment.')
def _prepare_http_proxy(self):
def _prepare_http_proxy(self) -> None:
"""Configure and launch an HTTP proxy"""
print('=> Configuring the HTTP proxy...')
http_port_map = cast(Dict[str, int], self.acme_xdist['http_port'])
mapping = {r'.+\.{0}\.wtf'.format(node): 'http://127.0.0.1:{0}'.format(port)
for node, port in self.acme_xdist['http_port'].items()}
for node, port in http_port_map.items()}
command = [sys.executable, proxy.__file__, str(DEFAULT_HTTP_01_PORT), json.dumps(mapping)]
self._launch_process(command)
print('=> Finished configuring the HTTP proxy.')
def _launch_process(self, command, cwd=os.getcwd(), env=None, force_stderr=False):
def _launch_process(self, command: Sequence[str], cwd: str = os.getcwd(),
env: Optional[Mapping[str, str]] = None,
force_stderr: bool = False) -> subprocess.Popen:
"""Launch silently a subprocess OS command"""
if not env:
env = os.environ
@@ -248,14 +261,14 @@ class ACMEServer:
return process
def main():
def main() -> None:
# pylint: disable=missing-function-docstring
parser = argparse.ArgumentParser(
description='CLI tool to start a local instance of Pebble or Boulder CA server.')
parser.add_argument('--server-type', '-s',
choices=['pebble', 'boulder-v1', 'boulder-v2'], default='pebble',
help='type of CA server to start: can be Pebble or Boulder '
'(in ACMEv1 or ACMEv2 mode), Pebble is used if not set.')
choices=['pebble', 'boulder-v2'], default='pebble',
help='type of CA server to start: can be Pebble or Boulder. '
'Pebble is used if not set.')
parser.add_argument('--dns-server', '-d',
help='specify the DNS server as `IP:PORT` to use by '
'Pebble; if not specified, a local mock DNS server will be used to '

View File


@@ -1,18 +1,24 @@
#!/usr/bin/env python
"""Module to call certbot in test mode"""
from distutils.version import LooseVersion
import os
import pkg_resources
import subprocess
import sys
from typing import Dict
from typing import List
from typing import Mapping
from typing import Sequence
from typing import Tuple
import certbot_integration_tests
# pylint: disable=wildcard-import,unused-wildcard-import
from certbot_integration_tests.utils.constants import *
def certbot_test(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, force_renew=True):
def certbot_test(certbot_args: Sequence[str], directory_url: str, http_01_port: int,
tls_alpn_01_port: int, config_dir: str, workspace: str,
force_renew: bool = True) -> Tuple[str, str]:
"""
Invoke the certbot executable available in PATH in a test context for the given args.
The test context consists in running certbot in debug mode, with various flags suitable
@@ -40,7 +46,7 @@ def certbot_test(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
return proc.stdout, proc.stderr
def _prepare_environ(workspace):
def _prepare_environ(workspace: str) -> Dict[str, str]:
# pylint: disable=missing-function-docstring
new_environ = os.environ.copy()
@@ -78,14 +84,15 @@ def _prepare_environ(workspace):
return new_environ
def _compute_additional_args(workspace, environ, force_renew):
def _compute_additional_args(workspace: str, environ: Mapping[str, str],
force_renew: bool) -> List[str]:
additional_args = []
output = subprocess.check_output(['certbot', '--version'],
universal_newlines=True, stderr=subprocess.STDOUT,
cwd=workspace, env=environ)
# Typical response is: output = 'certbot 0.31.0.dev0'
version_str = output.split(' ')[1].strip()
if LooseVersion(version_str) >= LooseVersion('0.30.0'):
if pkg_resources.parse_version(version_str) >= pkg_resources.parse_version('0.30.0'):
additional_args.append('--no-random-sleep-on-renew')
if force_renew:
@@ -94,8 +101,9 @@ def _compute_additional_args(workspace, environ, force_renew):
return additional_args
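
The version guard above now goes through pkg_resources.parse_version instead of distutils' LooseVersion; both return comparable objects, as this small sketch shows:

import pkg_resources

def supports_no_random_sleep(version_str: str) -> bool:
    """Sketch of the >= 0.30.0 check used for --no-random-sleep-on-renew."""
    return pkg_resources.parse_version(version_str) >= pkg_resources.parse_version('0.30.0')

assert supports_no_random_sleep('0.31.0.dev0')
assert not supports_no_random_sleep('0.29.1')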
def _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_port,
config_dir, workspace, force_renew):
def _prepare_args_env(certbot_args: Sequence[str], directory_url: str, http_01_port: int,
tls_alpn_01_port: int, config_dir: str, workspace: str,
force_renew: bool) -> Tuple[List[str], Dict[str, str]]:
new_environ = _prepare_environ(workspace)
additional_args = _compute_additional_args(workspace, new_environ, force_renew)
@@ -126,7 +134,7 @@ def _prepare_args_env(certbot_args, directory_url, http_01_port, tls_alpn_01_por
return command, new_environ
def main():
def main() -> None:
# pylint: disable=missing-function-docstring
args = sys.argv[1:]

View File


@@ -2,7 +2,6 @@
DEFAULT_HTTP_01_PORT = 5002
TLS_ALPN_01_PORT = 5001
CHALLTESTSRV_PORT = 8055
BOULDER_V1_DIRECTORY_URL = 'http://localhost:4000/directory'
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'

View File


@@ -8,7 +8,11 @@ import subprocess
import sys
import tempfile
import time
from types import TracebackType
from typing import Any, Sequence
from typing import Dict
from typing import Optional
from typing import Type
from pkg_resources import resource_filename
@@ -30,7 +34,7 @@ class DNSServer:
future to support parallelization (https://github.com/certbot/certbot/issues/8455).
"""
def __init__(self, unused_nodes, show_output=False):
def __init__(self, unused_nodes: Sequence[str], show_output: bool = False) -> None:
"""
Create an DNSServer instance.
:param list nodes: list of node names that will be setup by pytest xdist
@@ -45,9 +49,10 @@ class DNSServer:
# Unfortunately the BIND9 image forces everything to stderr with -g and we can't
# modify the verbosity.
# pylint: disable=consider-using-with
self._output = sys.stderr if show_output else open(os.devnull, "w")
def start(self):
def start(self) -> None:
"""Start the DNS server"""
try:
self._configure_bind()
@@ -56,7 +61,7 @@ class DNSServer:
self.stop()
raise
def stop(self):
def stop(self) -> None:
"""Stop the DNS server, and clean its resources"""
if self.process:
try:
@@ -70,7 +75,7 @@ class DNSServer:
if self._output != sys.stderr:
self._output.close()
def _configure_bind(self):
def _configure_bind(self) -> None:
"""Configure the BIND9 server based on the prebaked configuration"""
bind_conf_src = resource_filename(
"certbot_integration_tests", "assets/bind-config"
@@ -80,7 +85,7 @@ class DNSServer:
os.path.join(bind_conf_src, directory), os.path.join(self.bind_root, directory)
)
def _start_bind(self):
def _start_bind(self) -> None:
"""Launch the BIND9 server as a Docker container"""
addr_str = "{}:{}".format(BIND_BIND_ADDRESS[0], BIND_BIND_ADDRESS[1])
# pylint: disable=consider-using-with
@@ -149,9 +154,10 @@ class DNSServer:
"Gave up waiting for DNS server {} to respond".format(BIND_BIND_ADDRESS)
)
def __enter__(self):
def __enter__(self) -> Dict[str, Any]:
self.start()
return self.dns_xdist
def __exit__(self, exc_type, exc_val, exc_tb):
def __exit__(self, exc_type: Optional[Type[BaseException]], exc: Optional[BaseException],
traceback: Optional[TracebackType]) -> None:
self.stop()

View File


@@ -15,6 +15,11 @@ import sys
import tempfile
import time
import warnings
from typing import Generator
from typing import Iterable
from typing import List
from typing import Optional
from typing import Tuple
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec
@@ -22,10 +27,12 @@ from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.hazmat.primitives.serialization import NoEncryption
from cryptography.hazmat.primitives.serialization import PrivateFormat
from cryptography.x509 import load_pem_x509_certificate
from cryptography.x509 import Certificate
from OpenSSL import crypto
import pkg_resources
import requests
from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext
from certbot_integration_tests.utils.constants import PEBBLE_ALTERNATE_ROOTS
from certbot_integration_tests.utils.constants import PEBBLE_MANAGEMENT_URL
@@ -33,7 +40,7 @@ RSA_KEY_TYPE = 'rsa'
ECDSA_KEY_TYPE = 'ecdsa'
def _suppress_x509_verification_warnings():
def _suppress_x509_verification_warnings() -> None:
try:
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
@@ -44,7 +51,7 @@ def _suppress_x509_verification_warnings():
requests.packages.urllib3.disable_warnings(InsecureRequestWarning)
def check_until_timeout(url, attempts=30):
def check_until_timeout(url: str, attempts: int = 30) -> None:
"""
Wait and block until given url responds with status 200, or raise an exception
after the specified number of attempts.
@@ -72,12 +79,12 @@ class GracefulTCPServer(socketserver.TCPServer):
allow_reuse_address = True
def _run_server(port):
def _run_server(port: int) -> None:
GracefulTCPServer(('', port), SimpleHTTPServer.SimpleHTTPRequestHandler).serve_forever()
@contextlib.contextmanager
def create_http_server(port):
def create_http_server(port: int) -> Generator[str, None, None]:
"""
Setup and start an HTTP server for the given TCP port.
This server stays active for the lifetime of the context, and is automatically
@@ -111,7 +118,7 @@ def create_http_server(port):
shutil.rmtree(webroot)
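
create_http_server is the webroot server used by several tests earlier in this diff; as a context manager it yields a directory that is served over HTTP for the lifetime of the block. A hedged usage sketch with an illustrative port:

import os

from certbot_integration_tests.utils import misc

with misc.create_http_server(5002) as webroot:             # port is illustrative
    with open(os.path.join(webroot, 'index.html'), 'w') as handle:
        handle.write('Hello World!')
    # anything placed in webroot is now reachable at http://localhost:5002/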
def list_renewal_hooks_dirs(config_dir):
def list_renewal_hooks_dirs(config_dir: str) -> List[str]:
"""
Find and return paths of all hook directories for the given certbot config directory
:param str config_dir: path to the certbot config directory
@@ -121,14 +128,14 @@ def list_renewal_hooks_dirs(config_dir):
return [os.path.join(renewal_hooks_root, item) for item in ['pre', 'deploy', 'post']]
def generate_test_file_hooks(config_dir, hook_probe):
def generate_test_file_hooks(config_dir: str, hook_probe: str) -> None:
"""
Create a suite of certbot hook scripts and put them in the relevant hook directory
for the given certbot configuration directory. These scripts, when executed, will write
specific verbs in the given hook_probe file to allow asserting they have effectively
been executed. The deploy hook also checks that the renewal environment variables are set.
:param str config_dir: current certbot config directory
:param hook_probe: path to the hook probe to test hook scripts execution
:param str hook_probe: path to the hook probe to test hook scripts execution
"""
hook_path = pkg_resources.resource_filename('certbot_integration_tests', 'assets/hook.py')
@@ -163,7 +170,8 @@ set -e
@contextlib.contextmanager
def manual_http_hooks(http_server_root, http_port):
def manual_http_hooks(http_server_root: str,
http_port: int) -> Generator[Tuple[str, str], None, None]:
"""
Generate suitable http-01 hooks command for test purpose in the given HTTP
server webroot directory. These hooks command use temporary python scripts
@@ -216,7 +224,8 @@ shutil.rmtree(well_known)
shutil.rmtree(tempdir)
def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
def generate_csr(domains: Iterable[str], key_path: str, csr_path: str,
key_type: str = RSA_KEY_TYPE) -> None:
"""
Generate a private key, and a CSR for the given domains using this key.
:param domains: the domain names to include in the CSR
@@ -232,10 +241,15 @@ def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
with warnings.catch_warnings():
# Ignore a warning on some old versions of cryptography
warnings.simplefilter('ignore', category=PendingDeprecationWarning)
key = ec.generate_private_key(ec.SECP384R1(), default_backend())
key = key.private_bytes(encoding=Encoding.PEM, format=PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=NoEncryption())
key = crypto.load_privatekey(crypto.FILETYPE_PEM, key)
_key = ec.generate_private_key(ec.SECP384R1(), default_backend())
# This type ignore directive is required due to an outdated version of types-cryptography.
# It can be removed once package types-pyOpenSSL depends on cryptography instead of
# types-cryptography and so types-cryptography is not installed anymore.
# See https://github.com/python/typeshed/issues/5618
_bytes = _key.private_bytes(encoding=Encoding.PEM, # type: ignore
format=PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=NoEncryption())
key = crypto.load_privatekey(crypto.FILETYPE_PEM, _bytes)
else:
raise ValueError('Invalid key type: {0}'.format(key_type))
@@ -255,7 +269,7 @@ def generate_csr(domains, key_path, csr_path, key_type=RSA_KEY_TYPE):
file_h.write(crypto.dump_certificate_request(crypto.FILETYPE_ASN1, req))
def read_certificate(cert_path):
def read_certificate(cert_path: str) -> str:
"""
Load the certificate from the provided path, and return a human readable version
of it (TEXT mode).
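The pyOpenSSL calls behind such a helper, as a standalone sketch with an illustrative certificate path:

from OpenSSL import crypto

with open('/tmp/cert.pem', 'rb') as file_h:  # illustrative path
    cert = crypto.load_certificate(crypto.FILETYPE_PEM, file_h.read())
# FILETYPE_TEXT produces the human readable `openssl x509 -text`-style dump.
print(crypto.dump_certificate(crypto.FILETYPE_TEXT, cert).decode('utf-8'))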
@@ -269,7 +283,7 @@ def read_certificate(cert_path):
return crypto.dump_certificate(crypto.FILETYPE_TEXT, cert).decode('utf-8')
def load_sample_data_path(workspace):
def load_sample_data_path(workspace: str) -> str:
"""
Load the certbot configuration example designed for OCSP tests, and return its path
:param str workspace: current test workspace directory path
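A brief usage sketch, assuming a temporary workspace:

import tempfile

from certbot_integration_tests.utils.misc import load_sample_data_path

workspace = tempfile.mkdtemp()
# The sample config is copied into the workspace, so tests can mutate it freely.
sample_config_dir = load_sample_data_path(workspace)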
@@ -300,7 +314,7 @@ def load_sample_data_path(workspace):
return copied
def echo(keyword, path=None):
def echo(keyword: str, path: Optional[str] = None) -> str:
"""
Generate a platform independent executable command
that echoes the given keyword into the given file.
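A brief usage sketch; the probe path is illustrative, and the returned string is meant to be passed verbatim as a certbot hook command:

from certbot_integration_tests.utils.misc import echo

# Builds a command that appends 'deploy' to the probe file, working on both
# POSIX shells and cmd.exe since it relies on the Python interpreter itself.
deploy_hook_command = echo('deploy', path='/tmp/hook_probe')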
@@ -315,7 +329,7 @@ def echo(keyword, path=None):
os.path.basename(sys.executable), keyword, ' >> "{0}"'.format(path) if path else '')
def get_acme_issuers(context):
def get_acme_issuers(context: IntegrationTestsContext) -> List[Certificate]:
"""Gets the list of one or more issuer certificates from the ACME server used by the
context.
:param context: the testing context.
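A hedged sketch of how a single issuer certificate can be pulled from Pebble's management interface; the URL is Pebble's default management address and the endpoint index is illustrative:

import requests
from cryptography import x509
from cryptography.hazmat.backends import default_backend

response = requests.get('https://localhost:15000/intermediates/0', verify=False)
issuer = x509.load_pem_x509_certificate(response.content, default_backend())
print(issuer.subject)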


@@ -3,6 +3,7 @@
import json
import os
import stat
from typing import Tuple
import pkg_resources
import requests
@@ -14,7 +15,7 @@ PEBBLE_VERSION = 'v2.3.0'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')
def fetch(workspace, http_01_port=DEFAULT_HTTP_01_PORT):
def fetch(workspace: str, http_01_port: int = DEFAULT_HTTP_01_PORT) -> Tuple[str, str, str]:
# pylint: disable=missing-function-docstring
suffix = 'linux-amd64' if os.name != 'nt' else 'windows-amd64.exe'
@@ -25,7 +26,7 @@ def fetch(workspace, http_01_port=DEFAULT_HTTP_01_PORT):
return pebble_path, challtestsrv_path, pebble_config_path
def _fetch_asset(asset, suffix):
def _fetch_asset(asset: str, suffix: str) -> str:
asset_path = os.path.join(ASSETS_PATH, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
if not os.path.exists(asset_path):
asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
@@ -39,7 +40,7 @@ def _fetch_asset(asset, suffix):
return asset_path
def _build_pebble_config(workspace, http_01_port):
def _build_pebble_config(workspace: str, http_01_port: int) -> str:
config_path = os.path.join(workspace, 'pebble-config.json')
with open(config_path, 'w') as file_h:
file_h.write(json.dumps({
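A hedged sketch of the kind of pebble-config.json such a helper writes; the keys follow Pebble's documented configuration format, and the ports and certificate paths are illustrative:

import json
import os


def write_pebble_config(workspace: str, http_01_port: int) -> str:
    config_path = os.path.join(workspace, 'pebble-config.json')
    with open(config_path, 'w') as file_h:
        json.dump({'pebble': {
            'listenAddress': '0.0.0.0:14000',
            'managementListenAddress': '0.0.0.0:15000',
            'certificate': os.path.join(workspace, 'cert.pem'),
            'privateKey': os.path.join(workspace, 'key.pem'),
            'httpPort': http_01_port,
            'tlsPort': 5001,
        }}, file_h)
    return config_path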


@@ -22,7 +22,7 @@ from certbot_integration_tests.utils.misc import GracefulTCPServer
class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
# pylint: disable=missing-function-docstring
def do_POST(self):
def do_POST(self) -> None:
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediate-keys/0', verify=False)
issuer_key = serialization.load_pem_private_key(request.content, None, default_backend())


@@ -5,17 +5,19 @@ import http.server as BaseHTTPServer
import json
import re
import sys
from typing import Mapping
from typing import Type
import requests
from certbot_integration_tests.utils.misc import GracefulTCPServer
def _create_proxy(mapping):
def _create_proxy(mapping: Mapping[str, str]) -> Type[BaseHTTPServer.BaseHTTPRequestHandler]:
# pylint: disable=missing-function-docstring
class ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
# pylint: disable=missing-class-docstring
def do_GET(self):
def do_GET(self) -> None:
headers = {key.lower(): value for key, value in self.headers.items()}
backend = [backend for pattern, backend in mapping.items()
if re.match(pattern, headers['host'])][0]


@@ -1,6 +1,4 @@
from distutils.version import LooseVersion
import sys
from pkg_resources import parse_version
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
@@ -11,7 +9,7 @@ version = '0.32.0.dev0'
min_setuptools_version='36.2'
# This conditional isn't necessary, but it provides better error messages to
# people who try to install this package with older versions of setuptools.
if LooseVersion(setuptools_version) < LooseVersion(min_setuptools_version):
if parse_version(setuptools_version) < parse_version(min_setuptools_version):
raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')
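For reference, pkg_resources.parse_version yields PEP 440 versions that compare reliably, which is presumably why it replaces distutils' LooseVersion here:

from pkg_resources import parse_version

# PEP 440 semantics: numeric components compare as numbers, and trailing
# zero components are insignificant.
assert parse_version('36.2') < parse_version('41.6.0')
assert parse_version('36.2') == parse_version('36.2.0')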
install_requires = [
@@ -30,6 +28,8 @@ install_requires = [
'pywin32>=300 ; sys_platform == "win32"',
'pyyaml',
'requests',
'setuptools',
'types-python-dateutil'
]
setup(
@@ -51,6 +51,7 @@ setup(
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],


@@ -1,3 +1,4 @@
# type: ignore
"""
General conftest for pytest execution of all integration tests lying
in the snap_installer_integration tests package.
@@ -18,9 +19,10 @@ def pytest_addoption(parser):
parser.addoption('--snap-folder', required=True,
help='set the folder path where snaps to test are located')
parser.addoption('--snap-arch', default='amd64',
help='set the architecture to test (default: amd64)')
help='set the architecture to test (default: amd64)')
parser.addoption('--allow-persistent-changes', action='store_true',
help='needs to be set, and confirm that the test will make persistent changes on this machine')
help='needs to be set, and confirm that the test will make persistent '
'changes on this machine')
def pytest_configure(config):
@@ -30,7 +32,8 @@ def pytest_configure(config):
"""
if not config.option.allow_persistent_changes:
raise RuntimeError('This integration test would install the Certbot snap on your machine. '
'Please run it again with the `--allow-persistent-changes` flag set to acknowledge.')
'Please run it again with the `--allow-persistent-changes` flag set '
'to acknowledge.')
def pytest_generate_tests(metafunc):
@@ -40,6 +43,6 @@ def pytest_generate_tests(metafunc):
if "dns_snap_path" in metafunc.fixturenames:
snap_arch = metafunc.config.getoption('snap_arch')
snap_folder = metafunc.config.getoption('snap_folder')
snap_dns_path_list = glob.glob(os.path.join(snap_folder,
snap_dns_path_list = glob.glob(os.path.join(snap_folder,
'certbot-dns-*_{0}.snap'.format(snap_arch)))
metafunc.parametrize("dns_snap_path", snap_dns_path_list)
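A hedged sketch of a test consuming the parametrized fixture above; the assertions are illustrative:

import os


def test_dns_snap_is_built(dns_snap_path: str) -> None:
    # pytest runs this once per DNS plugin snap found in the snap folder.
    assert os.path.isfile(dns_snap_path)
    assert dns_snap_path.endswith('.snap')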

Some files were not shown because too many files have changed in this diff.