Compare commits

...

242 Commits

Author SHA1 Message Date
Brad Warren
942e9e9436 ugly version 2023-01-31 19:03:36 -08:00
Brad Warren
06d2559e7c ... 2023-01-31 18:54:45 -08:00
Brad Warren
83fbdbd278 .0 ... 2023-01-31 18:51:55 -08:00
Brad Warren
a128fccb8c & variable 2023-01-31 18:37:38 -08:00
Brad Warren
01664204fa json it 2023-01-31 18:34:47 -08:00
Brad Warren
e0d8452c26 1 indexed? 2023-01-31 18:26:58 -08:00
Brad Warren
8c5675ac1f what does split return? 2023-01-31 18:21:25 -08:00
Brad Warren
55536771bd change coverage upload condition 2023-01-31 14:55:17 -08:00
Brad Warren
6e85dfb306 add comment 2023-01-26 11:17:02 -08:00
Brad Warren
b339e25ddf set uploadCoverage 2023-01-26 11:14:09 -08:00
Brad Warren
51f03a1449 fix typo 2023-01-26 10:58:30 -08:00
Brad Warren
d6bc96aee1 change coverage upload condition 2023-01-26 10:50:44 -08:00
Will Greenberg
b0748b69e7 Replace probot/stale app with a Github Action (#9466)
* Replace probot/stale app with a Github Action

This creates a Github Actions workflow which seems to be the supported
way of automarking issues as stale. Adds a dry-run flag to test it out.

* small fixups

* cron typo

* disable unnecessary permissions

* use friendlier name
2023-01-25 15:59:22 -08:00
Brad Warren
c79a5d4407 Start sending coverage data to codecov (#9544)
* set up codecov

* export coverage data to xml
2023-01-26 08:15:51 +11:00
Brad Warren
4ad71ab5ae Fix tox environments (#9547)
* fix cover tox envs

* make test work on all Pythons

* Remove unused import

Co-authored-by: alexzorin <alex@zorin.id.au>

Co-authored-by: alexzorin <alex@zorin.id.au>
2023-01-25 12:00:06 +11:00
Will Greenberg
81ff6fcc0d acme.messages.Error: add mutability (#9546)
* acme.messages.Error: add mutability

As of Python 3.11, an exception caught within a `with` statement will
update the __traceback__ attribute. Because acme.messages.Error was
immutable, this was causing a knock-on exception, causing certbot to
exit abnormally. This commit hacks in mutability for acme.messages.Error

Fixes #9539

* Add CHANGELOG entry
2023-01-25 09:06:53 +11:00
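As a minimal sketch of the failure mode and fix described in that commit message (illustrative class names only, not the real acme.messages.Error implementation): assigning to an exception's __traceback__ goes through the class's __setattr__, so a fully immutable exception type raises, while re-allowing attribute writes restores normal behavior.

```python
# Illustrative only: FrozenError/PatchedError are stand-ins, not acme classes.
import sys

class FrozenError(Exception):
    """Immutable exception: every attribute write raises."""
    def __setattr__(self, name, value):
        raise AttributeError("this error type is immutable")

class PatchedError(FrozenError):
    """Mutability hacked back in so __traceback__ can be updated."""
    def __setattr__(self, name, value):
        object.__setattr__(self, name, value)

try:
    raise ValueError("inner failure")
except ValueError:
    tb = sys.exc_info()[2]

err = PatchedError("outer")
err.__traceback__ = tb  # works; the same assignment on FrozenError raises AttributeError
```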
Brad Warren
613e698199 disable random sleep in lock_test.py (#9545) 2023-01-25 08:05:01 +11:00
alexzorin
be3bf316c0 Deprecate {csr, keys} dirs & automatically truncate lineages (#9537)
Based on my design [here](https://docs.google.com/document/d/1jGh_bZPnrhi96KzuIcyCJfnudl4m3pRPGkiK4fTo8e4/edit?usp=sharing). 

Fixes https://github.com/certbot/certbot/issues/4634 and https://github.com/certbot/certbot/issues/4635.

- [x] Deprecate `NamespaceConfig.csr_dir`, `NamespaceConfig.key_dir`, ~~`constants.CSR_DIR` and `constants.KEY_DIR`~~. (`constants` is `_internal` so we can just delete it eventually).
- [x] Update `certbot.crypto_util.generate_csr` and `.generate_key` to make `csr_dir` and `key_dir` optional, respectively.
- [x] Change `certbot._internal.client.Client.obtain_certificate` to no longer include `csr_dir` and `key_dir` to the `.generate_csr` and `.generate_key` calls, respectively.
- Automatically delete unwanted lineage items:
  - [x] In `certbot._internal.storage.RenewableCert`, add a function to truncate the lineage history according to the criteria (keep the current and the 5 prior certificates). 
      - [x] Add a test suite for `truncate` 
  - [x] In `certbot._internal.renewal.renew_cert`, call the lineage truncation function after the symlinks have been updated for the renewal.


* Stop writing new files to /csr and /keys

* storage: add lineage truncation

* remove unused code

* deprecate keys_dir and csr_dir

* update CHANGELOG

* just keep 5 prior certificates, don't be clever with expiry

* docs: remove reference to /archive and /keys

* filter {csr,key}_dir deprecations directly in tests
2023-01-19 17:21:26 -08:00
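A rough, hypothetical sketch of the truncation criterion from the checklist above (keep the current certificate plus the 5 prior versions); this is not certbot's actual RenewableCert code, just an illustration of pruning a lineage's archive directory, which stores files named like cert1.pem, chain1.pem, fullchain1.pem and privkey1.pem.

```python
# Hypothetical helper, not certbot._internal.storage.RenewableCert itself.
import os
import re

def truncate_lineage(archive_dir: str, current_version: int, keep_prior: int = 5) -> None:
    """Delete archived cert/chain/fullchain/privkey files older than
    current_version - keep_prior, keeping the current and 5 prior versions."""
    pattern = re.compile(r"^(?:cert|chain|fullchain|privkey)(\d+)\.pem$")
    cutoff = current_version - keep_prior
    for name in os.listdir(archive_dir):
        match = pattern.match(name)
        if match and int(match.group(1)) < cutoff:
            os.remove(os.path.join(archive_dir, name))
```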
alexzorin
e7fcd0e08d docs: give webroot and standalone better descriptions (#9536) 2023-01-12 08:03:51 -08:00
alexzorin
8149e255c8 Merge pull request #9534 from certbot/candidate-2.2.0
Update files from 2.2 release
2023-01-12 15:11:23 +11:00
Brad Warren
32a233d93b Bump version to 2.3.0 2023-01-11 13:21:23 -08:00
Brad Warren
a63bf5f88b Add contents to certbot/CHANGELOG.md for next version 2023-01-11 13:21:23 -08:00
Brad Warren
4ab4c9b65d Release 2.2.0 2023-01-11 13:21:22 -08:00
Brad Warren
b56df2fdd9 Update changelog for 2.2.0 release 2023-01-11 13:20:17 -08:00
Brad Warren
b1f22aa8a2 Add progressive release tooling (#9532)
This is based on what I wrote at https://opensource.eff.org/eff-open-source/pl/k1b4pcxnifyj9m7o4wdq7cka8h.
2023-01-11 12:27:38 -08:00
alexzorin
d641f062f2 limit challenge polling to 30 minutes (#9527)
* limit challenge polling to 30 minutes

* Fix docstring typo

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2023-01-06 09:24:58 +11:00
Brad Warren
666e12b25d upgrade shellingham (#9529) 2023-01-05 19:30:47 +11:00
Alex Bouma
b81ef33f33 Add link to dns-dnsmanager third party plugin (#9523) 2022-12-25 09:07:12 +11:00
Brad Warren
8155d60e9a remove setuptools pin (#9520) 2022-12-21 10:59:41 +11:00
Brad Warren
124e6d80c3 separate cover environment to workaround tox bug (#9519) 2022-12-19 13:38:04 -08:00
Brad Warren
ac75977156 update 1.32.x reqs (#9516) 2022-12-18 08:16:36 +11:00
alexzorin
63ff1f2a3a Merge pull request #9517 from certbot/candidate-1.32.2
Update master from 1.32.2 release
2022-12-18 08:13:52 +11:00
Brad Warren
74af586f4b Merge branch 'master' into candidate-1.32.2 2022-12-16 14:16:58 -08:00
Brad Warren
c3e1d7e560 Bump version to 2.2.0 2022-12-16 12:46:36 -08:00
Brad Warren
8e30f13e57 Add contents to certbot/CHANGELOG.md for next version 2022-12-16 12:46:36 -08:00
Brad Warren
06bba7167d Release 1.32.2 2022-12-16 12:46:34 -08:00
Brad Warren
118fce34d3 Update changelog for 1.32.2 release 2022-12-16 12:45:28 -08:00
Brad Warren
746631351f Prep for 1.32.2 (#9514)
* Update dependencies (#9505)

* upgrade dependencies

* forbid old setuptools

(cherry picked from commit 70a36fdf00)

* fix help output (#9509)

(cherry picked from commit 27af7b5d15)

* add reminder to repin.sh

* write changelog entry

* pin back mypy
2022-12-16 12:43:17 -08:00
alexzorin
3bc463a66f Merge pull request #9512 from certbot/candidate-2.1.1
Update master from 2.1.1 release
2022-12-17 07:36:22 +11:00
Brad Warren
ac0f4ba3ee Merge branch 'master' into candidate-2.1.1 2022-12-15 08:21:28 -08:00
Brad Warren
d47242296d Bump version to 2.2.0 2022-12-15 07:14:18 -08:00
Brad Warren
edfd84fab5 Add contents to certbot/CHANGELOG.md for next version 2022-12-15 07:14:18 -08:00
Brad Warren
af503ad836 Release 2.1.1 2022-12-15 07:14:17 -08:00
Brad Warren
06d40ec272 Update changelog for 2.1.1 release 2022-12-15 07:13:22 -08:00
Brad Warren
1615185a14 Prepare for 2.1.1 (#9508)
* Update dependencies (#9505)

* upgrade dependencies

* forbid old setuptools

(cherry picked from commit 70a36fdf00)

* prep changelog

* also mention windows
2022-12-15 11:31:56 +11:00
Brad Warren
27af7b5d15 fix help output (#9509) 2022-12-15 11:27:23 +11:00
Brad Warren
a807240db7 add 1.32.x/requirements.txt (#9506) 2022-12-13 11:00:30 +11:00
Brad Warren
70a36fdf00 Update dependencies (#9505)
* upgrade dependencies

* forbid old setuptools
2022-12-13 10:48:17 +11:00
Marcel Robitaille
6b7549bf3a Add filename to dns_common.py configuration errors (#9501)
Fixes #9500

Also print the path to the offending file in the "Error parsing credentials configuration" error from `dns_common.py`. This makes debugging that error much easier.
2022-12-09 14:55:07 -08:00
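As a rough illustration of the idea (certbot's DNS plugins read credentials files with configobj, but this standalone helper and its error wording are assumptions, not the plugin's actual code), including the path in the raised error points users at the file being rejected:

```python
import configobj

def load_credentials(path: str) -> configobj.ConfigObj:
    """Parse a credentials INI file, naming the file in any parse error."""
    try:
        return configobj.ConfigObj(path, file_error=True)
    except configobj.ConfigObjError as exc:
        raise ValueError(
            f"Error parsing credentials configuration {path}: {exc}") from exc
```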
alexzorin
4c04328e6d Merge pull request #9498 from certbot/candidate-2.1.0
Update files from 2.1.0 release
2022-12-08 06:37:34 +11:00
Brad Warren
7240e06613 Bump version to 2.2.0 2022-12-07 06:51:42 -08:00
Brad Warren
51bf92f353 Add contents to certbot/CHANGELOG.md for next version 2022-12-07 06:51:42 -08:00
Brad Warren
5e193eb12f Release 2.1.0 2022-12-07 06:51:41 -08:00
Brad Warren
63ea7d54e7 Update changelog for 2.1.0 release 2022-12-07 06:50:45 -08:00
alexzorin
26d3ab86b8 dns-linode: fix confusing credentials example (#9493) 2022-12-05 14:25:07 -08:00
ohemorange
b6695b7213 Merge pull request #9496 from certbot/1.32.x-candidate-1.32.1
Update 1.32.x from 1.32.1 release
2022-12-05 14:22:58 -08:00
ohemorange
1f262e677c Merge pull request #9494 from certbot/candidate-1.32.1
Update master changelog from 1.32.1 release
2022-12-05 14:22:53 -08:00
Brad Warren
023bb494b5 undo help text changes 2022-12-05 08:05:50 -08:00
Brad Warren
70d3fc5916 Merge branch 'master' into candidate-1.32.1 2022-12-05 08:00:21 -08:00
Brad Warren
e22d78b36c Bump version to 2.0.0 2022-12-05 07:04:31 -08:00
Brad Warren
17a7097011 Add contents to certbot/CHANGELOG.md for next version 2022-12-05 07:04:31 -08:00
Brad Warren
27809fbc59 Release 1.32.1 2022-12-05 07:04:30 -08:00
Brad Warren
a6ef3245ae Update changelog for 1.32.1 release 2022-12-05 07:03:16 -08:00
Brad Warren
1b5afb179f Prep for 1.32.1 (#9492)
I wanted to do this because we were notified that https://ubuntu.com/security/notices/USN-5638-3/ affects our snaps. This probably doesn't affect us, but rebuilding to be safe seems worth it to me personally.

I started to just trigger a new v1.32.0 release build, but I don't want to overwrite our 2.0 Docker images under the `latest` tag.

Changelog changes here are similar to what has been done for past point releases like https://github.com/certbot/certbot/pull/8501.

I also cherry picked #9474 to this branch to help the release process pass.

* add changelog

* Use a longer timeout for releases (#9474)

This is in response to the thread starting at https://github.com/certbot/certbot/pull/9330#issuecomment-1320416069.

In addition to this, I plan to add the following text to the step of the release instructions that tells you to wait until Azure Pipelines for the release has finished running:

> Some jobs, such as building our snaps, can take a long time to complete; however, if the process seems hung, you can cancel the build and then rerun the failed jobs. To do this, click on the build for the release in the link above, make sure you're logged into Azure Pipelines, and then use the cancel/rerun buttons in the top right of the web page.

(cherry picked from commit 30b4fd59a5)
2022-12-05 07:00:44 -08:00
Brad Warren
f0251a7959 fix apache unit tests (#9490)
Fixes https://github.com/certbot/certbot/issues/9481.

I poked around our other uses of this function and they seem OK to me for now; however, I opened https://github.com/certbot/certbot/issues/9489 to track the bigger refactor I think we should do here.
2022-12-01 12:27:24 -08:00
Brad Warren
8390c65a95 fix certbot plugins output (#9488) 2022-12-01 08:56:09 +11:00
alexzorin
fe5e56a52c certbot.interfaces: reintroduce empty zope interfaces (#9486)
* reintroduce certbot.interfaces.I* classes

* add wiki link
2022-12-01 08:42:54 +11:00
alexzorin
c178fa8c0b nginx: on encountering lua directives, produce a better warning (#9475)
* nginx: capitalise product names in warning message properly

* nginx: don't crash on encountering lua directives, warn instead

* add tests

* undo excess newline

* fix oldest tests: use old camelCase function name

* add missing newline in new testdata

* add tests for _by_lua, which should parse fine
2022-11-30 12:03:51 +11:00
Will Greenberg
c78503f21d Merge pull request #9477 from certbot/candidate-2.0.0
Release 2.0.0
2022-11-21 12:12:00 -08:00
Brad Warren
f171f0fcd9 remove botocore warning exceptions (#9476) 2022-11-22 06:42:00 +11:00
Will Greenberg
1e61513859 Bump version to 2.1.0 2022-11-21 09:59:06 -08:00
Will Greenberg
7b27d98370 Add contents to certbot/CHANGELOG.md for next version 2022-11-21 09:59:06 -08:00
Will Greenberg
3d0c2abd3b Release 2.0.0 2022-11-21 09:59:04 -08:00
Will Greenberg
f11dad9e04 Update changelog for 2.0.0 release 2022-11-21 09:58:20 -08:00
Brad Warren
30b4fd59a5 Use a longer timeout for releases (#9474)
This is in response to the thread starting at https://github.com/certbot/certbot/pull/9330#issuecomment-1320416069.

In addition to this, I plan to add the following text to the step of the release instructions that tells you to wait until Azure Pipelines for the release has finished running:

> Some jobs, such as building our snaps, can take a long time to complete; however, if the process seems hung, you can cancel the build and then rerun the failed jobs. To do this, click on the build for the release in the link above, make sure you're logged into Azure Pipelines, and then use the cancel/rerun buttons in the top right of the web page.
2022-11-21 08:18:06 -08:00
alexzorin
b2dc3e99d6 docs: remove section about dual RSA/ECDSA from User Guide (#9473)
As agreed here: https://github.com/certbot/certbot/pull/9465#discussion_r1025498427
2022-11-17 13:35:20 -08:00
Brad Warren
1c5e56d9c7 Claim Python 3.11 support and add tests (#9471)
* set up 3.11 tests

* fixup warnings

* sed -i "s/\( *'Pro.*3\.1\)0\(',\)/\10\2\n\11\2/" */setup.py

* update changelog
2022-11-18 07:55:27 +11:00
Brad Warren
ad708a0299 remove pylint pinning (#9472) 2022-11-18 07:36:50 +11:00
alexzorin
371cc6f9f1 docs: rewrite ecdsa section of user guide (#9465)
At the time this section was written, it was all about the introduction of support for ECDSA and how users can start taking advantage of that support.

Now that we use ECDSA by default, this piece of documentation probably should serve a new purpose. My idea here is to document the new behavior that we have in 2.0:  new key type on new certificates, old certificates will keep their existing key type.

Users may now be going in the reverse direction with their changes ("I got an ECDSA certificate but I need RSA because I have an old load balancer appliance!") so I have also updated some section titles to be less about ECDSA and more about Key Types in general.

Fixes #9442.
2022-11-17 09:41:34 -08:00
Brad Warren
d244013355 Upgrade pylint (#9470)
* upgrade pylint

* pylint --generate-rcfile > .pylintrc

* fixup pylintrc

* Remove unnecessary lambdas

* fix broad-except

* fix missing timeouts

* fix unit tests

* catch more generic exception
2022-11-17 18:21:14 +11:00
Brad Warren
652d5e96be Drop awscli dependency (#9459)
Fixes https://github.com/certbot/certbot/issues/9458.

* update readme

* drop awscli

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-11-16 17:10:18 -08:00
Brad Warren
455f9a0d6c Explain Certbot 2.0 snaps in changelog (#9469) 2022-11-17 11:40:17 +11:00
Brad Warren
9c003bc2d6 Add 2.0 release logic (#9467) (#9468)
This PR:

* Deletes the 2.0 pre-release pipeline
* Causes 1.x releases to be released to Docker Hub without updating the latest tag, PyPI, and the candidate and stable channels of the snap store
* Causes 2.x releases to be released to Docker Hub, PyPI, the beta channel of the snap store, and our Windows installer
We could potentially look into how to continue to do 1.x Windows installer releases through GitHub releases and tech ops tooling, but I personally don't think it's worth it right now.

This PR DOES NOT do anything about progressive snap releases. I think we can revisit this when/if we decide (how) to do them.

(cherry picked from commit 09af133af3)
2022-11-17 11:38:40 +11:00
Brad Warren
09af133af3 Add 2.0 release logic (#9467)
This PR:

* Deletes the 2.0 pre-release pipeline
* Causes 1.x releases to be released to Docker Hub without updating the latest tag, PyPI, and the candidate and stable channels of the snap store
* Causes 2.x releases to be released to Docker Hub, PyPI, the beta channel of the snap store, and our Windows installer
We could potentially look into how to continue to do 1.x Windows installer releases through GitHub releases and tech ops tooling, but I personally don't think it's worth it right now.

This PR DOES NOT do anything about progressive snap releases. I think we can revisit this when/if we decide (how) to do them.
2022-11-16 15:29:53 -08:00
Will Greenberg
21ef8e4332 main: set more permissive umask when creating work_dir (#9448)
* main: set more permissive umask when creating work_dir

This'll guarantee our working dir has the appropriate permissions,
even when a user has a strict umask

* update changelog

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2022-11-14 14:35:29 -08:00
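A minimal sketch of the technique described above, assuming a standalone helper rather than certbot's actual main module: temporarily relax the process umask around directory creation so the directory mode is not narrowed by a strict user umask, then restore it.

```python
import os

def makedirs_with_mode(path: str, mode: int = 0o755) -> None:
    """Create path with the requested mode even under a restrictive umask."""
    old_umask = os.umask(0o022)  # only mask group/other write bits
    try:
        os.makedirs(path, mode=mode, exist_ok=True)
    finally:
        os.umask(old_umask)      # always restore the caller's umask
```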
Brad Warren
383a42851c Merge pull request #9461 from certbot/merge-2.0.x
Merge 2.0.x
2022-11-14 09:50:15 -08:00
Alex Zorin
f9962c3013 changelog: add 2.0 entries 2022-11-12 17:00:06 +11:00
Alex Zorin
a384886a15 changelog: update latest section to 2.0.0 2022-11-12 16:48:40 +11:00
Brad Warren
10f60bab0c Merge pull request #9460 from alexzorin/2.0.x
Merge `master` into `2.0.x`
2022-11-11 12:36:48 -08:00
Alex Zorin
202db15274 fix new mypy complaints 2022-11-11 18:03:57 +11:00
Alex Zorin
1773edcad0 Merge remote-tracking branch 'origin/master' into 2.0.x 2022-11-11 17:25:42 +11:00
Brad Warren
a8015fa102 Merge pull request #9457 from certbot/candidate-1.32.0
Release 1.32.0
2022-11-09 14:00:14 -08:00
Erica Portnoy
fd22bd0f66 Bump version to 1.33.0 2022-11-08 15:23:35 -08:00
Erica Portnoy
c087b6f6c9 Add contents to certbot/CHANGELOG.md for next version 2022-11-08 15:23:35 -08:00
Erica Portnoy
d88b9a5d11 Release 1.32.0 2022-11-08 15:23:34 -08:00
Erica Portnoy
dd2df86625 Update changelog for 1.32.0 release 2022-11-08 15:22:20 -08:00
alexzorin
7ab82b6f64 repin dependencies (#9454) 2022-11-02 12:32:00 -07:00
Brad Warren
9cf062d8d4 disable poetry's cache (#9453) 2022-11-02 10:23:57 -07:00
Kevin Jones
63de0ca9e6 Use https: protocol instead of deprecated git: protocol (#9452) 2022-10-31 14:17:50 -07:00
Will Greenberg
f73e062c7a Fix changelog entry (#9444)
* Fix changelog entry

* move to 1.32.0

Co-authored-by: Brad Warren <bmw@eff.org>
2022-11-01 07:22:07 +11:00
Will Greenberg
7865bbd39a Add comment explaining the load-bearing debug flags (#9443) 2022-10-27 14:47:29 +11:00
Will Greenberg
eed1afb808 certbot-apache: use httpd by default for CentOS/RHEL (#9402)
* certbot-apache: use httpd for newer RHEL derived distros

A change in RHEL 9 is causing apachectl to error out when used
with additional arguments, resulting in certbot errors. The CentOS
configurator now uses httpd instead for RHEL 9 (and later) derived
distros.

* Single CentOS class which uses the apache_bin option

* soothe mypy

* Always call super()._override_cmds()
2022-10-26 15:07:02 -07:00
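A hedged sketch of the selection logic described above; the helper, its arguments, and the distro check are hypothetical, not the configurator's real API:

```python
def apache_restart_binary(id_like: str, major_version: int) -> str:
    """Prefer httpd on RHEL 9+ derivatives, where apachectl rejects extra args."""
    if "rhel" in id_like.lower().split() and major_version >= 9:
        return "httpd"
    return "apachectl"

# e.g. Rocky Linux 9 reports ID_LIKE="rhel centos fedora" in /etc/os-release:
assert apache_restart_binary("rhel centos fedora", 9) == "httpd"
assert apache_restart_binary("debian", 11) == "apachectl"
```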
Brad Warren
529942fe4b Unpin poetry (#9438)
* unpin poetry

* export constraints
2022-10-21 10:59:33 +02:00
Brad Warren
3a738cadc3 Remove docker-compose dependency (#9436)
This is progress towards https://github.com/certbot/certbot/issues/9370 as discussed at https://github.com/certbot/certbot/pull/9435.

I kept the command using `docker-compose` because `docker compose` doesn't seem that widely recognized yet and https://www.docker.com/blog/announcing-compose-v2-general-availability/ describes aliasing `docker-compose` to `docker compose` on newer systems by default.

* refactor boulder shutdown

* remove docker-compose dep

* Reorder shutdown process
2022-10-20 13:07:18 -07:00
alexzorin
5270c34dd7 docs: use modern tsig-keygen util in certbot-dns-rfc2136 (#9424)
Fixes #7206.

I think it's about time we did this:

- `dnssec-keygen` on new distros doesn't support the HMAC algorithms anymore, so our instructions don't work.
- The oldest distros we support are Debian Buster (`9.11.5.P4+dfsg-5.1+deb10u7`) and CentOS 7 (`9.11.4-26.P2.el7_9.9`), which ship `tsig-keygen` and support `HMAC-SHA512`.
2022-10-17 16:55:00 -07:00
alexzorin
314ded348e docs: add third-party dns-multi plugin (#9430) 2022-10-13 17:58:18 -07:00
Phil Martin
92aaa9703b TSIG SOA query fix (#9408)
* Use the TSIG keyring for the initial SOA request

This allows keys in BIND ACLs to be used so that certbot updates the correct zone. Previously, TSIG was only used for zone updates, rather than for both the authoritative SOA request and the zone update.

* Update CHANGELOG.md

* Update AUTHORS.md

* Workaround for mypy failure due to dnspython stubs

As per https://github.com/certbot/certbot/pull/9408#issuecomment-1257868864

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-10-14 08:52:08 +11:00
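A hedged dnspython sketch of that behavior change (placeholder key name, secret, and server; not the plugin's exact code): the same TSIG keyring used for dynamic updates now also signs the initial SOA query, so a BIND ACL keyed on the TSIG key matches both requests.

```python
import dns.message
import dns.query
import dns.rdatatype
import dns.tsigkeyring

# Placeholder key name/secret and server address for illustration only.
keyring = dns.tsigkeyring.from_text({"keyname.": "MDEyMzQ1Njc4OWFiY2RlZg=="})
soa_query = dns.message.make_query("example.com.", dns.rdatatype.SOA)
soa_query.use_tsig(keyring, keyname="keyname.")  # previously the SOA query went unsigned
response = dns.query.tcp(soa_query, "192.0.2.1", timeout=10)
```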
alexzorin
f5e7d16303 don't superfluously ask whether to renew, when changing key type (#9421)
* don't superfluously ask whether to renew, when changing key type

* reorder conditions

this prevents "Certificate not yet due for renewal" being printed

* and replace superfluous mock

* mock renewal.should_renew
2022-10-06 14:29:58 -07:00
Brad Warren
a0b8a2cc62 Merge pull request #9426 from certbot/2.0-merge-master
2.0.x: merge master and bump version to 2.0.0.dev0
2022-10-06 12:04:35 -07:00
Alex Zorin
d5d8739783 bump version to 2.0.0.dev0 2022-10-05 05:17:29 +11:00
Alex Zorin
4fcc0f7c2a Merge branch 'master' into 2.0-merge-master 2022-10-05 05:15:39 +11:00
alexzorin
e84271b36b Merge pull request #9425 from certbot/candidate-1.31.0
Release 1.31.0
2022-10-05 05:09:37 +11:00
Brad Warren
3eac48ba5a Bump version to 1.32.0 2022-10-04 07:41:45 -07:00
Brad Warren
9409c086d4 Add contents to certbot/CHANGELOG.md for next version 2022-10-04 07:41:45 -07:00
Brad Warren
d0fbde9126 Release 1.31.0 2022-10-04 07:41:44 -07:00
Brad Warren
049e29cc1c Update changelog for 1.31.0 release 2022-10-04 07:40:41 -07:00
osirisinferi
e3448fa0d5 Fix typo in install.rst (#9422) 2022-10-02 10:06:27 +11:00
Alexis
2460d9ad0c Docs: Rewrite Installation Instructions: User Guide (#9220)
* Rewrite Installation Instructions: User Guide

Simplifying Installation instructions in User Guide

- First step in simplifying docs for Certbot Users

* Amend Install Doc

- Address errors
- Clean up links

* Update certbot/docs/install.rst

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/docs/install.rst

Co-authored-by: alexzorin <alex@zor.io>

* Update certbot/docs/install.rst

Co-authored-by: alexzorin <alex@zor.io>

* Amend instructions
- clarify requirements
- update outdated advice
- remove direct link

* Remove unintentionally added files

Co-authored-by: alexzorin <alex@zor.io>
2022-10-01 09:13:30 +10:00
Charlie Britton
4ec115cca5 Add single domain option for OVH DNS creds (#9419) 2022-09-29 19:06:41 -07:00
alexzorin
fdd2a7e937 plugins: remove support for dist:plugin plugin names (#9359)
* plugins: remove support for dist:plugin plugin names

* address feedback
2022-09-30 07:09:03 +10:00
Will Greenberg
26d479d6e3 Remove external mock dependency (#9331)
* Remove external mock dependency

This also removes the "external-mock" test environment

* remove superfluous ignores

* remove mock warning ignore from pytest.ini

* drop deps on mock in oldest, drop dep on types-mock

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-09-28 16:17:03 -07:00
Will Greenberg
c9eba6ccd3 Merge pull request #9353 from alexzorin/ecdsa-default-flag
change default key_type from rsa to ecdsa
2022-09-27 12:12:48 -07:00
Alex Zorin
5d6e067a74 fix tests broken by #9262 2022-09-27 13:51:35 +10:00
Alex Zorin
652c06a8ae fix typo in key conflict error message 2022-09-27 13:51:16 +10:00
Alex Zorin
f6d532a15b Merge remote-tracking branch 'origin/2.0.x' into ecdsa-default-flag 2022-09-27 12:38:20 +10:00
alexzorin
212c2ba990 error out when --reuse-key conflicts with other flags (#9262)
* error out when --reuse-key conflicts with other flags

* add unit test

* add integration tests

* lint
2022-09-27 12:37:24 +10:00
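The gist of that conflict check, as a hypothetical standalone function (argument names and error text are assumptions, not certbot's actual implementation):

```python
def check_reuse_key_conflicts(reuse_key: bool, new_key: bool,
                              key_type_changed: bool, key_size_changed: bool) -> None:
    """Refuse to silently ignore key parameters when the old key is being reused."""
    if reuse_key and not new_key and (key_type_changed or key_size_changed):
        raise ValueError(
            "Unable to change the key type or size while --reuse-key is set; "
            "pass --new-key to generate a new private key as well.")
```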
Brad Warren
c42dd567ca remove source_address arg (#9418) 2022-09-27 12:30:05 +10:00
osirisinferi
a845ab8446 Fix regression in Cloudflare library (#9417)
* Fix regression in CF library

* Add changelog entry

* Fix typo

Co-authored-by: alexzorin <alex@zor.io>

* Add note to docs

Co-authored-by: alexzorin <alex@zor.io>
2022-09-27 07:48:30 +10:00
Brad Warren
758cfb9f79 upgrade base docker image (#9415) 2022-09-26 20:36:08 +10:00
Will Greenberg
7c3b9043a1 Merge pull request #9414 from certbot/simplify-ci
Actually test everything in test- branches (besides deployment)
2022-09-22 14:09:20 -07:00
Brad Warren
e0b639397b actually test everything in test- branches 2022-09-22 07:07:43 -07:00
Brad Warren
db31a8c1f5 Upgrade dependency pinnings (#9412)
* upgrade dependencies

* remove unused ignore
2022-09-21 18:37:30 +10:00
osirisinferi
d214da191d Certbot-specific temp log dir prefix (#9406)
Fixes #9405.
2022-09-16 06:34:02 -07:00
Patrick Neumann
0326cbf95e Update generate_dnsplugins_snapcraft.sh (#9398)
There is no need for two interconnected (piped) processes.
The regular expression in the grep part is not strict enough in some cases (presence of long_description).
sed does not seem to support Perl regular expressions ("\s").
Some Python developers prefer single quotes to double quotes. Some even go so far as to adapt generated templates (setup.py).
This update will (hopefully) fix all of this.
This was tested on Ubuntu 20.04.5 LTS (Focal Fossa) and macOS 12.5.1 (Monterey).
2022-09-13 07:16:27 -07:00
ohemorange
314b2ef89b Merge pull request #9404 from certbot/master
Add 2.0 pre-release pipeline to 2.0.x branch
2022-09-12 15:56:54 -07:00
Brad Warren
39e8d14e1b Set up 2.0 pre-releases (#9400)
* update credential info

* update release tooling to use candidate channel

* split deploy jobs

* pass parameter through

* add 2.0 pipeline prerelease

* add comments

* quote file path
2022-09-09 14:23:39 -07:00
alexzorin
f4db687130 Merge pull request #9401 from alexzorin/update-2.0.x
Merge master into 2.0.x
2022-09-09 09:59:52 +10:00
Alex Zorin
63771b48bb Merge remote-tracking branch 'origin/master' into update-2.0.x 2022-09-09 08:37:56 +10:00
Brad Warren
80071c86f5 Merge pull request #9399 from certbot/candidate-1.30.0
Release 1.30.0
2022-09-08 10:28:45 -07:00
Will Greenberg
614eaf6898 Bump version to 1.31.0 2022-09-07 11:09:12 -07:00
Will Greenberg
0b284125d2 Add contents to certbot/CHANGELOG.md for next version 2022-09-07 11:09:12 -07:00
Will Greenberg
667b736879 Release 1.30.0 2022-09-07 11:09:11 -07:00
Will Greenberg
c68d4d6389 Update changelog for 1.30.0 release 2022-09-07 11:08:15 -07:00
Adrien Ferrand
9d736d5c9c Remove zope from Certbot (#9161)
* Remove zope and the internal reporter util.

* remove zope references from .pylintrc and pytest.ini

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-09-07 15:09:32 +10:00
Will Greenberg
529a0e2272 Remove deprecated functions (#9315)
* Remove deprecated functions

* rm unused imports

* actually remove execute_command!

* revert changelog

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-09-07 13:31:21 +10:00
Will Greenberg
a4a2315537 Removed deprecated functions (#9314)
* Removed deprecated functions

* rm import of distutils.version

* revert changelog

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-09-07 13:20:56 +10:00
alexzorin
5e247d1683 unexport attributes in certbot.display.util (#9358) 2022-09-07 13:00:05 +10:00
Will Greenberg
20ca9288d5 Add UI text recommending multi-domain certs (#9393)
* Suggest multi-domain certs in domain selection menu

* Update changelog

* lint: fix long line

Co-authored-by: Alex Zorin <alex@zorin.id.au>
2022-09-07 12:55:58 +10:00
alexzorin
804ca32314 acme: remove Client and BackwardsCompatibleClientV2 (#9356)
* acme: remove Client and BackwardsCompatibleClientV2

* remove ClientTestBase and some unused variables

* add ClientV2.get_directory

* tweak ToS callback code

* acme: update example to use ClientV2.get_directory

* simplify ToS callback further into one step

* further removal of acmev1-related code

- remove acme.client.ClientBase
- remove acme.mixins.VersionedLEACMEMixin
- remove acme.client.DER_CONTENT_TYPE
- remove various ACMEv1 special cases
- remove acme.messages.ChallengeResources.combinations

* remove .mixins.ResourceMixin, fields.resource, fields.Resource
and resource field from various .message classes.

* simplify acme.messages.Directory:

- remove Directory.register
- remove HasResourceType and GenericHasResourceType
- remove ability to look up Directory resources by anything other
  than the exact field name in RFC8555 (section 9.7.5)

* remove acme.messages.OLD_ERROR_PREFIX and support the old prefix

* remove acme.mixins

* reorder imports

* add comment to Directory about resource lookups

* s/new-cert/newOrder/

* get rid of `resource` silliness in tests

* remove acmev1 terms-of-service support from directory
2022-09-06 14:36:55 -07:00
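A hedged sketch of the post-removal bootstrap flow, modeled on the updated acme example mentioned in the bullets above; the exact `ClientV2.get_directory(url, net)` signature is assumed from this commit's description.

```python
import josepy as jose
from acme import client, messages
from cryptography.hazmat.primitives.asymmetric import rsa

DIRECTORY_URL = "https://acme-staging-v02.api.letsencrypt.org/directory"

acc_key = jose.JWKRSA(key=rsa.generate_private_key(public_exponent=65537, key_size=2048))
net = client.ClientNetwork(acc_key, user_agent="acme-example")
directory = client.ClientV2.get_directory(DIRECTORY_URL, net)  # added in this commit
client_v2 = client.ClientV2(directory, net=net)
regr = client_v2.new_account(messages.NewRegistration.from_data(
    email="admin@example.com", terms_of_service_agreed=True))
```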
alexzorin
c20d40ddba acme: further deprecations (#9395)
* acme: deprecate acme.fields.Resource and .resource

* acme: deprecate .messages.OLD_ERROR_PREFIX

* acme: deprecate .messages.Directory.register

* acme: clean up deprecations

* dont use unscoped filterwarnings

* change deprecation approach for acme.fields

* warn on non-string keys in acme.messages.Directory

* remove leaked filterwarnings in BackwardsCompatibleClientV2Test

* remove non-string lookups of acme.messages.Directory
2022-09-02 06:55:04 -07:00
alexzorin
f7e61edcb2 deprecate more attributes in acme (#9369)
* deprecate more attributes in acme

* Deprecate .Authorization.combinations by renaming the field and
  deprecating in getters/setters

* Silence deprecation warnings from our own imports of acme.mixins

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2022-08-30 14:41:53 -07:00
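The "rename the field and deprecate in getters/setters" pattern mentioned above, shown on a toy class (illustrative only, not acme.messages.Authorization itself):

```python
import warnings

class Authorization:
    def __init__(self, challenges=None, combinations=None):
        self.challenges = challenges
        self._combinations = combinations  # renamed storage for the deprecated field

    @property
    def combinations(self):
        warnings.warn("combinations is deprecated and will be removed",
                      DeprecationWarning, stacklevel=2)
        return self._combinations

    @combinations.setter
    def combinations(self, value):
        warnings.warn("combinations is deprecated and will be removed",
                      DeprecationWarning, stacklevel=2)
        self._combinations = value
```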
Brad Warren
f9d148be56 Upgrade CI OS (#9391)
* upgrade ubuntu

* upgrade macos

* use python3
2022-08-30 16:39:48 +10:00
Brad Warren
012314d946 Deprecate source address (#9389)
* deprecate source_address

* filter warnings

* fix route53 tests

* test warning

* update docstring
2022-08-30 10:28:47 +10:00
alexzorin
d8e45c286d apache: remove support for Apache 2.2 and CentOS 6 (#9354)
* apache: remove support for Apache 2.2 and CentOS 6

* delete more unused code

* remove unused attributes

* reorganize REWRITE_HTTPS_ARGS*
2022-08-29 10:05:48 -07:00
alexzorin
a81d58fa6e deprecate certbot-dns-cloudxns (#9367) 2022-08-27 07:25:37 +10:00
Brad Warren
cb632c376f encourage words before code (#9377) 2022-08-17 09:01:51 +10:00
Matthew W. Thomas
94bbb4c44c docs: add BunnyDNS to list of 3rd-party plugins (#9375)
* docs: add BunnyDNS to list of 3rd-party plugins

You can find the plugin here:
https://github.com/mwt/certbot-dns-bunny
It's for [BunnyDNS](https://bunny.net/dns/).

* Update AUTHORS.md
2022-08-12 14:03:08 -07:00
alexzorin
2574a8dfb5 remove all cloudxns-related code (#9361) 2022-08-10 11:01:11 -07:00
Gusmanov Timur
1b79c077a6 add dns-yandexcloud authentication plugin to third-party plugins (#9371) 2022-07-29 12:01:01 -07:00
Brad Warren
b73f3e2b16 pin back pylint (#9368) 2022-07-29 12:58:47 +10:00
alexzorin
42a4d30267 deps: remove pyjwt dependency (#9337)
* deps: remove pyjwt dependency

* pinning: strip extras from dependencies

`poetry export` outputs in requirements.txt format, which is now
apparently producing "dep[extra]==...". We are using this output
as the constraints file for pip and pip's new resolver does not
permit extras in the constraints file.

This change filters out the extras specifiers.

* repin current dependencies

* fix new pylint complaints

* silence lint about distutils.version

We have already deprecated the function and it'll be removed in
2.0.

* docs: set sphinx language to 'en'

this is emitting a warning and failing the build

* Revert "pinning: strip extras from dependencies"

This reverts commit 11268fd23160ac53fd8dad7a2ff15e453678e159.

* pin poetry back to avoid extras issue

* repin

* fix new mypy complaints in acme/
2022-07-28 17:26:12 -07:00
Brad Warren
e9e7a69c7b Update Azure Docker docs (#9363)
* describe docker access token

more

* Remove extra spaces

Co-authored-by: ohemorange <ebportnoy@gmail.com>

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-07-28 13:28:36 -07:00
Preston Locke
495b97aafe Clarify in docs that deletion does not revoke (#9348)
* Clarify in docs that deletion does not revoke

* Add myself to AUTHORS.md

* Move new paragraph below first note and change its wording
2022-07-26 16:03:53 -07:00
alexzorin
f82530d8c0 letstest: replace ubuntu 21.10 with 22.04 (#9364)
as ubuntu 21.10 is now EOL
2022-07-25 13:43:49 -07:00
alexzorin
ae7967c8ae docs: how to override the trusted CA certificates (#9357)
* docs: how to override the trusted CA certificates

* Update certbot/docs/using.rst

Co-authored-by: ohemorange <ebportnoy@gmail.com>

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-07-19 16:17:27 -07:00
Alex Zorin
82b6e15be7 change default key_type from rsa to ecdsa 2022-07-18 18:27:19 +10:00
Shahar Naveh
32608a142b DOC: Fix typo (#9346)
Co-authored-by: Shahar Naveh <>
2022-07-11 11:30:50 -07:00
Shahar Naveh
b9f6c3e5b6 DEP: Pin version of cryptography (#9339)
* DEP: Pin version of cryptography

* Added myself to authors:)

Co-authored-by: Shahar Naveh <>
2022-07-08 12:57:48 -07:00
ohemorange
184e087edf Prompt for username in finish_release.py (#9343)
The local machine's username may not be the same as the one on the CSS, so let's prompt for it instead.
2022-07-08 12:27:50 -07:00
Will Greenberg
1da36a9278 If a snap build times out, dump the logs (#9340) 2022-07-07 14:31:48 -07:00
Will Greenberg
2b1255cd6a finish_release.py: fix revision regex, add more logging (#9342) 2022-07-06 17:40:27 -07:00
ohemorange
c599aa08ad Merge pull request #9341 from certbot/candidate-1.29.0
Release 1.29.0
2022-07-06 13:18:26 -07:00
Will Greenberg
f1f526d63c Bump version to 1.30.0 2022-07-05 11:16:40 -07:00
Will Greenberg
ef0746eb1d Add contents to certbot/CHANGELOG.md for next version 2022-07-05 11:16:40 -07:00
Will Greenberg
befa4434ad Release 1.29.0 2022-07-05 11:16:39 -07:00
Will Greenberg
7e2105fca8 Update changelog for 1.29.0 release 2022-07-05 11:15:47 -07:00
Alexis
6e1696ba32 Add Signed Windows Installer Workflow (#9076)
* Add Code Signing action for Windows Installer

* Clean up variable names and input

* Amend and add to documentation per PR guidelines

* Update tools/finish_release.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update tools/finish_release.py

Amend typo

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Amend release script for better work flow

- SCP commands to upload and download unsigned & signed installers from CSS

* Collapse spaces

* Update tools/finish_release.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Create new windows signer function

* Update Windows Installer Script

- Update change log
- add new function for signing and document
- @TODO Streamline SSH session

* Remove Azure and Github release methods

- Methods moved to CSS
- Reduced to a ssh function that triggers the process on a CSS

* Amend Changelog and Remove Unneeded Deps

* Update tools/finish_release.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Add Version Fetch Function

- For the purpose of snap releases
- Add back package to dev extras for function

* Change path in ssh command

* Amend release script

* Amend the ssh command for CSS

* Update tools/finish_release.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update script with proper path and subprocess call

* Update ssh command

* Correct typo in path

* Fix typo in path

* Update certbot/CHANGELOG.md

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* Remove missed conflict text

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-06-29 15:52:50 -07:00
Amir Omidi
dedbdea1d9 Update generated CSRs to create V1 CSRs (#9334)
* Update generated CSRs to create V1 CSRs

Per the RFC: https://datatracker.ietf.org/doc/html/rfc2986#section-4

Version 3 CSRs, as far as I can tell, are not a thing (yet).

Relevant code in Go, for example: https://cs.opensource.google/go/go/+/refs/tags/go1.18.3:src/crypto/x509/x509.go;l=1979

* Update AUTHORS.md

* Unit test for PR #9334

* Add a small comment explaining this line for future readers.

* Add info to changelog

Co-authored-by: Paul Buonopane <paul@namepros.com>
2022-06-29 14:24:24 +10:00
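For context, a hedged pyOpenSSL sketch (not certbot's exact code): an X509Req defaults to version 0, which is the "v1" that RFC 2986 requires, so staying standard amounts to not bumping the version explicitly.

```python
from OpenSSL import crypto

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
req = crypto.X509Req()                 # version defaults to 0, i.e. a v1 CSR
req.get_subject().CN = "example.com"
# req.set_version(2)                   # an explicit bump like this yields non-standard "v3" CSRs
req.set_pubkey(key)
req.sign(key, "sha256")
csr_pem = crypto.dump_certificate_request(crypto.FILETYPE_PEM, req)
```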
Alexis Kim
b9f9952660 removed certbot-auto references from docs (#9333) 2022-06-28 11:43:57 +10:00
ohemorange
1d2540629f Use a different timeout for nightly vs daytime (release and extended) builds (#9330) 2022-06-22 18:06:53 -07:00
alexzorin
49f21bcc9f deps: bump pyOpenSSL in oldest pinnings (#9329) 2022-06-22 16:38:32 -07:00
ohemorange
885ebf80e3 Change snapcraft authentication to use SNAPCRAFT_STORE_CREDENTIALS (#9326)
* try the easy thing of just doing what the error message says

* temporarily add deploy stage to extended tests to see if it uploads properly

* follow instructions on https://forum.snapcraft.io/t/snapcraft-authentication-options/30473

* just run the packaging jobs for speed

* fix formatting

* import changes from test- branch and revert temporary changes

* Update instructions in deploy-stage.yml
2022-06-20 06:37:40 +10:00
Will Greenberg
7505bb0c60 Drop the snap build timeout to 90 minutes (#9320)
It was previously 5.5 hours, which was just to have an exception thrown
before Azure's 6 hour timeout. Generally we aren't seeing this step take
more than 45 minutes, so 90 minutes seems like more than enough.
2022-06-14 15:09:09 -07:00
Will Greenberg
99da999b2b Merge pull request #9318 from certbot/docs-clarify-plugin-contributions
docs: clarify that we're not merging any new plugins (not just DNS)
2022-06-13 11:37:52 -07:00
Alex Zorin
7197ae4b77 docs: clarify that we're not merging any new plugins (not just DNS) 2022-06-09 07:51:28 +10:00
osirisinferi
1a25c4052c Change query_registration() to use _get_v2_account() (#9307)
* Change `query_registration()` to use `_get_v2_account()`

* Improve `_get_v2_account()`

Required for proper working of `certbot.main.update_registration()`. This
function updates the `regr.body` locally instead of passing the fields
which need to be updated to `acme.client.update_registration()` as a
separate argument in the `update` parameter.

* Revert "Improve `_get_v2_account()`"

This reverts commit e88a23ad76b6dc092645a870b3b5f99bd4fbd095.

* Improve `_get_v2_account() (version 2)

Instead of e88a23a, this change should be more compatible with older
ACMEv1 accounts used through symlinking ACMEv2 account dirs to the
existing ACMEv1 account dirs.
It should also still be compatible with `certbot.main.update_registration`.

* Move and slightly update CHANGELOG entry
2022-06-09 07:49:40 +10:00
James Balazs
a73a86bbc0 Retry errors with subproblems in obtain_certificate with --allow-subset-of-names (#9251) (#9272)
* Handle CAA failure on finalize_order during renewal (#9251)

* Fix CAA error on renewal test

* Attempt to fix failing test in CI

* Retry errors with subproblems in obtain_certificate_from_csr with allow_subset_of_names

Only retry if not all domains succeeded

* Back out renewal changes

* Fix linting error line too long

* Update log message for more general case and only log on retry

* Changelog entry

* Add retry logic to order creation

* Changelog entry wording

* Fix acme error handling when no subproblems provided

* Fix test name

* Use summarize domain list to display list of failed domains

* Tidy up incorrect client tests

* Remove unused var and output all failed domains

* Add logging to failed authorization case

* use _retry_obtain_certificate for failed authorizations

* Fix typo failing in CI

* Retry logic comments

* Preserve original error

* Move changelog entry to latest version
2022-06-08 18:36:13 +10:00
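A rough sketch of the retry loop described by the bullets above, with a placeholder error type standing in for an ACME error that carries per-domain subproblems (neither name matches certbot's actual code):

```python
class OrderErrorWithSubproblems(Exception):
    """Placeholder for an ACME error whose subproblems identify failed domains."""
    def __init__(self, failed_domains):
        super().__init__("some identifiers failed")
        self.failed_domains = set(failed_domains)

def obtain_with_subset(issue, domains, allow_subset_of_names=True):
    """Retry issuance with only the domains that have not failed yet."""
    while domains:
        try:
            return issue(domains)
        except OrderErrorWithSubproblems as error:
            if not allow_subset_of_names:
                raise
            remaining = [d for d in domains if d not in error.failed_domains]
            if not remaining or remaining == domains:
                raise  # nothing succeeded, or no progress was made
            domains = remaining
    raise ValueError("no domains to request")
```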
alexzorin
3b211a6e1b Merge pull request #9317 from certbot/candidate-1.28.0
Candidate 1.28.0
2022-06-08 16:48:40 +10:00
Will Greenberg
4dd603f786 Bump version to 1.29.0 2022-06-07 12:43:12 -07:00
Will Greenberg
0dac0f173a Add contents to certbot/CHANGELOG.md for next version 2022-06-07 12:43:12 -07:00
Will Greenberg
b9f9ebc4fc Release 1.28.0 2022-06-07 12:43:11 -07:00
Will Greenberg
bcf1ce3f33 Update changelog for 1.28.0 release 2022-06-07 12:41:07 -07:00
alexzorin
295fc5e33a cli: fix help text for --no-autorenew (#9312) 2022-06-04 11:37:05 +10:00
Will Greenberg
d13131e303 Merge pull request #9309 from certbot/test-account-updates
certbot-ci: improve tests for update_account/show_account
2022-05-31 12:58:19 -07:00
Alex Zorin
7758a03b5b skip boulder for show_account assertions 2022-05-31 17:31:52 +10:00
Alex Zorin
cf63470db9 certbot-ci: improve tests for update_account/show_account 2022-05-31 17:02:43 +10:00
amplifi
5c111d0bd1 Cite Mozilla ssl-config in Apache/NGINX TLS configs (#8670) (#9295)
* Cite Mozilla ssl-config in Apache/nginx TLS configs (certbot#8670)

* Update CHANGELOG

* Add TLS config hashes to ALL_SSL_OPTIONS_HASHES

* Update wording in CHANGELOG
2022-05-13 10:59:49 -07:00
alexzorin
ec49b94acb acme: use order "status" to determine action during finalization (#9297)
Rather than deducing the status of an order by the "certificate"
and "error" fields, use the "status" field directly.
2022-05-13 09:51:11 -07:00
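A minimal sketch of that dispatch on the order's RFC 8555 "status" field (a hypothetical helper, not the acme library's poll/finalize code):

```python
def next_action_for_order(order_json: dict):
    """Decide what to do with a finalized order based on its "status" field."""
    status = order_json.get("status")
    if status == "valid":
        return order_json["certificate"]             # URL of the issued certificate
    if status == "invalid":
        raise RuntimeError(f"order failed: {order_json.get('error')}")
    if status in ("pending", "ready", "processing"):
        return None                                  # not finished yet; keep polling
    raise RuntimeError(f"unexpected order status: {status!r}")
```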
Brad Warren
7dd1e814fb Ignore parallel coverage files (#9293)
* ignore parallel coverage files

* Properly shutdown & close HTTP server
2022-05-07 13:31:59 +10:00
Brad Warren
2017669544 Merge pull request #9292 from certbot/candidate-1.27.0 2022-05-04 07:36:23 -07:00
Will Greenberg
8d7ced5e12 Bump version to 1.28.0 2022-05-03 11:35:09 -07:00
Will Greenberg
e593921560 Add contents to certbot/CHANGELOG.md for next version 2022-05-03 11:35:09 -07:00
Will Greenberg
373ff0e6e9 Release 1.27.0 2022-05-03 11:35:08 -07:00
Will Greenberg
103b8bc8f9 Update changelog for 1.27.0 release 2022-05-03 11:33:11 -07:00
Will Greenberg
828be0071e Add new signing key (#9288)
* Add new signing key

* Update certbot/CHANGELOG.md
2022-04-28 11:04:43 -07:00
Will Greenberg
71a3d8fffb Merge pull request #9289 from certbot/9184-fix-changelog
changelog: move entry for #9184
2022-04-27 12:19:53 -07:00
Alex Zorin
48155b1ec7 changelog: move entry for #9184 2022-04-27 13:19:42 +10:00
Will Greenberg
8066f230f5 If an installer is provided to certonly, restart after cert issuance (#9184)
* If an installer is provided to certonly, restart after cert issuance

* Add myself to AUTHORS.md

* Handle certonly's "installer" error case

* Handle interactive case, use lazy interpolation

* fix trailing whitespace

* fix whitespace in error message, re-raise exception

* Handle cases where user specified an authenticator but no installer

* make tox happy

* Clarify comment in selection.py

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* Add tests for the certonly installer changes

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-04-26 18:51:57 -07:00
Will Greenberg
3b6f3450c2 Add --debug to docker push (#9286)
This'll (hopefully) help us debug the connectivity issues during
the deploy CI
2022-04-22 08:07:59 -07:00
Richard "mtfnpy" Harman
20336266fd Add documentation on interactions between multiple views in BIND and the dns_rfc2136 plugin (#9284)
* Add documentation on interactions between multiple views in BIND and the dns_rfc2136 plugin

* Missing ; in example config

* Make lines shorter

* Missed one long line, and move Examples up in the documentation

* Apply suggestions from code review

Co-authored-by: alexzorin <alex@zor.io>

Co-authored-by: alexzorin <alex@zor.io>
2022-04-22 10:31:46 +10:00
Will Greenberg
549bc0a5fd Use win32 as platform in tox.ini (#9277)
This is used to match against sys.platform, which for windows is
win32 regardless of bitness
2022-04-19 07:40:46 +10:00
osirisinferi
0ca8ec6f7f Add missing closing parenthesis (#9279) 2022-04-13 11:47:19 +10:00
Brad Warren
df982b33b9 cleanup renewer defaults (#9274) 2022-04-09 19:20:03 +10:00
alexzorin
7a2c26fd22 docs: in contributing, ipdb→ipdb3 (#9271)
The binary is renamed in Python 3.
2022-04-07 23:27:16 +02:00
James Balazs
0fb5094250 Add subproblems to errors (#7046) (#9258)
* Add subproblems to errors (#7046)

* Fix can't assign attribute

* Tidy up string representations of errors and add decoders for subproblems / identifiers

* Add missing attributes to docstring

* Move change to 1.27.0 in changelog
2022-04-06 09:34:26 -07:00
Brad Warren
87216372dd Fix race condition and uncaught exception (#9264)
* Fix race condition and uncaught exception

* fix typo
2022-04-06 09:12:38 +10:00
alexzorin
b7df4416b5 Merge pull request #9267 from certbot/candidate-1.26.0
Update files from 1.26.0 release
2022-04-06 08:59:07 +10:00
Brad Warren
b9a7d771bc Bump version to 1.27.0 2022-04-05 10:43:01 -07:00
Brad Warren
3f8fde4270 Add contents to certbot/CHANGELOG.md for next version 2022-04-05 10:43:01 -07:00
Brad Warren
5b8cc18456 Release 1.26.0 2022-04-05 10:43:00 -07:00
Brad Warren
e8a1e6deb1 Update changelog for 1.26.0 release 2022-04-05 10:41:26 -07:00
alexzorin
b5a187841e certbot-ci: upgrade pebble to v2.3.1 (#9260) 2022-04-02 08:17:08 +11:00
alexzorin
d45a702649 changelog: clarify --new-key entry (#9259)
@osirisinferi pointed out [in chat](https://opensource.eff.org/eff-open-source/pl/y5whp5ny378wuedi8gd7995qbo) that the way this entry was written, suggested that `--new-key` might affect whether `--reuse-key` is set or not.

I think the second sentence was the main culprit, so I've nixed it and replaced it with a reminder about our other flags.

This maybe calls for a dedicated documentation section, but let's fix it quickly before the release.
2022-04-01 13:27:11 -07:00
alexzorin
fe0b637e4d display acme.Errors less verbosely (#9255)
* display acme.Errors less verbosely

* remove superfluous import
2022-03-31 13:48:47 -07:00
alexzorin
284023a1b7 Add --new-key (#9252)
* add --new-key

* add tests
2022-03-31 11:40:21 -07:00
osirisinferi
4456a6ba0b Add error message to account registration error (#9233)
* Add  message to account reg. error

* Changelog

* Remove forced lowercase first char

* Catch errors raised by acme library

* Fix mypy and add some comments

* Add some tests

* Move changelog entry to current version

* Address comments

* Address additional comments

Put everything in this commit instead of using the "Commit suggestion"
feature on GitHub, which would result in 4 different tiny commits.
2022-03-31 07:36:15 +11:00
Mads Jensen
142fcad28b Update various references to draft RFC to published versions. (#9250) 2022-03-28 17:26:06 -07:00
osirisinferi
1d45939cab Skip ToS agreement question if ToS value is None (#9245)
* Skip ToS agreement question if ToS value is None

* Add changelog entry

* Typo in CHANGELOG

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* Typo in CHANGELOG

Co-authored-by: ohemorange <ebportnoy@gmail.com>

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-03-24 15:42:47 -07:00
Will Greenberg
9ef6110e36 Point pip to filesystem packages rather than local HTTP server (#9240) 2022-03-24 13:32:03 -07:00
alexzorin
05a9ded297 pinning: update awscli pin (#9242) 2022-03-23 15:13:05 -07:00
alexzorin
690f62bae2 dns-ovh: increase default propagation timeout to 120s (#9244) 2022-03-23 15:07:29 -07:00
alexzorin
5404701111 windows: upgrade Python to 3.9.11 (#9241) 2022-03-18 10:03:49 +11:00
alexzorin
5ef18d905a Merge pull request #9238 from certbot/candidate-1.25.0
Release 1.25.0
2022-03-17 08:55:14 +11:00
Erica Portnoy
429bc553a0 Bump version to 1.26.0 2022-03-16 11:17:55 -07:00
Erica Portnoy
690c35530f Add contents to certbot/CHANGELOG.md for next version 2022-03-16 11:17:55 -07:00
Erica Portnoy
44c097fc05 Release 1.25.0 2022-03-16 11:17:54 -07:00
Erica Portnoy
cf6c511e91 Update changelog for 1.25.0 release 2022-03-16 11:16:28 -07:00
ohemorange
f58e3c5e92 Run repin.sh to pull in new version of cryptography, using OpenSSL 1.1.1n (#9237) 2022-03-15 16:46:58 -07:00
alexzorin
f54d9a3257 certbot-ci: fix boulder-v2 failures related to unexported challtestsrv port (#9235)
* certbot-ci: fix challtestsrv address for boulder-v2

The port is no longer exposed on the Docker host.

* vary the challtestsrv URL by acme server

* fix mypy

* fix comment

Co-authored-by: ohemorange <ebportnoy@gmail.com>

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2022-03-16 08:50:26 +11:00
Mads Jensen
ae41832f7c Update ACME spec links to point to RFC 8555. (#9232) 2022-03-13 07:53:45 +11:00
Mads Jensen
2b51661430 Remove cast for jose.fields. (#9228)
* Remove cast for jose.fields.

https://github.com/certbot/certbot/pull/9073 references this.

* Some of them can't be removed, though.

* Fix josepy type hints of json

* Increase josepy pinning version.

Note that the repin scripts have not been used.

* Run repin scripts.

* Fix constraints
2022-03-12 20:31:54 +11:00
alexzorin
ee2f5f5a0a pinning: work around poetry crash caused by bad 3rd party constraint (#9229) 2022-03-12 09:04:09 +11:00
259 changed files with 4302 additions and 7538 deletions

View File

@@ -9,6 +9,7 @@ variables:
# We don't publish our Docker images in this pipeline, but when building them
# for testing, let's use the nightly tag.
dockerTag: nightly
snapBuildTimeout: 5400
stages:
- template: templates/stages/test-and-package-stage.yml

View File

@@ -1,8 +1,18 @@
trigger: none
# We run the test suite on commits to master so codecov gets coverage data
# about the master branch and can use it to track coverage changes.
trigger:
- master
pr:
- master
- '*.x'
variables:
# We set this here to avoid coverage data being uploaded from things like our
# nightly pipeline. This is done because codecov (helpfully) keeps track of
# the number of coverage uploads for a commit and displays a warning when
# comparing two commits with an unequal number of uploads. Only uploading
# coverage here should keep the number of uploads it sees consistent.
uploadCoverage: true
jobs:
- template: templates/jobs/standard-tests-jobs.yml

View File

@@ -11,6 +11,7 @@ schedules:
variables:
dockerTag: nightly
snapBuildTimeout: 19800
stages:
- template: templates/stages/test-and-package-stage.yml

View File

@@ -8,11 +8,18 @@ pr: none
variables:
dockerTag: ${{variables['Build.SourceBranchName']}}
snapBuildTimeout: 19800
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
parameters:
snapReleaseChannel: beta
${{ if startsWith(variables['Build.SourceBranchName'], 'v2') }}:
snapReleaseChannel: beta
${{ elseif startsWith(variables['Build.SourceBranchName'], 'v1') }}:
snapReleaseChannel: candidate
${{ else }}:
# This should never happen
snapReleaseChannel: somethingInvalid
- template: templates/stages/notify-failure-stage.yml

View File

@@ -2,9 +2,9 @@ jobs:
- job: extended_test
variables:
- name: IMAGE_NAME
value: ubuntu-18.04
value: ubuntu-22.04
- name: PYTHON_VERSION
value: 3.10
value: 3.11
- group: certbot-common
strategy:
matrix:
@@ -14,12 +14,13 @@ jobs:
linux-py39:
PYTHON_VERSION: 3.9
TOXENV: py39
linux-py310:
PYTHON_VERSION: 3.10
TOXENV: py310
linux-py37-nopin:
PYTHON_VERSION: 3.7
TOXENV: py37
CERTBOT_NO_PIN: 1
linux-external-mock:
TOXENV: external-mock
linux-boulder-v2-integration-certbot-oldest:
PYTHON_VERSION: 3.7
TOXENV: integration-certbot-oldest
@@ -44,16 +45,20 @@ jobs:
PYTHON_VERSION: 3.10
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py311-integration:
PYTHON_VERSION: 3.11
TOXENV: integration
ACME_SERVER: boulder-v2
nginx-compat:
TOXENV: nginx_compat
linux-integration-rfc2136:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration-dns-rfc2136
docker-dev:
TOXENV: docker_dev
le-modification:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
TOXENV: modification
farmtest-apache2:
PYTHON_VERSION: 3.8

View File

@@ -1,17 +1,15 @@
jobs:
- job: docker_build
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
# The default timeout of 60 minutes is a little low for compiling
# cryptography on ARM architectures.
timeoutInMinutes: 180
@@ -37,7 +35,7 @@ jobs:
- job: docker_run
dependsOn: docker_build
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
steps:
- task: DownloadPipelineArtifact@2
inputs:
@@ -116,17 +114,15 @@ jobs:
displayName: Run certbot integration tests
- job: snaps_build
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
SNAP_ARCH: amd64
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
armhf:
SNAP_ARCH: armhf
arm64:
SNAP_ARCH: arm64
armhf:
SNAP_ARCH: armhf
arm64:
SNAP_ARCH: arm64
timeoutInMinutes: 0
steps:
- script: |
@@ -149,7 +145,7 @@ jobs:
git config --global user.name "$(Build.RequestedFor)"
mkdir -p ~/.local/share/snapcraft/provider/launchpad
cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
python3 tools/snap/build_remote.py ALL --archs ${SNAP_ARCH} --timeout 19800
python3 tools/snap/build_remote.py ALL --archs ${SNAP_ARCH} --timeout $(snapBuildTimeout)
displayName: Build snaps
- script: |
set -e
@@ -164,7 +160,7 @@ jobs:
- job: snap_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
steps:
- task: UsePythonVersion@0
inputs:
@@ -194,7 +190,7 @@ jobs:
- job: snap_dns_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
steps:
- script: |
set -e

View File

@@ -0,0 +1,75 @@
# As (somewhat) described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops#context,
# each template only has access to the parameters passed into it. To help make
# use of this design, we define snapReleaseChannel without a default value
# which requires the user of this template to define it as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema/parameters-name?view=azure-pipelines#remarks.
# This makes the user of this template be explicit while allowing them to
# define their own parameters with defaults that make sense for that context.
parameters:
- name: snapReleaseChannel
type: string
values:
- edge
- beta
- candidate
jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft export-login --channels=candidate,beta,edge snapcraft.cfg
# (provide the shared snapcraft credentials when prompted)
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file for use in the "nightly" and "release"
# pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#q-how-do-i-authorize-a-secure-file-for-use-in-a-specific-pipeline.
#
# This file has a maximum lifetime of one year and the current file will
# expire on 2023-09-06. The file will need to be updated before then to
# prevent automated deploys from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
pool:
vmImage: ubuntu-22.04
variables:
- group: certbot-common
strategy:
matrix:
amd64:
SNAP_ARCH: amd64
arm32v6:
SNAP_ARCH: armhf
arm64v8:
SNAP_ARCH: arm64
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps_$(SNAP_ARCH)
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
set -e
export SNAPCRAFT_STORE_CREDENTIALS=$(cat "$(snapcraftCfg.secureFilePath)")
for SNAP_FILE in snap/*.snap; do
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store

View File

@@ -1,17 +1,16 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.10
PYTHON_VERSION: 3.11
strategy:
matrix:
macos-py37-cover:
IMAGE_NAME: macOS-10.15
IMAGE_NAME: macOS-12
PYTHON_VERSION: 3.7
TOXENV: py37-cover
macos-py310-cover:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.10
TOXENV: py310-cover
TOXENV: cover
macos-cover:
IMAGE_NAME: macOS-12
TOXENV: cover
windows-py37:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.7
@@ -19,48 +18,45 @@ jobs:
windows-py39-cover:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: py39-cover-win
TOXENV: cover-win
windows-integration-certbot:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
linux-oldest-tests-2:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{dns,nginx}-oldest'
linux-py37:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: py37
linux-py310-cover:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.10
TOXENV: py310-cover
linux-py310-lint:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.10
linux-cover:
IMAGE_NAME: ubuntu-22.04
TOXENV: cover
linux-lint:
IMAGE_NAME: ubuntu-22.04
TOXENV: lint-posix
linux-py310-mypy:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.10
linux-mypy:
IMAGE_NAME: ubuntu-22.04
TOXENV: mypy-posix
linux-integration:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
TOXENV: apache_compat
apacheconftest:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
TOXENV: apacheconftest-with-pebble
nginxroundtrip:
IMAGE_NAME: ubuntu-18.04
IMAGE_NAME: ubuntu-22.04
TOXENV: nginxroundtrip
pool:
vmImage: $(IMAGE_NAME)

View File

@@ -1,77 +1,19 @@
parameters:
# We do not define acceptable values for this parameter here as it is passed
# through to ../jobs/snap-deploy-job.yml which does its own sanity checking.
- name: snapReleaseChannel
type: string
default: edge
values:
- edge
- beta
stages:
- stage: Deploy
jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft login (provide the shared snapcraft credentials when prompted)
# snapcraft export-login --channels=beta,edge snapcraft.cfg
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file for use in the "nightly" and "release"
# pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#q-how-do-i-authorize-a-secure-file-for-use-in-a-specific-pipeline.
#
# This file has a maximum lifetime of one year and the current
# file will expire on 2022-07-25 which is also tracked by
# https://github.com/certbot/certbot/issues/7931. The file will
# need to be updated before then to prevent automated deploys
# from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
pool:
vmImage: ubuntu-18.04
variables:
- group: certbot-common
strategy:
matrix:
amd64:
SNAP_ARCH: amd64
arm32v6:
SNAP_ARCH: armhf
arm64v8:
SNAP_ARCH: arm64
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps_$(SNAP_ARCH)
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
set -e
snapcraft login --with $(snapcraftCfg.secureFilePath)
for SNAP_FILE in snap/*.snap; do
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store
- template: ../jobs/snap-deploy-job.yml
parameters:
snapReleaseChannel: ${{ parameters.snapReleaseChannel }}
- job: publish_docker
pool:
vmImage: ubuntu-18.04
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
@@ -96,11 +38,16 @@ stages:
# which was created by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. "Grant access to all
# pipelines" should also be checked. To revoke these
# credentials, we can change the password on the certbotbot
# Docker Hub account or remove the account from the
# Certbot organization on Docker Hub.
# given to containerRegistry below. The authentication used when
# creating this service account was a personal access token
# rather than a password to bypass 2FA. When Brad set this up,
# Azure Pipelines failed to verify the credentials with an error
# like "access is forbidden with a JWT issued from a personal
# access token", but after saving them without verification, the
# access token worked when the pipeline actually ran. "Grant
# access to all pipelines" should also be checked on the service
# account. The access token can be deleted on Docker Hub if
# these credentials need to be revoked.
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy.sh $(dockerTag) $DOCKER_ARCH

View File

@@ -12,7 +12,7 @@ steps:
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python-dev \
python3-dev \
gcc \
libaugeas0 \
libssl-dev \
@@ -36,8 +36,8 @@ steps:
# problems with its lack of real dependency resolution.
- bash: |
set -e
python tools/pipstrap.py
python tools/pip_install.py -I tox virtualenv
python3 tools/pipstrap.py
python3 tools/pip_install.py -I tox virtualenv
displayName: Install runtime dependencies
- task: DownloadSecureFile@1
name: testFarmPem
@@ -49,9 +49,34 @@ steps:
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
python -m tox
python3 -m tox
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
displayName: Run tox
# For now, let's omit `set -e` and avoid the script exiting with a nonzero
# status code to prevent problems here from causing build failures. If
# this turns out to work well, we can change this.
- bash: |
python3 tools/pip_install.py -I coverage
case "$AGENT_OS" in
Darwin)
CODECOV_URL="https://uploader.codecov.io/latest/macos/codecov"
;;
Linux)
CODECOV_URL="https://uploader.codecov.io/latest/linux/codecov"
;;
Windows_NT)
CODECOV_URL="https://uploader.codecov.io/latest/windows/codecov.exe"
;;
*)
echo "Unexpected OS"
exit 0
esac
curl --retry 3 -o codecov "$CODECOV_URL"
chmod +x codecov
coverage xml
./codecov || echo "Uploading coverage data failed"
condition: and(eq(variables['uploadCoverage'], true), or(startsWith(variables['TOXENV'], 'cover'), startsWith(variables['TOXENV'], 'integration')))
displayName: Upload coverage data
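The case statement above simply maps the agent's operating system to the matching Codecov uploader. A rough Python equivalent, for illustration only (the URLs are copied from the script; the function name and error handling are made up):

import platform

# Uploader URLs copied from the bash step above. Note that platform.system()
# reports "Windows" where the Azure agent reports "Windows_NT".
CODECOV_UPLOADERS = {
    "Darwin": "https://uploader.codecov.io/latest/macos/codecov",
    "Linux": "https://uploader.codecov.io/latest/linux/codecov",
    "Windows": "https://uploader.codecov.io/latest/windows/codecov.exe",
}

def codecov_uploader_url() -> str:
    """Return the Codecov uploader URL for the current OS."""
    system = platform.system()
    try:
        return CODECOV_UPLOADERS[system]
    except KeyError:
        raise RuntimeError(f"Unexpected OS: {system}")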

7
.github/codecov.yml vendored Normal file
View File

@@ -0,0 +1,7 @@
# This disables all reporting from codecov. Let's just set it up to collect
# data for now and then we can play with the settings here.
comment: false
coverage:
status:
project: off
patch: off

View File

@@ -1,5 +1,6 @@
## Pull Request Checklist
- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due to our limited resources and the need to prioritize how we spend them.
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
- [ ] Add or update any documentation as needed to support the changes in this PR.
- [ ] Include your name in `AUTHORS.md` if you like.

35
.github/stale.yml vendored
View File

@@ -1,35 +0,0 @@
# Configuration for https://github.com/marketplace/stale
# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365
# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30
# Ignore issues with an assignee (defaults to false)
exemptAssignees: true
# Label to use when marking as stale
staleLabel: needs-update
# Comment to post when marking as stale. Set to `false` to disable
markComment: >
We've made a lot of changes to Certbot since this issue was opened. If you
still have this issue with an up-to-date version of Certbot, can you please
add a comment letting us know? This helps us to better see what issues are
still affecting our users. If there is no activity in the next 30 days, this
issue will be automatically closed.
# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
This issue has been closed due to lack of activity, but if you think it
should be reopened, please open a new issue with a link to this one and we'll
take a look.
# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1
# Don't mark pull requests as stale.
only: issues

42
.github/workflows/stale.yml vendored Normal file
View File

@@ -0,0 +1,42 @@
name: Update Stale Issues
on:
schedule:
# Run once a day at 01:24 UTC
- cron: '24 1 * * *'
permissions:
issues: write
jobs:
stale:
runs-on: ubuntu-latest
steps:
- uses: actions/stale@v6
with:
# REMOVEME: dry run to see if this works
debug-only: true
# Idle number of days before marking issues stale
days-before-issue-stale: 365
# Idle number of days before closing stale issues
days-before-issue-close: 30
# Ignore issues with an assignee
exempt-all-issue-assignees: true
# Label to use when marking as stale
stale-issue-label: needs-update
stale-issue-message: >
We've made a lot of changes to Certbot since this issue was opened. If you
still have this issue with an up-to-date version of Certbot, can you please
add a comment letting us know? This helps us to better see what issues are
still affecting our users. If there is no activity in the next 30 days, this
issue will be automatically closed.
close-issue-message: >
This issue has been closed due to lack of activity, but if you think it
should be reopened, please open a new issue with a link to this one and we'll
take a look.
# Limit the number of actions per hour, from 1-30. Default is 30
operations-per-run: 1

2
.gitignore vendored
View File

@@ -13,6 +13,7 @@ poetry.lock
# coverage
.coverage
.coverage.*
/htmlcov/
/.vagrant
@@ -58,3 +59,4 @@ certbot-dns*/certbot-dns*_arm*.txt
/certbot_amd64*.txt
/certbot_arm*.txt
certbot-dns*/snap
snapcraft.cfg

708
.pylintrc
View File

@@ -1,10 +1,65 @@
[MASTER]
[MAIN]
# use as many jobs as there are cores
jobs=0
# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no
# Specify a configuration file.
#rcfile=
# Load and enable all available extensions. Use --list-extensions to see a list
# all available extensions.
#enable-all-extensions=
# In error mode, messages with a category besides ERROR or FATAL are
# suppressed, and no reports are done by default. Error mode is compatible with
# disabling specific errors.
#errors-only=
# Always return a 0 (non-error) status code, even if lint errors are found.
# This is primarily useful in continuous integration scripts.
#exit-zero=
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-allow-list=
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code. (This is an alternative name to extension-pkg-allow-list
# for backward compatibility.)
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# Return non-zero exit code if any of these messages/categories are detected,
# even if score is above --fail-under value. Syntax same as enable. Messages
# specified are enabled, while categories only check already-enabled messages.
fail-on=
# Specify a score threshold under which the program will exit with error.
fail-under=10
# Interpret the stdin as a python script, whose filename needs to be passed as
# the module_or_package argument.
#from-stdin=
# Files or directories to be skipped. They should be base names, not paths.
ignore=CVS
# Add files or directories matching the regular expressions patterns to the
# ignore-list. The regex matches against paths and can be in Posix or Windows
# format. Because '\' represents the directory delimiter on Windows systems, it
# can't be used as an escape character.
ignore-paths=
# Files or directories matching the regular expression patterns are skipped.
# The regex matches against base names, not paths. The default value ignores
# Emacs file locks
ignore-patterns=^\.#
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis). It
# supports qualified module names, as well as Unix pattern matching.
ignored-modules=
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
@@ -13,42 +68,303 @@ jobs=0
# https://github.com/PyCQA/pylint/pull/3396.
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
# Profiled execution.
profile=no
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use, and will cap the count on Windows to
# avoid hangs.
jobs=0
# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=CVS
# Control the amount of potential inferred values when inferring a single
# object. This can help the performance when dealing with large functions or
# complex, nested conditions.
limit-inference-results=100
# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=linter_plugin
# Pickle collected data for later comparisons.
persistent=yes
# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=linter_plugin
# Minimum Python version to use for version dependent checks. Will default to
# the version used to run pylint.
py-version=3.10
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# Discover python modules and packages in the file system subtree.
recursive=no
# When enabled, pylint would attempt to guess common misconfiguration and emit
# user-friendly hints instead of false-positive error messages.
suggestion-mode=yes
# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no
# In verbose mode, extra non-checker-related info will be displayed.
#verbose=
[BASIC]
# Naming style matching correct argument names.
argument-naming-style=snake_case
# Regular expression matching correct argument names. Overrides argument-
# naming-style. If left empty, argument names will be checked with the set
# naming style.
#argument-rgx=
# Naming style matching correct attribute names.
attr-naming-style=snake_case
# Regular expression matching correct attribute names. Overrides attr-naming-
# style. If left empty, attribute names will be checked with the set naming
# style.
#attr-rgx=
# Bad variable names which should always be refused, separated by a comma.
bad-names=foo,
bar,
baz,
toto,
tutu,
tata
# Bad variable names regexes, separated by a comma. If names match any regex,
# they will always be refused
bad-names-rgxs=
# Naming style matching correct class attribute names.
class-attribute-naming-style=any
# Regular expression matching correct class attribute names. Overrides class-
# attribute-naming-style. If left empty, class attribute names will be checked
# with the set naming style.
#class-attribute-rgx=
# Naming style matching correct class constant names.
class-const-naming-style=UPPER_CASE
# Regular expression matching correct class constant names. Overrides class-
# const-naming-style. If left empty, class constant names will be checked with
# the set naming style.
#class-const-rgx=
# Naming style matching correct class names.
class-naming-style=PascalCase
# Regular expression matching correct class names. Overrides class-naming-
# style. If left empty, class names will be checked with the set naming style.
#class-rgx=
# Naming style matching correct constant names.
const-naming-style=UPPER_CASE
# Regular expression matching correct constant names. Overrides const-naming-
# style. If left empty, constant names will be checked with the set naming
# style.
#const-rgx=
# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1
# Naming style matching correct function names.
function-naming-style=snake_case
# Regular expression matching correct function names. Overrides function-
# naming-style. If left empty, function names will be checked with the set
# naming style.
function-rgx=[a-z_][a-z0-9_]{2,40}$
# Good variable names which should always be accepted, separated by a comma.
good-names=i,
j,
k,
ex,
Run,
_,
fd,
logger
# Good variable names regexes, separated by a comma. If names match any regex,
# they will always be accepted
good-names-rgxs=
# Include a hint for the correct naming format with invalid-name.
include-naming-hint=no
# Naming style matching correct inline iteration names.
inlinevar-naming-style=any
# Regular expression matching correct inline iteration names. Overrides
# inlinevar-naming-style. If left empty, inline iteration names will be checked
# with the set naming style.
#inlinevar-rgx=
# Naming style matching correct method names.
method-naming-style=snake_case
# Regular expression matching correct method names. Overrides method-naming-
# style. If left empty, method names will be checked with the set naming style.
method-rgx=[a-z_][a-z0-9_]{2,50}$
# Naming style matching correct module names.
module-naming-style=snake_case
# Regular expression matching correct module names. Overrides module-naming-
# style. If left empty, module names will be checked with the set naming style.
#module-rgx=
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=(__.*__)|(test_[A-Za-z0-9_]*)|(_.*)|(.*Test$)
# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
# These decorators are taken in consideration only for invalid-name.
property-classes=abc.abstractproperty
# Regular expression matching correct type variable names. If left empty, type
# variable names will be checked with the set naming style.
#typevar-rgx=
# Naming style matching correct variable names.
variable-naming-style=snake_case
# Regular expression matching correct variable names. Overrides variable-
# naming-style. If left empty, variable names will be checked with the set
# naming style.
variable-rgx=[a-z_][a-z0-9_]{1,30}$
[CLASSES]
# Warn about protected attribute access inside special methods
check-protected-access-in-special-methods=no
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,
__new__,
setUp,
__post_init__
# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls
# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=cls
[EXCEPTIONS]
# Exceptions that will emit a warning when caught.
overgeneral-exceptions=BaseException,
Exception
[FORMAT]
# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
# Number of spaces of indent required inside a hanging or continued line.
# git history told me that "This does something silly/broken..."
#indent-after-paren=4
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab).
indent-string=' '
# Maximum number of characters on a single line.
max-line-length=100
# Maximum number of lines in a module.
max-module-lines=1250
# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
single-line-class-stmt=no
# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no
[IMPORTS]
# List of modules that can be imported at any level, not just the top level
# one.
allow-any-import-level=
# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no
# Deprecated modules which should not be used, separated by a comma.
deprecated-modules=
# Output a graph (.gv or any supported image format) of external dependencies
# to the given file (report RP0402 must not be disabled).
ext-import-graph=
# Output a graph (.gv or any supported image format) of all (i.e. internal and
# external) dependencies to the given file (report RP0402 must not be
# disabled).
import-graph=
# Output a graph (.gv or any supported image format) of internal dependencies
# to the given file (report RP0402 must not be disabled).
int-import-graph=
# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=
# Force import order to recognize a module as part of a third party library.
known-third-party=enchant
# Couples of modules and preferred modules, separated by a comma.
preferred-modules=
[LOGGING]
# The type of string formatting that logging methods do. `old` means using %
# formatting, `new` is for `{}` formatting.
logging-format-style=old
# Logging modules to check that the string format arguments are in logging
# function parameter format.
logging-modules=logging,logger
[MESSAGES CONTROL]
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time. See also the "--disable" option for examples.
#enable=
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, CONTROL_FLOW, INFERENCE, INFERENCE_FAILURE,
# UNDEFINED.
confidence=HIGH,
CONTROL_FLOW,
INFERENCE,
INFERENCE_FAILURE,
UNDEFINED
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once).You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then re-enable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W".
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
@@ -74,261 +390,185 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# not need to enforce encoding on files so we disable this check.
# 7) consider-using-f-string is "suggesting" to move to f-string when possible with an error. This
# clearly relates to code design and not to potential defects in the code, let's just ignore that.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order,unspecified-encoding,consider-using-f-string
disable=fixme,locally-disabled,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue,raise-missing-from,wrong-import-order,unspecified-encoding,consider-using-f-string,raw-checker-failed,bad-inline-option,file-ignored,suppressed-message,useless-suppression,deprecated-pragma,use-symbolic-message-instead
[REPORTS]
# Set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html. You can also give a reporter class, eg
# mypackage.mymodule.MyReporterClass.
output-format=text
# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]".
files-output=no
# Tells whether to display a full report or only the messages
reports=yes
# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables errors warning, statement which
# respectively contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)
# Add a comment according to your evaluation note. This is used by the global
# evaluation report (RP0004).
comment=no
# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details
#msg-template=
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=c-extension-no-member
[BASIC]
[METHOD_ARGS]
# Required attributes for module, separated by a comma
required-attributes=
# List of builtins function names that should not be used, separated by a comma
bad-functions=map,filter,apply,input,file
# Good variable names which should always be accepted, separated by a comma
good-names=f,i,j,k,ex,Run,_,fd,logger
# Bad variable names which should always be refused, separated by a comma
bad-names=foo,bar,baz,toto,tutu,tata
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=
# Include a hint for the correct naming format with invalid-name
include-naming-hint=no
# Regular expression matching correct function names
function-rgx=[a-z_][a-z0-9_]{2,40}$
# Naming hint for function names
function-name-hint=[a-z_][a-z0-9_]{2,40}$
# Regular expression matching correct variable names
variable-rgx=[a-z_][a-z0-9_]{1,30}$
# Naming hint for variable names
variable-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct constant names
const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$
# Naming hint for constant names
const-name-hint=(([A-Z_][A-Z0-9_]*)|(__.*__))$
# Regular expression matching correct attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for attribute names
attr-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$
# Naming hint for argument names
argument-name-hint=[a-z_][a-z0-9_]{2,30}$
# Regular expression matching correct class attribute names
class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
# Naming hint for class attribute names
class-attribute-name-hint=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$
# Regular expression matching correct inline iteration names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$
# Naming hint for inline iteration names
inlinevar-name-hint=[A-Za-z_][A-Za-z0-9_]*$
# Regular expression matching correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$
# Naming hint for class names
class-name-hint=[A-Z_][a-zA-Z0-9]+$
# Regular expression matching correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Naming hint for module names
module-name-hint=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$
# Regular expression matching correct method names
method-rgx=[a-z_][a-z0-9_]{2,50}$
# Naming hint for method names
method-name-hint=[a-z_][a-z0-9_]{2,50}$
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=(__.*__)|(test_[A-Za-z0-9_]*)|(_.*)|(.*Test$)
# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1
# List of qualified names (i.e., library.method) which require a timeout
# parameter e.g. 'requests.api.get,requests.api.post'
timeout-methods=requests.api.delete,requests.api.get,requests.api.head,requests.api.options,requests.api.patch,requests.api.post,requests.api.put,requests.api.request
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,XXX,TODO
notes=FIXME,
XXX,
TODO
# Regular expression of note tags to take in consideration.
notes-rgx=
[LOGGING]
[REFACTORING]
# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging,logger
# Maximum number of nested blocks for function / method body
max-nested-blocks=5
# Complete name of functions that never returns. When checking for
# inconsistent-return-statements if a never returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=sys.exit,argparse.parse_error
[VARIABLES]
[REPORTS]
# Tells whether we should check for unused import in __init__ files.
init-import=no
# Python expression which should return a score less than or equal to 10. You
# have access to the variables 'fatal', 'error', 'warning', 'refactor',
# 'convention', and 'info' which contain the number of messages in each
# category, as well as 'statement' which is the total number of statements
# analyzed. This score is used by the global evaluation report (RP0004).
evaluation=max(0, 0 if fatal else 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10))
# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=(unused)?_.*|dummy
# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details.
msg-template=
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid to define new builtins when possible.
additional-builtins=
# Set the output format. Available formats are text, parseable, colorized, json
# and msvs (visual studio). You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
#output-format=
# Tells whether to display a full report or only the messages.
reports=no
# Activate the evaluation score.
score=yes
[SIMILARITIES]
# Comments are removed from the similarity computation
ignore-comments=yes
# Docstrings are removed from the similarity computation
ignore-docstrings=yes
# Imports are removed from the similarity computation
ignore-imports=yes
# Signatures are removed from the similarity computation
ignore-signatures=yes
# Minimum lines number of a similarity.
min-similarity-lines=6
# Ignore comments when computing similarities.
ignore-comments=yes
# Ignore docstrings when computing similarities.
ignore-docstrings=yes
[STRING]
# Ignore imports when computing similarities.
ignore-imports=yes
# This flag controls whether inconsistent-quotes generates a warning when the
# character used as a quote delimiter is used inconsistently within a module.
check-quote-consistency=no
[FORMAT]
# Maximum number of characters on a single line.
max-line-length=100
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no
# List of optional constructs for which whitespace checking is disabled
no-space-check=trailing-comma
# Maximum number of lines in a module
max-module-lines=1250
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab).
indent-string=' '
# Number of spaces of indent required inside a hanging or continued line.
# This does something silly/broken...
#indent-after-paren=4
# This flag controls whether the implicit-str-concat should generate a warning
# on implicit string concatenation in sequences defined over several lines.
check-str-concat-over-line-jumps=no
[TYPECHECK]
# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=
# Tells whether to warn about missing members when the owner of the attribute
# is inferred to be None.
ignore-none=yes
# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes
# List of symbolic message names to ignore for Mixin members.
ignored-checks-for-mixins=no-member,
not-async-context-manager,
not-context-manager,
attribute-defined-outside-init
# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace,Field,Header,JWS,closing
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis
ignored-modules=pkg_resources,confargparse,argparse
# import errors ignored only in 1.4.4
# https://bitbucket.org/logilab/pylint/commits/cd000904c9e2
# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
ignored-classes=Field,Header,JWS,closing
# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes
# When zope mode is activated, add a predefined set of Zope acquired attributes
# to generated-members.
zope=yes
# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E0201 when accessed. Python regular
# expressions are accepted.
generated-members=REQUEST,acl_users,aq_parent
# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1
# Regex pattern to define which classes are considered mixins.
mixin-class-rgx=.*[Mm]ixin
# List of decorators that change the signature of a decorated function.
signature-mutators=
[IMPORTS]
[VARIABLES]
# Deprecated modules which should not be used, separated by a comma
deprecated-modules=regsub,TERMIOS,Bastion,rexec
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=
# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report RP0402 must not be disabled)
import-graph=
# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes
# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=
# List of names allowed to shadow builtins
allowed-redefined-builtins=
# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=
# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
_cb
# A regular expression matching the name of dummy variables (i.e. expected to
# not be used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
[CLASSES]
# Argument names that match this expression will be ignored.
ignored-argument-names=_.*|^ignored_|^unused_
# List of interface methods to ignore, separated by a comma. This is used for
# instance to not check methods defined in Zope's Interface base class.
ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by,implementedBy,providedBy
# Tells whether we should check for unused import in __init__ files.
init-import=no
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp
# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls
# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs
[EXCEPTIONS]
# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=Exception
# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io
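As a quick illustration of the evaluation expression in the [REPORTS] section above, the sketch below mirrors the formula in Python and computes the score for a hypothetical run (all message counts are made up):

def pylint_score(fatal: int, error: int, warning: int, refactor: int,
                 convention: int, statement: int) -> float:
    """Mirror of the 'evaluation' expression from the [REPORTS] section."""
    if fatal:
        return 0.0
    return max(0.0, 10.0 - ((5 * error + warning + refactor + convention) / statement) * 10)

# Hypothetical run: 2 errors and 10 warnings across 1000 statements scores 9.8.
print(pylint_score(fatal=0, error=2, warning=10, refactor=0, convention=0, statement=1000))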

View File

@@ -17,7 +17,10 @@ Authors
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Alexis Hancock](https://github.com/zoracon)
* [Amir Omidi](https://github.com/aaomidi)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [amplifi](https://github.com/amplifi)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
@@ -115,6 +118,7 @@ Authors
* [Jacob Sachs](https://github.com/jsachs)
* [Jairo Llopis](https://github.com/Yajo)
* [Jakub Warmuz](https://github.com/kuba)
* [James Balazs](https://github.com/jamesbalazs)
* [James Kasten](https://github.com/jdkasten)
* [Jason Grinblat](https://github.com/ptychomancer)
* [Jay Faulkner](https://github.com/jayofdoom)
@@ -174,6 +178,7 @@ Authors
* [Mathieu Leduc-Hamel](https://github.com/mlhamel)
* [Matt Bostock](https://github.com/mattbostock)
* [Matthew Ames](https://github.com/SuperMatt)
* [Matthew W. Thomas](https://github.com/mwt)
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
@@ -198,18 +203,21 @@ Authors
* [osirisinferi](https://github.com/osirisinferi)
* Patrick Figel
* [Patrick Heppler](https://github.com/PatrickHeppler)
* [Paul Buonopane](https://github.com/Zenexer)
* [Paul Feitzinger](https://github.com/pfeyz)
* [Pavan Gupta](https://github.com/pavgup)
* [Pavel Pavlov](https://github.com/ghost355)
* [Peter Conrad](https://github.com/pconrad-fb)
* [Peter Eckersley](https://github.com/pde)
* [Peter Mosmans](https://github.com/PeterMosmans)
* [Phil Martin](https://github.com/frillip)
* [Philippe Langlois](https://github.com/langloisjp)
* [Philipp Spitzer](https://github.com/spitza)
* [Piero Steinger](https://github.com/Jadaw1n)
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Preston Locke](https://github.com/Preston12321)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
@@ -273,6 +281,7 @@ Authors
* [Wilfried Teiken](https://github.com/wteiken)
* [Willem Fibbe](https://github.com/fibbers)
* [William Budington](https://github.com/Hainish)
* [Will Greenberg](https://github.com/wgreenberg)
* [Will Newby](https://github.com/willnewby)
* [Will Oller](https://github.com/willoller)
* [Yan](https://github.com/diracdeltas)
@@ -283,3 +292,4 @@ Authors
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)
* [Shahar Naveh](https://github.com/ShaharNaveh)

View File

@@ -2,7 +2,7 @@
This module is an implementation of the `ACME protocol`_.
.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme
.. _`ACME protocol`: https://datatracker.ietf.org/doc/html/rfc8555
"""
import sys

View File

@@ -23,9 +23,6 @@ import requests
from acme import crypto_util
from acme import errors
from acme import fields
from acme.mixins import ResourceMixin
from acme.mixins import TypeMixin
logger = logging.getLogger(__name__)
@@ -47,12 +44,17 @@ class Challenge(jose.TypedJSONObjectWithFields):
return UnrecognizedChallenge.from_json(jobj)
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
class ChallengeResponse(jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge response."""
TYPES: Dict[str, Type['ChallengeResponse']] = {}
resource_type = 'challenge'
resource: str = fields.resource(resource_type)
def to_partial_json(self) -> Dict[str, Any]:
# Removes the `type` field which is inserted by TypedJSONObjectWithFields.to_partial_json.
# This field breaks RFC8555 compliance.
jobj = super().to_partial_json()
jobj.pop(self.type_field_name, None)
return jobj
class UnrecognizedChallenge(Challenge):
@@ -299,7 +301,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
"""Whitespace characters which should be ignored at the end of the body."""
def simple_verify(self, chall: 'HTTP01', domain: str, account_public_key: jose.JWK,
port: Optional[int] = None) -> bool:
port: Optional[int] = None, timeout: int = 30) -> bool:
"""Simple verify.
:param challenges.SimpleHTTP chall: Corresponding challenge.
@@ -307,6 +309,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
:param JWK account_public_key: Public key for the key pair
being authorized.
:param int port: Port used in the validation.
:param int timeout: Timeout in seconds.
:returns: ``True`` iff validation with the files currently served by the
HTTP server is successful.
@@ -328,7 +331,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
uri = chall.uri(domain)
logger.debug("Verifying %s at %s...", chall.typ, uri)
try:
http_response = requests.get(uri, verify=False)
http_response = requests.get(uri, verify=False, timeout=timeout)
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
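A minimal sketch of a caller taking advantage of the new timeout parameter (the challenge, response, and account key objects are assumed to come from an existing ACME flow):

import josepy as jose
from acme import challenges

def check_http01(response: challenges.HTTP01Response, chall: challenges.HTTP01,
                 domain: str, account_public_key: jose.JWK) -> bool:
    """Run the local HTTP-01 pre-check with a short timeout instead of the default 30s."""
    return response.simple_verify(chall, domain, account_public_key,
                                  port=80, timeout=5)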
@@ -408,7 +411,7 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
ACME_TLS_1_PROTOCOL = "acme-tls/1"
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
@property
def h(self) -> bytes:

File diff suppressed because it is too large.

View File

@@ -11,6 +11,7 @@ from typing import Callable
from typing import List
from typing import Mapping
from typing import Optional
from typing import Sequence
from typing import Set
from typing import Tuple
from typing import Union
@@ -39,7 +40,9 @@ class _DefaultCertSelection:
def __call__(self, connection: SSL.Connection) -> Optional[Tuple[crypto.PKey, crypto.X509]]:
server_name = connection.get_servername()
return self.certs.get(server_name, None)
if server_name:
return self.certs.get(server_name, None)
return None # pragma: no cover
class SSLSocket: # pylint: disable=too-few-public-methods
@@ -60,7 +63,8 @@ class SSLSocket: # pylint: disable=too-few-public-methods
method: int = _DEFAULT_SSL_METHOD,
alpn_selection: Optional[Callable[[SSL.Connection, List[bytes]], bytes]] = None,
cert_selection: Optional[Callable[[SSL.Connection],
Tuple[crypto.PKey, crypto.X509]]] = None
Optional[Tuple[crypto.PKey,
crypto.X509]]]] = None
) -> None:
self.sock = sock
self.alpn_selection = alpn_selection
@@ -71,8 +75,8 @@ class SSLSocket: # pylint: disable=too-few-public-methods
raise ValueError("Both cert_selection and certs specified.")
actual_cert_selection: Union[_DefaultCertSelection,
Optional[Callable[[SSL.Connection],
Tuple[crypto.PKey,
crypto.X509]]]] = cert_selection
Optional[Tuple[crypto.PKey,
crypto.X509]]]]] = cert_selection
if actual_cert_selection is None:
actual_cert_selection = _DefaultCertSelection(certs if certs else {})
self.cert_selection = actual_cert_selection
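With the widened annotation, a cert_selection callback may now return None when no certificate matches the requested server name. A minimal sketch of such a callback (the CERTS mapping and function name are hypothetical):

from typing import Dict, Optional, Tuple

from OpenSSL import SSL, crypto

# Hypothetical mapping from SNI server name to (key, certificate).
CERTS: Dict[bytes, Tuple[crypto.PKey, crypto.X509]] = {}

def select_cert(connection: SSL.Connection) -> Optional[Tuple[crypto.PKey, crypto.X509]]:
    """Return the certificate for the requested server name, or None if unknown."""
    server_name = connection.get_servername()
    if server_name is None:
        return None
    return CERTS.get(server_name)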
@@ -120,7 +124,14 @@ class SSLSocket: # pylint: disable=too-few-public-methods
def shutdown(self, *unused_args: Any) -> bool:
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
return self._wrapped.shutdown()
try:
return self._wrapped.shutdown()
except SSL.Error as error:
# We wrap the error so we raise the same error type as sockets
# in the standard library. This is useful when this object is
# used by code which expects a standard socket such as
# socketserver in the standard library.
raise socket.error(error)
def accept(self) -> Tuple[FakeConnection, Any]: # pylint: disable=missing-function-docstring
sock, addr = self.sock.accept()
@@ -135,6 +146,8 @@ class SSLSocket: # pylint: disable=too-few-public-methods
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
# This log line is especially desirable because without it requests to
# our standalone TLSALPN server would not be logged.
logger.debug("Performing handshake with %s", addr)
try:
ssl_sock.do_handshake()
@@ -148,7 +161,7 @@ class SSLSocket: # pylint: disable=too-few-public-methods
def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, # pylint: disable=too-many-arguments
method: int = _DEFAULT_SSL_METHOD, source_address: Tuple[str, int] = ('', 0),
alpn_protocols: Optional[List[str]] = None) -> crypto.X509:
alpn_protocols: Optional[Sequence[bytes]] = None) -> crypto.X509:
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -161,7 +174,7 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
of source interface). See `socket.creation_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `list` of `str`
:type alpn_protocols: `Sequence` of `bytes`
:raises acme.errors.Error: In case of any problems.
@@ -198,7 +211,9 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
client_ssl.shutdown()
except SSL.Error as error:
raise errors.Error(error)
return client_ssl.get_peer_certificate()
cert = client_ssl.get_peer_certificate()
assert cert # Appease mypy. We would have crashed out by now if there was no certificate.
return cert
def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]]] = None,
@@ -249,7 +264,8 @@ def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]
value=b"DER:30:03:02:01:05"))
csr.add_extensions(extensions)
csr.set_pubkey(private_key)
csr.set_version(2)
# RFC 2986 Section 4.1 only defines version 0
csr.set_version(0)
csr.sign(private_key, 'sha256')
return crypto.dump_certificate_request(
crypto.FILETYPE_PEM, csr)
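With the updated annotation, ALPN protocol names are passed to probe_sni as bytes rather than str. A usage sketch (the host and port are placeholders for a local TLS-ALPN-01 responder):

from acme import crypto_util

cert = crypto_util.probe_sni(
    name=b"example.com",
    host=b"127.0.0.1",
    port=5001,
    alpn_protocols=[b"acme-tls/1"],
)
print(cert.get_subject().CN)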

View File

@@ -51,22 +51,6 @@ class RFC3339Field(jose.Field):
raise jose.DeserializationError(error)
class Resource(jose.Field):
"""Resource MITM field."""
def __init__(self, resource_type: str, *args: Any, **kwargs: Any) -> None:
self.resource_type = resource_type
kwargs['default'] = resource_type
super().__init__('resource', *args, **kwargs)
def decode(self, value: Any) -> Any:
if value != self.resource_type:
raise jose.DeserializationError(
'Wrong resource type: {0} instead of {1}'.format(
value, self.resource_type))
return value
def fixed(json_name: str, value: Any) -> Any:
"""Generates a type-friendly Fixed field."""
return Fixed(json_name, value)
@@ -75,8 +59,3 @@ def fixed(json_name: str, value: Any) -> Any:
def rfc3339(json_name: str, omitempty: bool = False) -> Any:
"""Generates a type-friendly RFC3339 field."""
return RFC3339Field(json_name, omitempty=omitempty)
def resource(resource_type: str) -> Any:
"""Generates a type-friendly Resource field."""
return Resource(resource_type)

View File

@@ -1,18 +0,0 @@
"""Simple shim around the typing module.
This was useful when this code supported Python 2 and typing wasn't always
available. This code is being kept for now for backwards compatibility.
"""
import warnings
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
from typing import Any
warnings.warn("acme.magic_typing is deprecated and will be removed in a future release.",
DeprecationWarning)
class TypingClass:
"""Ignore import errors by getting anything"""
def __getattr__(self, name: str) -> Any:
return None # pragma: no cover

View File

@@ -11,9 +11,7 @@ from typing import MutableMapping
from typing import Optional
from typing import Tuple
from typing import Type
from typing import TYPE_CHECKING
from typing import TypeVar
from typing import Union
import josepy as jose
@@ -22,14 +20,8 @@ from acme import errors
from acme import fields
from acme import jws
from acme import util
from acme.mixins import ResourceMixin
if TYPE_CHECKING:
from typing_extensions import Protocol # pragma: no cover
else:
Protocol = object
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
ERROR_CODES = {
@@ -67,31 +59,93 @@ ERROR_CODES = {
ERROR_TYPE_DESCRIPTIONS = {**{
ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()
}, **{ # add errors with old prefix, deprecate me
OLD_ERROR_PREFIX + name: desc for name, desc in ERROR_CODES.items()
}}
def is_acme_error(err: BaseException) -> bool:
"""Check if argument is an ACME error."""
if isinstance(err, Error) and (err.typ is not None):
return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
return ERROR_PREFIX in err.typ
return False
class _Constant(jose.JSONDeSerializable, Hashable):
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
def __init__(self, name: str) -> None:
super().__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.name = name
def to_partial_json(self) -> str:
return self.name
@classmethod
def from_json(cls, jobj: str) -> '_Constant':
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(f'{cls.__name__} not recognized')
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self) -> str:
return f'{self.__class__.__name__}({self.name})'
def __eq__(self, other: Any) -> bool:
return isinstance(other, type(self)) and other.name == self.name
def __hash__(self) -> int:
return hash((self.__class__, self.name))
class IdentifierType(_Constant):
"""ACME identifier type."""
POSSIBLE_NAMES: Dict[str, _Constant] = {}
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
IDENTIFIER_IP = IdentifierType('ip') # IdentifierIP in pebble - not in Boulder yet
class Identifier(jose.JSONObjectWithFields):
"""ACME identifier.
:ivar IdentifierType typ:
:ivar str value:
"""
typ: IdentifierType = jose.field('type', decoder=IdentifierType.from_json)
value: str = jose.field('value')
class Error(jose.JSONObjectWithFields, errors.Error):
"""ACME error.
https://tools.ietf.org/html/draft-ietf-appsawg-http-problem-00
https://datatracker.ietf.org/doc/html/rfc7807
Note: Although Error inherits from JSONObjectWithFields, which is immutable,
we add mutability for Error to comply with the Python exception API.
:ivar str typ:
:ivar str title:
:ivar str detail:
:ivar Identifier identifier:
:ivar tuple subproblems: An array of ACME Errors which may be present when the CA
returns multiple errors related to the same request, `tuple` of `Error`.
"""
typ: str = jose.field('type', omitempty=True, default='about:blank')
title: str = jose.field('title', omitempty=True)
detail: str = jose.field('detail', omitempty=True)
identifier: Optional['Identifier'] = jose.field(
'identifier', decoder=Identifier.from_json, omitempty=True)
subproblems: Optional[Tuple['Error', ...]] = jose.field('subproblems', omitempty=True)
# Mypy does not understand the josepy magic happening here, and falsely claims
# that subproblems is redefined. Let's ignore the type check here.
@subproblems.decoder # type: ignore
def subproblems(value: List[Dict[str, Any]]) -> Tuple['Error', ...]: # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Error.from_json(subproblem) for subproblem in value)
@classmethod
def with_code(cls, code: str, **kwargs: Any) -> 'Error':
@@ -134,40 +188,21 @@ class Error(jose.JSONObjectWithFields, errors.Error):
return code
return None
# Hack to allow mutability on Errors (see GH #9539)
def __setattr__(self, name: str, value: Any) -> None:
return object.__setattr__(self, name, value)
def __str__(self) -> str:
return b' :: '.join(
result = b' :: '.join(
part.encode('ascii', 'backslashreplace') for part in
(self.typ, self.description, self.detail, self.title)
if part is not None).decode()
class _Constant(jose.JSONDeSerializable, Hashable):
"""ACME constant."""
__slots__ = ('name',)
POSSIBLE_NAMES: Dict[str, '_Constant'] = NotImplemented
def __init__(self, name: str) -> None:
super().__init__()
self.POSSIBLE_NAMES[name] = self # pylint: disable=unsupported-assignment-operation
self.name = name
def to_partial_json(self) -> str:
return self.name
@classmethod
def from_json(cls, jobj: str) -> '_Constant':
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(f'{cls.__name__} not recognized')
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self) -> str:
return f'{self.__class__.__name__}({self.name})'
def __eq__(self, other: Any) -> bool:
return isinstance(other, type(self)) and other.name == self.name
def __hash__(self) -> int:
return hash((self.__class__, self.name))
if self.identifier:
result = f'Problem for {self.identifier.value}: ' + result # pylint: disable=no-member
if self.subproblems and len(self.subproblems) > 0:
for subproblem in self.subproblems:
result += f'\n{subproblem}'
return result
class Status(_Constant):
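The __setattr__ override above is needed because, starting in Python 3.11, the interpreter assigns to an exception's __traceback__ attribute when it escapes a with block, and that assignment used to fail on the otherwise immutable Error class (see GH #9539). A minimal sketch exercising the now-permitted assignment (the error code and detail are made up):

from acme import messages

err = messages.Error.with_code('malformed', detail='simulated ACME problem')

# Before the __setattr__ hack this raised, because JSONObjectWithFields
# instances are immutable; Python 3.11 performs an equivalent assignment
# internally when the exception propagates out of a with block.
err.__traceback__ = None
print(err)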
@@ -185,45 +220,15 @@ STATUS_READY = Status('ready')
STATUS_DEACTIVATED = Status('deactivated')
class IdentifierType(_Constant):
"""ACME identifier type."""
POSSIBLE_NAMES: Dict[str, _Constant] = {}
IDENTIFIER_FQDN = IdentifierType('dns') # IdentifierDNS in Boulder
IDENTIFIER_IP = IdentifierType('ip') # IdentifierIP in pebble - not in Boulder yet
class Identifier(jose.JSONObjectWithFields):
"""ACME identifier.
:ivar IdentifierType typ:
:ivar str value:
"""
typ: IdentifierType = jose.field('type', decoder=IdentifierType.from_json)
value: str = jose.field('value')
class HasResourceType(Protocol):
"""
Represents a class with a resource_type class parameter of type string.
"""
resource_type: str = NotImplemented
GenericHasResourceType = TypeVar("GenericHasResourceType", bound=HasResourceType)
class Directory(jose.JSONDeSerializable):
"""Directory."""
"""Directory.
_REGISTERED_TYPES: Dict[str, Type[HasResourceType]] = {}
Directory resources must be accessed by the exact field name in RFC8555 (section 9.7.5).
"""
class Meta(jose.JSONObjectWithFields):
"""Directory Meta."""
_terms_of_service: str = jose.field('terms-of-service', omitempty=True)
_terms_of_service_v2: str = jose.field('termsOfService', omitempty=True)
_terms_of_service: str = jose.field('termsOfService', omitempty=True)
website: str = jose.field('website', omitempty=True)
caa_identities: List[str] = jose.field('caaIdentities', omitempty=True)
external_account_required: bool = jose.field('externalAccountRequired', omitempty=True)
@@ -235,7 +240,7 @@ class Directory(jose.JSONDeSerializable):
@property
def terms_of_service(self) -> str:
"""URL for the CA TOS"""
return self._terms_of_service or self._terms_of_service_v2
return self._terms_of_service
def __iter__(self) -> Iterator[str]:
# When iterating over fields, use the external name 'terms_of_service' instead of
@@ -246,41 +251,23 @@ class Directory(jose.JSONDeSerializable):
def _internal_name(self, name: str) -> str:
return '_' + name if name == 'terms_of_service' else name
@classmethod
def _canon_key(cls, key: Union[str, HasResourceType, Type[HasResourceType]]) -> str:
if isinstance(key, str):
return key
return key.resource_type
@classmethod
def register(cls,
resource_body_cls: Type[GenericHasResourceType]) -> Type[GenericHasResourceType]:
"""Register resource."""
resource_type = resource_body_cls.resource_type
assert resource_type not in cls._REGISTERED_TYPES
cls._REGISTERED_TYPES[resource_type] = resource_body_cls
return resource_body_cls
def __init__(self, jobj: Mapping[str, Any]) -> None:
canon_jobj = util.map_keys(jobj, self._canon_key)
# TODO: check that everything is an absolute URL; acme-spec is
# not clear on that
self._jobj = canon_jobj
self._jobj = jobj
def __getattr__(self, name: str) -> Any:
try:
return self[name.replace('_', '-')]
return self[name]
except KeyError as error:
raise AttributeError(str(error))
def __getitem__(self, name: Union[str, HasResourceType, Type[HasResourceType]]) -> Any:
def __getitem__(self, name: str) -> Any:
try:
return self._jobj[self._canon_key(name)]
return self._jobj[name]
except KeyError:
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
raise KeyError(f'Directory field "{name}" not found')
def to_partial_json(self) -> Dict[str, Any]:
return self._jobj
return util.map_keys(self._jobj, lambda k: k)
@classmethod
def from_json(cls, jobj: MutableMapping[str, Any]) -> 'Directory':
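As an editor's illustration of the Directory change above (the URLs below are placeholders, not real endpoints): entries are now looked up by their exact RFC 8555 field names, and the old dashed aliases and resource-class keys no longer resolve.

```python
from acme import messages

# Hypothetical directory payload; a real one comes from the ACME server.
directory = messages.Directory({
    'newAccount': 'https://example.test/acme/new-acct',
    'newOrder': 'https://example.test/acme/new-order',
})

assert directory['newOrder'] == 'https://example.test/acme/new-order'
assert directory.newOrder == directory['newOrder']  # attribute access uses the same exact name
# directory['new-order'] or directory[NewOrder] now raises KeyError.
```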
@@ -441,17 +428,12 @@ class Registration(ResourceBody):
return self._filter_contact(self.email_prefix)
@Directory.register
class NewRegistration(ResourceMixin, Registration):
class NewRegistration(Registration):
"""New registration."""
resource_type = 'new-reg'
resource: str = fields.resource(resource_type)
class UpdateRegistration(ResourceMixin, Registration):
class UpdateRegistration(Registration):
"""Update registration."""
resource_type = 'reg'
resource: str = fields.resource(resource_type)
class RegistrationResource(ResourceWithURI):
@@ -489,7 +471,6 @@ class ChallengeBody(ResourceBody):
# challenge object supports either one, but should be accessed through the
# name "uri". In Client.answer_challenge, whichever one is set will be
# used.
_uri: str = jose.field('uri', omitempty=True, default=None)
_url: str = jose.field('url', omitempty=True, default=None)
status: Status = jose.field('status', decoder=Status.from_json,
omitempty=True, default=STATUS_PENDING)
@@ -518,7 +499,7 @@ class ChallengeBody(ResourceBody):
@property
def uri(self) -> str:
"""The URL of this challenge."""
return self._url or self._uri
return self._url
def __getattr__(self, name: str) -> Any:
return getattr(self.chall, name)
@@ -527,10 +508,10 @@ class ChallengeBody(ResourceBody):
# When iterating over fields, use the external name 'uri' instead of
# the internal '_uri'.
for name in super().__iter__():
yield name[1:] if name == '_uri' else name
yield 'uri' if name == '_url' else name
def _internal_name(self, name: str) -> str:
return '_' + name if name == 'uri' else name
return '_url' if name == 'uri' else name
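For illustration (token and URL copied from the test fixtures, so treat them as placeholder data), a challenge body now carries the RFC 8555 'url' field, while client code keeps reading it through the uri property:

```python
from acme import messages

challb = messages.ChallengeBody.from_json({
    'type': 'dns',
    'token': 'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA',
    'url': 'http://challb',
    'status': 'pending',
})
# The legacy ACMEv1 'uri' field is gone; .uri is now backed only by '_url'.
assert challb.uri == 'http://challb'
```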
class ChallengeResource(Resource):
@@ -554,15 +535,12 @@ class Authorization(ResourceBody):
:ivar acme.messages.Identifier identifier:
:ivar list challenges: `list` of `.ChallengeBody`
:ivar tuple combinations: Challenge combinations (`tuple` of `tuple`
of `int`, as opposed to `list` of `list` from the spec).
:ivar acme.messages.Status status:
:ivar datetime.datetime expires:
"""
identifier: Identifier = jose.field('identifier', decoder=Identifier.from_json, omitempty=True)
challenges: List[ChallengeBody] = jose.field('challenges', omitempty=True)
combinations: Tuple[Tuple[int, ...], ...] = jose.field('combinations', omitempty=True)
status: Status = jose.field('status', omitempty=True, decoder=Status.from_json)
# TODO: 'expires' is allowed for Authorization Resources in
@@ -575,27 +553,16 @@ class Authorization(ResourceBody):
# Mypy does not understand the josepy magic happening here, and falsely claims
# that challenge is redefined. Let's ignore the type check here.
@challenges.decoder # type: ignore
def challenges(value: List[Dict[str, Any]]) -> Tuple[ChallengeBody, ...]: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
def challenges(value: List[Dict[str, Any]]) -> Tuple[ChallengeBody, ...]: # pylint: disable=no-self-argument,missing-function-docstring
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
def resolved_combinations(self) -> Tuple[Tuple[ChallengeBody, ...], ...]:
"""Combinations with challenges instead of indices."""
return tuple(tuple(self.challenges[idx] for idx in combo)
for combo in self.combinations) # pylint: disable=not-an-iterable
@Directory.register
class NewAuthorization(ResourceMixin, Authorization):
class NewAuthorization(Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource: str = fields.resource(resource_type)
class UpdateAuthorization(ResourceMixin, Authorization):
class UpdateAuthorization(Authorization):
"""Update authorization."""
resource_type = 'authz'
resource: str = fields.resource(resource_type)
class AuthorizationResource(ResourceWithURI):
@@ -609,16 +576,13 @@ class AuthorizationResource(ResourceWithURI):
new_cert_uri: str = jose.field('new_cert_uri', omitempty=True)
@Directory.register
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
"""ACME new-cert request.
class CertificateRequest(jose.JSONObjectWithFields):
"""ACME newOrder request.
:ivar jose.ComparableX509 csr:
`OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
"""
resource_type = 'new-cert'
resource: str = fields.resource(resource_type)
csr: jose.ComparableX509 = jose.field('csr', decoder=jose.decode_csr, encoder=jose.encode_csr)
@@ -635,16 +599,13 @@ class CertificateResource(ResourceWithURI):
authzrs: Tuple[AuthorizationResource, ...] = jose.field('authzrs')
@Directory.register
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
class Revocation(jose.JSONObjectWithFields):
"""Revocation message.
:ivar jose.ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
`jose.ComparableX509`
"""
resource_type = 'revoke-cert'
resource: str = fields.resource(resource_type)
certificate: jose.ComparableX509 = jose.field(
'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
reason: int = jose.field('reason')
@@ -675,7 +636,7 @@ class Order(ResourceBody):
# Mypy does not understand the josepy magic happening here, and falsely claims
# that identifiers is redefined. Let's ignore the type check here.
@identifiers.decoder # type: ignore
def identifiers(value: List[Dict[str, Any]]) -> Tuple[Identifier, ...]: # type: ignore[misc] # pylint: disable=no-self-argument,missing-function-docstring
def identifiers(value: List[Dict[str, Any]]) -> Tuple[Identifier, ...]: # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Identifier.from_json(identifier) for identifier in value)
@@ -701,7 +662,5 @@ class OrderResource(ResourceWithURI):
omitempty=True)
@Directory.register
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'

View File

@@ -1,68 +0,0 @@
"""Useful mixins for Challenge and Resource objects"""
from typing import Any
from typing import Dict
class VersionedLEACMEMixin:
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self) -> int:
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version: int) -> None:
# We need to use object.__setattr__ to not depend on the specific implementation of
# __setattr__ in current class (eg. jose.TypedJSONObjectWithFields raises AttributeError
# for any attempt to set an attribute to make objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key: str, value: Any) -> None:
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
else:
super().__setattr__(key, value) # pragma: no cover
class ResourceMixin(VersionedLEACMEMixin):
"""
This mixin generates a RFC8555 compliant JWS payload
by removing the `resource` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'resource')
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'resource')
class TypeMixin(VersionedLEACMEMixin):
"""
This mixin allows generation of a RFC8555 compliant JWS payload
by removing the `type` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(),
'to_partial_json', 'type')
def fields_to_partial_json(self) -> Dict[str, Any]:
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance: Any, jobj_method: str,
uncompliant_field: str) -> Dict[str, Any]:
if hasattr(instance, jobj_method):
jobj: Dict[str, Any] = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
raise AttributeError(f'Method {jobj_method}() is not implemented.') # pragma: no cover
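With the ACMEv1 code path gone, request bodies are RFC 8555 compliant without this mixin post-processing; a quick editor's check, mirroring the updated messages_test assertions later in this diff:

```python
from acme import messages

# No 'resource' field is emitted any more, so there is nothing left for the
# removed ResourceMixin to strip out.
assert messages.UpdateRegistration().json_dumps() == '{}'
```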

View File

@@ -46,10 +46,12 @@ class TLSServer(socketserver.TCPServer):
method=self.method))
def _cert_selection(self, connection: SSL.Connection
) -> Tuple[crypto.PKey, crypto.X509]: # pragma: no cover
) -> Optional[Tuple[crypto.PKey, crypto.X509]]: # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
return self.certs.get(server_name, None)
if server_name:
return self.certs.get(server_name, None)
return None
def server_bind(self) -> None:
self._wrap_sock()
@@ -151,14 +153,18 @@ class TLSALPN01Server(TLSServer, ACMEServerMixin):
def __init__(self, server_address: Tuple[str, int],
certs: List[Tuple[crypto.PKey, crypto.X509]],
challenge_certs: Mapping[str, Tuple[crypto.PKey, crypto.X509]],
challenge_certs: Mapping[bytes, Tuple[crypto.PKey, crypto.X509]],
ipv6: bool = False) -> None:
# We don't need to implement a request handler here because the work
# (including logging) is being done by wrapped socket set up in the
# parent TLSServer class.
TLSServer.__init__(
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
self, server_address, socketserver.BaseRequestHandler, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
def _cert_selection(self, connection: SSL.Connection) -> Tuple[crypto.PKey, crypto.X509]:
def _cert_selection(self, connection: SSL.Connection) -> Optional[Tuple[crypto.PKey,
crypto.X509]]:
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
@@ -167,8 +173,10 @@ class TLSALPN01Server(TLSServer, ACMEServerMixin):
# handshake in alpn_selection() if ALPN protos are not what we expect.
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs[server_name]
if server_name:
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs[server_name]
return None # pragma: no cover
def _alpn_selection(self, _connection: SSL.Connection, alpn_protos: List[bytes]) -> bytes:
"""Callback to select alpn protocol."""
@@ -303,16 +311,3 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
return functools.partial(
cls, simple_http_resources=simple_http_resources,
timeout=timeout)
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format: str, *args: Any) -> None: # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self) -> None:
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
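Two things change in this hunk: challenge_certs is now keyed by bytes (matching what connection.get_servername() returns), and the dedicated logging handler is dropped in favour of the stock socketserver.BaseRequestHandler. A hedged construction sketch with a throwaway self-signed certificate (the port and hostname are arbitrary):

```python
from OpenSSL import crypto
from acme.standalone import TLSALPN01Server

# Throwaway key/certificate purely for illustration.
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
cert = crypto.X509()
cert.get_subject().CN = 'example.com'
cert.set_issuer(cert.get_subject())
cert.set_pubkey(key)
cert.gmtime_adj_notBefore(0)
cert.gmtime_adj_notAfter(3600)
cert.sign(key, 'sha256')

server = TLSALPN01Server(
    ('', 5001),
    certs=[(key, cert)],
    challenge_certs={b'example.com': (key, cert)},  # note: bytes SNI keys
)
```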

View File

@@ -163,7 +163,7 @@ def example_http():
# Register account and accept TOS
net = client.ClientNetwork(acc_key, user_agent=USER_AGENT)
directory = messages.Directory.from_json(net.get(DIRECTORY_URL).json())
directory = client.ClientV2.get_directory(DIRECTORY_URL, net)
client_acme = client.ClientV2(directory, net=net)
# Terms of Service URL is in client_acme.directory.meta.terms_of_service
@@ -215,8 +215,7 @@ def example_http():
try:
regr = client_acme.query_registration(regr)
except errors.Error as err:
if err.typ == messages.OLD_ERROR_PREFIX + 'unauthorized' \
or err.typ == messages.ERROR_PREFIX + 'unauthorized':
if err.typ == messages.ERROR_PREFIX + 'unauthorized':
# Status is deactivated.
pass
raise
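An editor's sketch of the updated bootstrap, condensed from the example script (the staging directory URL, user agent string, and throwaway account key below are assumptions for illustration):

```python
import josepy as jose
from cryptography.hazmat.primitives.asymmetric import rsa

from acme import client

DIRECTORY_URL = 'https://acme-staging-v02.api.letsencrypt.org/directory'  # assumed endpoint
acc_key = jose.JWKRSA(key=rsa.generate_private_key(public_exponent=65537, key_size=2048))

net = client.ClientNetwork(acc_key, user_agent='python-acme-example')
# get_directory() replaces the manual Directory.from_json(net.get(...).json()) step.
directory = client.ClientV2.get_directory(DIRECTORY_URL, net)
client_acme = client.ClientV2(directory, net=net)
```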

View File

@@ -3,16 +3,15 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '1.25.0.dev0'
version = '2.3.0.dev0'
install_requires = [
'cryptography>=2.5.0',
'josepy>=1.10.0',
'PyOpenSSL>=17.3.0',
'josepy>=1.13.0',
'PyOpenSSL>=17.5.0',
'pyrfc3339',
'pytz>=2019.3',
'requests>=2.20.0',
'requests-toolbelt>=0.3.0',
'setuptools>=41.6.0',
]
@@ -46,6 +45,7 @@ setup(
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],

View File

@@ -92,8 +92,7 @@ class DNS01ResponseTest(unittest.TestCase):
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.assertEqual({}, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import DNS01Response
@@ -163,8 +162,7 @@ class HTTP01ResponseTest(unittest.TestCase):
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.assertEqual({}, self.msg.to_partial_json())
def test_from_json(self):
from acme.challenges import HTTP01Response
@@ -185,7 +183,8 @@ class HTTP01ResponseTest(unittest.TestCase):
mock_get.return_value = mock.MagicMock(text=validation)
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
timeout=mock.ANY)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_bad_validation(self, mock_get):
@@ -201,7 +200,8 @@ class HTTP01ResponseTest(unittest.TestCase):
HTTP01Response.WHITESPACE_CUTSET))
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
timeout=mock.ANY)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_connection_error(self, mock_get):
@@ -217,6 +217,16 @@ class HTTP01ResponseTest(unittest.TestCase):
self.assertEqual("local:8080", urllib_parse.urlparse(
mock_get.mock_calls[0][1][0]).netloc)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_timeout(self, mock_get):
self.response.simple_verify(self.chall, "local", KEY.public_key())
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
timeout=30)
mock_get.reset_mock()
self.response.simple_verify(self.chall, "local", KEY.public_key(), timeout=1234)
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False,
timeout=1234)
class HTTP01Test(unittest.TestCase):
@@ -274,8 +284,7 @@ class TLSALPN01ResponseTest(unittest.TestCase):
}
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.response.to_partial_json())
self.assertEqual({}, self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
@@ -328,12 +337,12 @@ class TLSALPN01ResponseTest(unittest.TestCase):
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host=b'127.0.0.1', port=self.response.PORT, name=b'foo.com',
alpn_protocols=['acme-tls/1'])
alpn_protocols=[b'acme-tls/1'])
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host=b'8.8.8.8', port=mock.ANY, name=b'foo.com',
alpn_protocols=['acme-tls/1'])
alpn_protocols=[b'acme-tls/1'])
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
@@ -461,8 +470,6 @@ class DNSResponseTest(unittest.TestCase):
from acme.challenges import DNSResponse
self.msg = DNSResponse(validation=self.validation)
self.jmsg_to = {
'resource': 'challenge',
'type': 'dns',
'validation': self.validation,
}
self.jmsg_from = {
@@ -492,7 +499,6 @@ class JWSPayloadRFC8555Compliant(unittest.TestCase):
from acme.challenges import HTTP01Response
challenge_body = HTTP01Response()
challenge_body.le_acme_version = 2
jobj = challenge_body.json_dumps(indent=2).encode()
# RFC8555 states that challenge responses must have an empty payload.

File diff suppressed because it is too large

View File

@@ -314,6 +314,14 @@ class MakeCSRTest(unittest.TestCase):
def test_make_csr_without_hostname(self):
self.assertRaises(ValueError, self._call_with_key)
def test_make_csr_correct_version(self):
csr_pem = self._call_with_key(["a.example"])
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
self.assertEqual(csr.get_version(), 0,
"Expected CSR version to be v1 (encoded as 0), per RFC 2986, section 4")
class DumpPyopensslChainTest(unittest.TestCase):
"""Test for dump_pyopenssl_chain."""

View File

@@ -1,6 +1,7 @@
"""Tests for acme.fields."""
import datetime
import unittest
import warnings
import josepy as jose
import pytz
@@ -54,19 +55,5 @@ class RFC3339FieldTest(unittest.TestCase):
jose.DeserializationError, RFC3339Field.default_decoder, '')
class ResourceTest(unittest.TestCase):
"""Tests for acme.fields.Resource."""
def setUp(self):
from acme.fields import Resource
self.field = Resource('x')
def test_decode_good(self):
self.assertEqual('x', self.field.decode('x'))
def test_decode_wrong(self):
self.assertRaises(jose.DeserializationError, self.field.decode, 'y')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,30 +0,0 @@
"""Tests for acme.magic_typing."""
import sys
import unittest
import warnings
from unittest import mock
class MagicTypingTest(unittest.TestCase):
"""Tests for acme.magic_typing."""
def test_import_success(self):
try:
import typing as temp_typing
except ImportError: # pragma: no cover
temp_typing = None # pragma: no cover
typing_class_mock = mock.MagicMock()
text_mock = mock.MagicMock()
typing_class_mock.Text = text_mock
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
with warnings.catch_warnings():
warnings.filterwarnings("ignore", category=DeprecationWarning)
from acme.magic_typing import Text
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,7 +1,9 @@
"""Tests for acme.messages."""
from typing import Dict
import unittest
import contextlib
from unittest import mock
import warnings
import josepy as jose
@@ -17,7 +19,7 @@ class ErrorTest(unittest.TestCase):
"""Tests for acme.messages.Error."""
def setUp(self):
from acme.messages import Error, ERROR_PREFIX
from acme.messages import Error, ERROR_PREFIX, Identifier, IDENTIFIER_FQDN
self.error = Error.with_code('malformed', detail='foo', title='title')
self.jobj = {
'detail': 'foo',
@@ -25,6 +27,9 @@ class ErrorTest(unittest.TestCase):
'type': ERROR_PREFIX + 'malformed',
}
self.error_custom = Error(typ='custom', detail='bar')
self.identifier = Identifier(typ=IDENTIFIER_FQDN, value='example.com')
self.subproblem = Error.with_code('caa', detail='bar', title='title', identifier=self.identifier)
self.error_with_subproblems = Error.with_code('malformed', detail='foo', title='title', subproblems=[self.subproblem])
self.empty_error = Error()
def test_default_typ(self):
@@ -39,6 +44,14 @@ class ErrorTest(unittest.TestCase):
from acme.messages import Error
hash(Error.from_json(self.error.to_json()))
def test_from_json_with_subproblems(self):
from acme.messages import Error
parsed_error = Error.from_json(self.error_with_subproblems.to_json())
self.assertEqual(1, len(parsed_error.subproblems))
self.assertEqual(self.subproblem, parsed_error.subproblems[0])
def test_description(self):
self.assertEqual('The request message was malformed', self.error.description)
self.assertIsNone(self.error_custom.description)
@@ -73,6 +86,23 @@ class ErrorTest(unittest.TestCase):
str(self.error),
u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}"
.format(self.error))
self.assertEqual(
str(self.error_with_subproblems),
(u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}\n"+
u"Problem for {1.identifier.value}: {1.typ} :: {1.description} :: {1.detail} :: {1.title}").format(
self.error_with_subproblems, self.subproblem))
# this test is based on a minimal reproduction of a contextmanager/immutable
# exception related error: https://github.com/python/cpython/issues/99856
def test_setting_traceback(self):
self.assertIsNone(self.error_custom.__traceback__)
try:
1/0
except ZeroDivisionError as e:
self.error_custom.__traceback__ = e.__traceback__
self.assertIsNotNone(self.error_custom.__traceback__)
class ConstantTest(unittest.TestCase):
@@ -119,8 +149,8 @@ class DirectoryTest(unittest.TestCase):
def setUp(self):
from acme.messages import Directory
self.dir = Directory({
'new-reg': 'reg',
mock.MagicMock(resource_type='new-cert'): 'cert',
'newReg': 'reg',
'newCert': 'cert',
'meta': Directory.Meta(
terms_of_service='https://example.com/acme/terms',
website='https://www.example.com/',
@@ -133,26 +163,23 @@ class DirectoryTest(unittest.TestCase):
Directory({'foo': 'bar'})
def test_getitem(self):
self.assertEqual('reg', self.dir['new-reg'])
from acme.messages import NewRegistration
self.assertEqual('reg', self.dir[NewRegistration])
self.assertEqual('reg', self.dir[NewRegistration()])
self.assertEqual('reg', self.dir['newReg'])
def test_getitem_fails_with_key_error(self):
self.assertRaises(KeyError, self.dir.__getitem__, 'foo')
def test_getattr(self):
self.assertEqual('reg', self.dir.new_reg)
self.assertEqual('reg', self.dir.newReg)
def test_getattr_fails_with_attribute_error(self):
self.assertRaises(AttributeError, self.dir.__getattr__, 'foo')
def test_to_json(self):
self.assertEqual(self.dir.to_json(), {
'new-reg': 'reg',
'new-cert': 'cert',
'newReg': 'reg',
'newCert': 'cert',
'meta': {
'terms-of-service': 'https://example.com/acme/terms',
'termsOfService': 'https://example.com/acme/terms',
'website': 'https://www.example.com/',
'caaIdentities': ['example.com'],
},
@@ -272,7 +299,7 @@ class UpdateRegistrationTest(unittest.TestCase):
def test_empty(self):
from acme.messages import UpdateRegistration
jstring = '{"resource": "reg"}'
self.assertEqual(jstring, UpdateRegistration().json_dumps())
self.assertEqual('{}', UpdateRegistration().json_dumps())
self.assertEqual(
UpdateRegistration(), UpdateRegistration.json_loads(jstring))
@@ -320,7 +347,7 @@ class ChallengeBodyTest(unittest.TestCase):
error=error)
self.jobj_to = {
'uri': 'http://challb',
'url': 'http://challb',
'status': self.status,
'type': 'dns',
'token': 'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA',
@@ -367,20 +394,17 @@ class AuthorizationTest(unittest.TestCase):
chall=challenges.DNS(
token=b'DGyRejmCefe7v4NfDGDKfA')),
)
combinations = ((0,), (1,))
from acme.messages import Authorization
from acme.messages import Identifier
from acme.messages import IDENTIFIER_FQDN
identifier = Identifier(typ=IDENTIFIER_FQDN, value='example.com')
self.authz = Authorization(
identifier=identifier, combinations=combinations,
challenges=self.challbs)
identifier=identifier, challenges=self.challbs)
self.jobj_from = {
'identifier': identifier.to_json(),
'challenges': [challb.to_json() for challb in self.challbs],
'combinations': combinations,
}
def test_from_json(self):
@@ -391,12 +415,6 @@ class AuthorizationTest(unittest.TestCase):
from acme.messages import Authorization
hash(Authorization.from_json(self.jobj_from))
def test_resolved_combinations(self):
self.assertEqual(self.authz.resolved_combinations, (
(self.challbs[0],),
(self.challbs[1],),
))
class AuthorizationResourceTest(unittest.TestCase):
"""Tests for acme.messages.AuthorizationResource."""
@@ -487,7 +505,6 @@ class JWSPayloadRFC8555Compliant(unittest.TestCase):
from acme.messages import NewAuthorization
new_order = NewAuthorization()
new_order.le_acme_version = 2
jobj = new_order.json_dumps(indent=2).encode()
# RFC8555 states that JWS bodies must not have a resource field.

View File

@@ -136,20 +136,18 @@ def included_in_paths(filepath: str, paths: Iterable[str]) -> bool:
return any(fnmatch.fnmatch(filepath, path) for path in paths)
def parse_defines(apachectl: str) -> Dict[str, str]:
def parse_defines(define_cmd: List[str]) -> Dict[str, str]:
"""
Gets Defines from httpd process and returns a dictionary of
the defined variables.
:param str apachectl: Path to apachectl executable
:param list define_cmd: httpd command to dump defines
:returns: dictionary of defined variables
:rtype: dict
"""
variables: Dict[str, str] = {}
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
@@ -165,33 +163,31 @@ def parse_defines(apachectl: str) -> Dict[str, str]:
return variables
def parse_includes(apachectl: str) -> List[str]:
def parse_includes(inc_cmd: List[str]) -> List[str]:
"""
Gets Include directives from httpd process and returns a list of
their values.
:param str apachectl: Path to apachectl executable
:param list inc_cmd: httpd command to dump includes
:returns: list of found Include directive values
:rtype: list of str
"""
inc_cmd: List[str] = [apachectl, "-t", "-D", "DUMP_INCLUDES"]
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
def parse_modules(apachectl: str) -> List[str]:
def parse_modules(mod_cmd: List[str]) -> List[str]:
"""
Get loaded modules from httpd process, and return the list
of loaded module names.
:param str apachectl: Path to apachectl executable
:param list mod_cmd: httpd command to dump loaded modules
:returns: list of found LoadModule module names
:rtype: list of str
"""
mod_cmd = [apachectl, "-t", "-D", "DUMP_MODULES"]
return parse_from_subprocess(mod_cmd, r"(.*)_module")
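An editor's sketch of the new call shape (the plain 'apachectl' binary here is only the default; inside the plugin these lists come from OsOptions.get_defines_cmd, get_includes_cmd and get_modules_cmd, as shown in the configurator hunks below):

```python
from certbot_apache._internal import apache_util

# Callers now pass the full httpd command rather than just the binary path.
defines = apache_util.parse_defines(['apachectl', '-t', '-D', 'DUMP_RUN_CFG'])
includes = apache_util.parse_includes(['apachectl', '-t', '-D', 'DUMP_INCLUDES'])
modules = apache_util.parse_modules(['apachectl', '-t', '-D', 'DUMP_MODULES'])
```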

View File

@@ -118,7 +118,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
# pylint: disable=unused-argument
def add_child_directive(self, name: str, parameters: Optional[List[str]] = None,
position: int = None) -> ApacheDirectiveNode: # pragma: no cover
position: Optional[int] = None
) -> ApacheDirectiveNode: # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
new_dir = ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,

View File

@@ -85,6 +85,10 @@ class OsOptions:
self.restart_cmd = ['apache2ctl', 'graceful'] if not restart_cmd else restart_cmd
self.restart_cmd_alt = restart_cmd_alt
self.conftest_cmd = ['apache2ctl', 'configtest'] if not conftest_cmd else conftest_cmd
syntax_tests_cmd_base = [ctl, '-t', '-D']
self.get_defines_cmd = syntax_tests_cmd_base + ['DUMP_RUN_CFG']
self.get_includes_cmd = syntax_tests_cmd_base + ['DUMP_INCLUDES']
self.get_modules_cmd = syntax_tests_cmd_base + ['DUMP_MODULES']
self.enmod = enmod
self.dismod = dismod
self.le_vhost_ext = le_vhost_ext
@@ -166,6 +170,17 @@ class ApacheConfigurator(common.Configurator):
return apache_util.find_ssl_apache_conf("old")
return apache_util.find_ssl_apache_conf("current")
def _override_cmds(self) -> None:
"""
Set our various command binaries to whatever the user has overridden for apachectl
"""
self.options.version_cmd[0] = self.options.ctl
self.options.restart_cmd[0] = self.options.ctl
self.options.conftest_cmd[0] = self.options.ctl
self.options.get_modules_cmd[0] = self.options.ctl
self.options.get_includes_cmd[0] = self.options.ctl
self.options.get_defines_cmd[0] = self.options.ctl
def _prepare_options(self) -> None:
"""
Set the values possibly changed by command line parameters to
@@ -181,10 +196,7 @@ class ApacheConfigurator(common.Configurator):
else:
setattr(self.options, o, getattr(self.OS_DEFAULTS, o))
# Special cases
self.options.version_cmd[0] = self.options.ctl
self.options.restart_cmd[0] = self.options.ctl
self.options.conftest_cmd[0] = self.options.ctl
self._override_cmds()
@classmethod
def add_parser_arguments(cls, add: Callable[..., None]) -> None:
@@ -354,12 +366,9 @@ class ApacheConfigurator(common.Configurator):
self.version = self.get_version()
logger.debug('Apache version is %s',
'.'.join(str(i) for i in self.version))
if self.version < (2, 2):
if self.version < (2, 4):
raise errors.NotSupportedError(
"Apache Version {0} not supported.".format(str(self.version)))
elif self.version < (2, 4):
logger.warning('Support for Apache 2.2 is deprecated and will be removed in a '
'future release.')
# Recover from previous crash before Augeas initialization to have the
# correct parse tree from the get go.
@@ -479,9 +488,9 @@ class ApacheConfigurator(common.Configurator):
if HAS_APACHECONFIG:
apache_vars = {
"defines": apache_util.parse_defines(self.options.ctl),
"includes": apache_util.parse_includes(self.options.ctl),
"modules": apache_util.parse_modules(self.options.ctl),
"defines": apache_util.parse_defines(self.options.get_defines_cmd),
"includes": apache_util.parse_includes(self.options.get_includes_cmd),
"modules": apache_util.parse_modules(self.options.get_modules_cmd),
}
metadata["apache_vars"] = apache_vars
@@ -803,7 +812,7 @@ class ApacheConfigurator(common.Configurator):
return self._find_best_vhost(target, filtered_vhosts, filter_defaults)
def _find_best_vhost(
self, target_name: str, vhosts: List[obj.VirtualHost] = None,
self, target_name: str, vhosts: Optional[List[obj.VirtualHost]] = None,
filter_defaults: bool = True
) -> Optional[obj.VirtualHost]:
"""Finds the best vhost for a target_name.
@@ -1176,46 +1185,6 @@ class ApacheConfigurator(common.Configurator):
vhost.aliases.add(serveralias)
vhost.name = servername
def is_name_vhost(self, target_addr: obj.Addr) -> bool:
"""Returns if vhost is a name based vhost
NameVirtualHost was deprecated in Apache 2.4 as all VirtualHosts are
now NameVirtualHosts. If version is earlier than 2.4, check if addr
has a NameVirtualHost directive in the Apache config
:param certbot_apache._internal.obj.Addr target_addr: vhost address
:returns: Success
:rtype: bool
"""
# Mixed and matched wildcard NameVirtualHost with VirtualHost
# behavior is undefined. Make sure that an exact match exists
# search for NameVirtualHost directive for ip_addr
# note ip_addr can be FQDN although Apache does not recommend it
return (self.version >= (2, 4) or
bool(self.parser.find_dir("NameVirtualHost", str(target_addr))))
def add_name_vhost(self, addr: obj.Addr) -> None:
"""Adds NameVirtualHost directive for given address.
:param addr: Address that will be added as NameVirtualHost directive
:type addr: :class:`~certbot_apache._internal.obj.Addr`
"""
loc = parser.get_aug_path(self.parser.loc["name"])
if addr.get_port() == "443":
self.parser.add_dir_to_ifmodssl(
loc, "NameVirtualHost", [str(addr)])
else:
self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
msg = "Setting {0} to be NameBasedVirtualHost\n".format(addr)
logger.debug(msg)
self.save_notes += msg
def prepare_server_https(self, port: str, temp: bool = False) -> None:
"""Prepare the server for HTTPS.
@@ -1363,8 +1332,7 @@ class ApacheConfigurator(common.Configurator):
"""
if self.options.handle_modules:
if self.version >= (2, 4) and ("socache_shmcb_module" not in
self.parser.modules):
if "socache_shmcb_module" not in self.parser.modules:
self.enable_mod("socache_shmcb", temp=temp)
if "ssl_module" not in self.parser.modules:
self.enable_mod("ssl", temp=temp)
@@ -1451,10 +1419,6 @@ class ApacheConfigurator(common.Configurator):
# for the new directives; For these reasons... this is tacked
# on after fully creating the new vhost
# Now check if addresses need to be added as NameBasedVhost addrs
# This is for compliance with versions of Apache < 2.4
self._add_name_vhost_if_necessary(ssl_vhost)
return ssl_vhost
def _get_new_vh_path(self, orig_matches: List[str], new_matches: List[str]) -> Optional[str]:
@@ -1753,40 +1717,6 @@ class ApacheConfigurator(common.Configurator):
aliases = (self.parser.aug.get(match) for match in matches)
return self.domain_in_names(aliases, target_name)
def _add_name_vhost_if_necessary(self, vhost: obj.VirtualHost) -> None:
"""Add NameVirtualHost Directives if necessary for new vhost.
NameVirtualHosts was a directive in Apache < 2.4
https://httpd.apache.org/docs/2.2/mod/core.html#namevirtualhost
:param vhost: New virtual host that was recently created.
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
need_to_save: bool = False
# See if the exact address appears in any other vhost
# Remember 1.1.1.1:* == 1.1.1.1 -> hence any()
for addr in vhost.addrs:
# In Apache 2.2, when a NameVirtualHost directive is not
# set, "*" and "_default_" will conflict when sharing a port
addrs = {addr,}
if addr.get_addr() in ("*", "_default_"):
addrs.update(obj.Addr((a, addr.get_port(),))
for a in ("*", "_default_"))
for test_vh in self.vhosts:
if (vhost.filep != test_vh.filep and
any(test_addr in addrs for
test_addr in test_vh.addrs) and not self.is_name_vhost(addr)):
self.add_name_vhost(addr)
logger.info("Enabling NameVirtualHosts on %s", addr)
need_to_save = True
break
if need_to_save:
self.save()
def find_vhost_by_id(self, id_str: str) -> obj.VirtualHost:
"""
Searches through VirtualHosts and tries to match the id in a comment
@@ -2002,12 +1932,6 @@ class ApacheConfigurator(common.Configurator):
:param unused_options: Not currently used
:type unused_options: Not Available
"""
min_apache_ver = (2, 3, 3)
if self.get_version() < min_apache_ver:
raise errors.PluginError(
"Unable to set OCSP directives.\n"
"Apache version is below 2.3.3.")
if "socache_shmcb_module" not in self.parser.modules:
self.enable_mod("socache_shmcb")
@@ -2188,10 +2112,7 @@ class ApacheConfigurator(common.Configurator):
general_vh.filep, ssl_vhost.filep)
def _set_https_redirection_rewrite_rule(self, vhost: obj.VirtualHost) -> None:
if self.get_version() >= (2, 3, 9):
self.parser.add_dir(vhost.path, "RewriteRule", constants.REWRITE_HTTPS_ARGS_WITH_END)
else:
self.parser.add_dir(vhost.path, "RewriteRule", constants.REWRITE_HTTPS_ARGS)
self.parser.add_dir(vhost.path, "RewriteRule", constants.REWRITE_HTTPS_ARGS)
def _verify_no_certbot_redirect(self, vhost: obj.VirtualHost) -> None:
"""Checks to see if a redirect was already installed by certbot.
@@ -2223,9 +2144,6 @@ class ApacheConfigurator(common.Configurator):
rewrite_args_dict[dir_path].append(match)
if rewrite_args_dict:
redirect_args = [constants.REWRITE_HTTPS_ARGS,
constants.REWRITE_HTTPS_ARGS_WITH_END]
for dir_path, args_paths in rewrite_args_dict.items():
arg_vals = [self.parser.aug.get(x) for x in args_paths]
@@ -2237,7 +2155,7 @@ class ApacheConfigurator(common.Configurator):
raise errors.PluginEnhancementAlreadyPresent(
"Certbot has already enabled redirection")
if arg_vals in redirect_args:
if arg_vals == constants.REWRITE_HTTPS_ARGS:
raise errors.PluginEnhancementAlreadyPresent(
"Certbot has already enabled redirection")
@@ -2306,12 +2224,6 @@ class ApacheConfigurator(common.Configurator):
if ssl_vhost.aliases:
serveralias = "ServerAlias " + " ".join(ssl_vhost.aliases)
rewrite_rule_args: List[str]
if self.get_version() >= (2, 3, 9):
rewrite_rule_args = constants.REWRITE_HTTPS_ARGS_WITH_END
else:
rewrite_rule_args = constants.REWRITE_HTTPS_ARGS
return (
f"<VirtualHost {' '.join(str(addr) for addr in self._get_proposed_addrs(ssl_vhost))}>\n"
f"{servername} \n"
@@ -2319,7 +2231,7 @@ class ApacheConfigurator(common.Configurator):
f"ServerSignature Off\n"
f"\n"
f"RewriteEngine On\n"
f"RewriteRule {' '.join(rewrite_rule_args)}\n"
f"RewriteRule {' '.join(constants.REWRITE_HTTPS_ARGS)}\n"
"\n"
f"ErrorLog {self.options.logs_root}/redirect.error.log\n"
f"LogLevel warn\n"

View File

@@ -32,6 +32,8 @@ ALL_SSL_OPTIONS_HASHES: List[str] = [
'5cc003edd93fb9cd03d40c7686495f8f058f485f75b5e764b789245a386e6daf',
'007cd497a56a3bb8b6a2c1aeb4997789e7e38992f74e44cc5d13a625a738ac73',
'34783b9e2210f5c4a23bced2dfd7ec289834716673354ed7c7abf69fe30192a3',
'61466bc2f98a623c02be8a5ee916ead1655b0ce883bdc936692076ea499ff5ce',
'3fd812e3e87fe5c645d3682a511b2a06c8286f19594f28e280f17cd6af1301b5',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
@@ -40,18 +42,14 @@ AUGEAS_LENS_DIR = pkg_resources.resource_filename(
"""Path to the Augeas lens directory"""
REWRITE_HTTPS_ARGS: List[str] = [
"^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[L,NE,R=permanent]"]
"""Apache version<2.3.9 rewrite rule arguments used for redirections to
https vhost"""
REWRITE_HTTPS_ARGS_WITH_END: List[str] = [
"^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[END,NE,R=permanent]"]
"""Apache version >= 2.3.9 rewrite rule arguments used for redirections to
https vhost"""
OLD_REWRITE_HTTPS_ARGS: List[List[str]] = [
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[L,QSA,R=permanent]"],
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[END,QSA,R=permanent]"]]
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[END,QSA,R=permanent]"],
["^", "https://%{SERVER_NAME}%{REQUEST_URI}", "[L,NE,R=permanent]"]]
HSTS_ARGS: List[str] = ["always", "set", "Strict-Transport-Security",
"\"max-age=31536000\""]

View File

@@ -24,22 +24,6 @@ logger = logging.getLogger(__name__)
class ApacheHttp01(common.ChallengePerformer):
"""Class that performs HTTP-01 challenges within the Apache configurator."""
CONFIG_TEMPLATE22_PRE = """\
RewriteEngine on
RewriteRule ^/\\.well-known/acme-challenge/([A-Za-z0-9-_=]+)$ {0}/$1 [L]
"""
CONFIG_TEMPLATE22_POST = """\
<Directory {0}>
Order Allow,Deny
Allow from all
</Directory>
<Location /.well-known/acme-challenge>
Order Allow,Deny
Allow from all
</Location>
"""
CONFIG_TEMPLATE24_PRE = """\
RewriteEngine on
RewriteRule ^/\\.well-known/acme-challenge/([A-Za-z0-9-_=]+)$ {0}/$1 [END]
@@ -90,11 +74,7 @@ class ApacheHttp01(common.ChallengePerformer):
"""Make sure that we have the needed modules available for http01"""
if self.configurator.conf("handle-modules"):
needed_modules = ["rewrite"]
if self.configurator.version < (2, 4):
needed_modules.append("authz_host")
else:
needed_modules.append("authz_core")
needed_modules = ["rewrite", "authz_core"]
for mod in needed_modules:
if mod + "_module" not in self.configurator.parser.modules:
self.configurator.enable_mod(mod, temp=True)
@@ -131,15 +111,8 @@ class ApacheHttp01(common.ChallengePerformer):
self.configurator.reverter.register_file_creation(
True, self.challenge_conf_post)
if self.configurator.version < (2, 4):
config_template_pre = self.CONFIG_TEMPLATE22_PRE
config_template_post = self.CONFIG_TEMPLATE22_POST
else:
config_template_pre = self.CONFIG_TEMPLATE24_PRE
config_template_post = self.CONFIG_TEMPLATE24_POST
config_text_pre = config_template_pre.format(self.challenge_dir)
config_text_post = config_template_post.format(self.challenge_dir)
config_text_pre = self.CONFIG_TEMPLATE24_PRE.format(self.challenge_dir)
config_text_post = self.CONFIG_TEMPLATE24_POST.format(self.challenge_dir)
logger.debug("writing a pre config file with text:\n %s", config_text_pre)
with open(self.challenge_conf_pre, "w") as new_conf:
@@ -184,15 +157,13 @@ class ApacheHttp01(common.ChallengePerformer):
def _set_up_challenges(self) -> List[KeyAuthorizationChallengeResponse]:
if not os.path.isdir(self.challenge_dir):
old_umask = filesystem.umask(0o022)
try:
filesystem.makedirs(self.challenge_dir, 0o755)
except OSError as exception:
if exception.errno not in (errno.EEXIST, errno.EISDIR):
raise errors.PluginError(
"Couldn't create root for http-01 challenge")
finally:
filesystem.umask(old_umask)
with filesystem.temp_umask(0o022):
try:
filesystem.makedirs(self.challenge_dir, 0o755)
except OSError as exception:
if exception.errno not in (errno.EEXIST, errno.EISDIR):
raise errors.PluginError(
"Couldn't create root for http-01 challenge")
responses = []
for achall in self.achalls:

View File

@@ -1,8 +1,6 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import logging
from typing import Any
from typing import cast
from typing import List
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
@@ -11,7 +9,6 @@ from certbot_apache._internal.configurator import OsOptions
from certbot import errors
from certbot import util
from certbot.errors import MisconfigurationError
logger = logging.getLogger(__name__)
@@ -25,6 +22,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
vhost_files="*.conf",
logs_root="/var/log/httpd",
ctl="apachectl",
apache_bin="httpd",
version_cmd=['apachectl', '-v'],
restart_cmd=['apachectl', 'graceful'],
restart_cmd_alt=['apachectl', 'restart'],
@@ -51,6 +49,42 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
else:
raise
def _rhel9_or_newer(self) -> bool:
os_name, os_version = util.get_os_info()
rhel_derived = os_name in [
"centos", "centos linux",
"cloudlinux",
"ol", "oracle",
"rhel", "redhatenterpriseserver", "red hat enterprise linux server",
"scientific", "scientific linux",
]
# It is important that the loose version comparison below is not made
# if the OS is not RHEL derived. See
# https://github.com/certbot/certbot/issues/9481.
if not rhel_derived:
return False
at_least_v9 = util.parse_loose_version(os_version) >= util.parse_loose_version('9')
return at_least_v9
def _override_cmds(self) -> None:
super()._override_cmds()
# As of RHEL 9, apachectl can't be passed flags like "-v" or "-t -D", so
# instead use options.bin (i.e. httpd) for version_cmd and the various
# get_X commands
if self._rhel9_or_newer():
if not self.options.bin:
raise ValueError("OS option apache_bin must be set for CentOS") # pragma: no cover
self.options.version_cmd[0] = self.options.bin
self.options.get_modules_cmd[0] = self.options.bin
self.options.get_includes_cmd[0] = self.options.bin
self.options.get_defines_cmd[0] = self.options.bin
if not self.options.restart_cmd_alt: # pragma: no cover
raise ValueError("OS option restart_cmd_alt must be set for CentOS.")
self.options.restart_cmd_alt[0] = self.options.ctl
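The RHEL 9 gate leans on certbot.util.parse_loose_version; a small editor's illustration of the comparison it performs (the version strings are examples):

```python
from certbot import util

os_name, os_version = util.get_os_info()  # e.g. ('rhel', '9.1')
if util.parse_loose_version(os_version) >= util.parse_loose_version('9'):
    # On RHEL 9+ apachectl no longer forwards flags such as "-t -D DUMP_RUN_CFG",
    # so version_cmd and the get_*_cmd lists are pointed at the httpd binary.
    pass
```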
def _try_restart_fedora(self) -> None:
"""
Tries to restart httpd using systemctl to generate the self signed key pair.
@@ -64,97 +98,11 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
# Finish with actual config check to see if systemctl restart helped
super().config_test()
def _prepare_options(self) -> None:
"""
Override the options dictionary initialization in order to support
alternative restart cmd used in CentOS.
"""
super()._prepare_options()
if not self.options.restart_cmd_alt: # pragma: no cover
raise ValueError("OS option restart_cmd_alt must be set for CentOS.")
self.options.restart_cmd_alt[0] = self.options.ctl
def get_parser(self) -> "CentOSParser":
"""Initializes the ApacheParser"""
return CentOSParser(
self.options.server_root, self, self.options.vhost_root, self.version)
def _deploy_cert(self, *args: Any, **kwargs: Any) -> None: # pylint: disable=arguments-differ
"""
Override _deploy_cert in order to ensure that the Apache configuration
has "LoadModule ssl_module..." before parsing the VirtualHost configuration
that was created by Certbot
"""
super()._deploy_cert(*args, **kwargs)
if self.version < (2, 4, 0):
self._deploy_loadmodule_ssl_if_needed()
def _deploy_loadmodule_ssl_if_needed(self) -> None:
"""
Add "LoadModule ssl_module <pre-existing path>" to main httpd.conf if
it doesn't exist there already.
"""
loadmods = self.parser.find_dir("LoadModule", "ssl_module", exclude=False)
correct_ifmods: List[str] = []
loadmod_args: List[str] = []
loadmod_paths: List[str] = []
for m in loadmods:
noarg_path = m.rpartition("/")[0]
path_args = self.parser.get_all_args(noarg_path)
if loadmod_args:
if loadmod_args != path_args:
msg = ("Certbot encountered multiple LoadModule directives "
"for LoadModule ssl_module with differing library paths. "
"Please remove or comment out the one(s) that are not in "
"use, and run Certbot again.")
raise MisconfigurationError(msg)
else:
loadmod_args = [arg for arg in path_args if arg]
centos_parser: CentOSParser = cast(CentOSParser, self.parser)
if centos_parser.not_modssl_ifmodule(noarg_path):
if centos_parser.loc["default"] in noarg_path:
# LoadModule already in the main configuration file
if "ifmodule/" in noarg_path.lower() or "ifmodule[1]" in noarg_path.lower():
# It's the first or only IfModule in the file
return
# Populate the list of known !mod_ssl.c IfModules
nodir_path = noarg_path.rpartition("/directive")[0]
correct_ifmods.append(nodir_path)
else:
loadmod_paths.append(noarg_path)
if not loadmod_args:
# Do not try to enable mod_ssl
return
# Force creation as the directive wasn't found from the beginning of
# httpd.conf
rootconf_ifmod = self.parser.create_ifmod(
parser.get_aug_path(self.parser.loc["default"]),
"!mod_ssl.c", beginning=True)
# parser.get_ifmod returns a path postfixed with "/", remove that
self.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", loadmod_args)
correct_ifmods.append(rootconf_ifmod[:-1])
self.save_notes += "Added LoadModule ssl_module to main configuration.\n"
# Wrap LoadModule mod_ssl inside of <IfModule !mod_ssl.c> if it's not
# configured like this already.
for loadmod_path in loadmod_paths:
nodir_path = loadmod_path.split("/directive")[0]
# Remove the old LoadModule directive
self.parser.aug.remove(loadmod_path)
# Create a new IfModule !mod_ssl.c if not already found on path
ssl_ifmod = self.parser.get_ifmod(nodir_path, "!mod_ssl.c", beginning=True)[:-1]
if ssl_ifmod not in correct_ifmods:
self.parser.add_dir(ssl_ifmod, "LoadModule", loadmod_args)
correct_ifmods.append(ssl_ifmod)
self.save_notes += ("Wrapped pre-existing LoadModule ssl_module "
"inside of <IfModule !mod_ssl> block.\n")
class CentOSParser(parser.ApacheParser):
"""CentOS specific ApacheParser override class"""
@@ -174,33 +122,3 @@ class CentOSParser(parser.ApacheParser):
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
for k, v in defines.items():
self.variables[k] = v
def not_modssl_ifmodule(self, path: str) -> bool:
"""Checks if the provided Augeas path has argument !mod_ssl"""
if "ifmodule" not in path.lower():
return False
# Trim the path to the last ifmodule
workpath = path.lower()
while workpath:
# Get path to the last IfModule (ignore the tail)
parts = workpath.rpartition("ifmodule")
if not parts[0]:
# IfModule not found
break
ifmod_path = parts[0] + parts[1]
# Check if ifmodule had an index
if parts[2].startswith("["):
# Append the index from tail
ifmod_path += parts[2].partition("/")[0]
# Get the original path trimmed to correct length
# This is required to preserve cases
ifmod_real_path = path[0:len(ifmod_path)]
if "!mod_ssl.c" in self.get_all_args(ifmod_real_path):
return True
# Set the workpath to the heading part
workpath = parts[0]
return False

View File

@@ -47,6 +47,7 @@ class ApacheParser:
arg_var_interpreter: Pattern = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars: Set[str] = {"*", "?", "\\", "[", "]"}
# pylint: disable=unused-argument
def __init__(self, root: str, configurator: "ApacheConfigurator",
vhostroot: str, version: Tuple[int, ...] = (2, 4)) -> None:
# Note: Order is important here.
@@ -74,9 +75,8 @@ class ApacheParser:
self.loc: Dict[str, str] = {"root": self._find_config_root()}
self.parse_file(self.loc["root"])
if version >= (2, 4):
# Look up variables from httpd and add to DOM if not already parsed
self.update_runtime_variables()
# Look up variables from httpd and add to DOM if not already parsed
self.update_runtime_variables()
# This problem has been fixed in Augeas 1.0
self.standardize_excl()
@@ -95,11 +95,6 @@ class ApacheParser:
self.parse_file(os.path.abspath(vhostroot) + "/" +
self.configurator.options.vhost_files)
# check to see if there were unparsed define statements
if version < (2, 4):
if self.find_dir("Define", exclude=False):
raise errors.PluginError("Error parsing runtime variables")
def check_parsing_errors(self, lens: str) -> None:
"""Verify Augeas can parse all of the lens files.
@@ -302,7 +297,7 @@ class ApacheParser:
def update_defines(self) -> None:
"""Updates the dictionary of known variables in the configuration"""
self.variables = apache_util.parse_defines(self.configurator.options.ctl)
self.variables = apache_util.parse_defines(self.configurator.options.get_defines_cmd)
def update_includes(self) -> None:
"""Get includes from httpd process, and add them to DOM if needed"""
@@ -312,7 +307,7 @@ class ApacheParser:
# configuration files
_ = self.find_dir("Include")
matches = apache_util.parse_includes(self.configurator.options.ctl)
matches = apache_util.parse_includes(self.configurator.options.get_includes_cmd)
if matches:
for i in matches:
if not self.parsed_in_current(i):
@@ -321,7 +316,7 @@ class ApacheParser:
def update_modules(self) -> None:
"""Get loaded modules from httpd process, and add them to DOM"""
matches = apache_util.parse_modules(self.configurator.options.ctl)
matches = apache_util.parse_modules(self.configurator.options.get_modules_cmd)
for mod in matches:
self.add_mod(mod.strip())
@@ -382,7 +377,7 @@ class ApacheParser:
for i, arg in enumerate(args):
self.aug.set("%s/arg[%d]" % (nvh_path, i + 1), arg)
def get_ifmod(self, aug_conf_path: str, mod: str, beginning: bool = False) -> str:
def get_ifmod(self, aug_conf_path: str, mod: str) -> str:
"""Returns the path to <IfMod mod> and creates one if it doesn't exist.
:param str aug_conf_path: Augeas configuration path
@@ -399,35 +394,26 @@ class ApacheParser:
if_mods = self.aug.match(("%s/IfModule/*[self::arg='%s']" %
(aug_conf_path, mod)))
if not if_mods:
return self.create_ifmod(aug_conf_path, mod, beginning)
return self.create_ifmod(aug_conf_path, mod)
# Strip off "arg" at end of first ifmod path
return if_mods[0].rpartition("arg")[0]
def create_ifmod(self, aug_conf_path: str, mod: str, beginning: bool = False) -> str:
def create_ifmod(self, aug_conf_path: str, mod: str) -> str:
"""Creates a new <IfMod mod> and returns its path.
:param str aug_conf_path: Augeas configuration path
:param str mod: module ie. mod_ssl.c
:param bool beginning: If the IfModule should be created to the beginning
of augeas path DOM tree.
:returns: Augeas path of the newly created IfModule directive.
The path may be dynamic, i.e. .../IfModule[last()]
:rtype: str
"""
if beginning:
c_path_arg = "{}/IfModule[1]/arg".format(aug_conf_path)
# Insert IfModule before the first directive
self.aug.insert("{}/directive[1]".format(aug_conf_path),
"IfModule", True)
retpath = "{}/IfModule[1]/".format(aug_conf_path)
else:
c_path = "{}/IfModule[last() + 1]".format(aug_conf_path)
c_path_arg = "{}/IfModule[last()]/arg".format(aug_conf_path)
self.aug.set(c_path, "")
retpath = "{}/IfModule[last()]/".format(aug_conf_path)
c_path = "{}/IfModule[last() + 1]".format(aug_conf_path)
c_path_arg = "{}/IfModule[last()]/arg".format(aug_conf_path)
self.aug.set(c_path, "")
retpath = "{}/IfModule[last()]/".format(aug_conf_path)
self.aug.set(c_path_arg, mod)
return retpath
@@ -587,20 +573,6 @@ class ApacheParser:
return ordered_matches
def get_all_args(self, match: str) -> List[Optional[str]]:
"""
Tries to fetch all arguments for a directive. See get_arg.
Note that if match is an ancestor node, it returns all names of
child directives as well as the list of arguments.
"""
if match[-1] != "/":
match = match + "/"
allargs = self.aug.match(match + '*')
return [self.get_arg(arg) for arg in allargs]
def get_arg(self, match: str) -> Optional[str]:
"""Uses augeas.get to get argument value and interprets result.

View File

@@ -2,7 +2,7 @@
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
# this file. Contents are based on https://ssl-config.mozilla.org
SSLEngine on

View File

@@ -2,7 +2,7 @@
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
# this file. Contents are based on https://ssl-config.mozilla.org
SSLEngine on

View File

@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup
version = '1.25.0.dev0'
version = '2.3.0.dev0'
install_requires = [
# We specify the minimum acme and certbot version as the current plugin
@@ -38,6 +38,7 @@ setup(
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -1,13 +1,9 @@
"""Tests for AugeasParserNode classes"""
from typing import List
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import os
import util
from unittest import mock
from certbot import errors

View File

@@ -2,11 +2,7 @@
"""Test for certbot_apache._internal.configurator AutoHSTS functionality"""
import re
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot_apache._internal import constants

View File

@@ -1,228 +0,0 @@
"""Test for certbot_apache._internal.configurator for CentOS 6 overrides"""
import unittest
from unittest import mock
from certbot.compat import os
from certbot.errors import MisconfigurationError
from certbot_apache._internal import obj
from certbot_apache._internal import override_centos
from certbot_apache._internal import parser
import util
def get_vh_truth(temp_dir, config_name):
"""Return the ground truth for the specified directory."""
prefix = os.path.join(
temp_dir, config_name, "httpd/conf.d")
aug_pre = "/files" + prefix
vh_truth = [
obj.VirtualHost(
os.path.join(prefix, "test.example.com.conf"),
os.path.join(aug_pre, "test.example.com.conf/VirtualHost"),
{obj.Addr.fromstring("*:80")},
False, True, "test.example.com"),
obj.VirtualHost(
os.path.join(prefix, "ssl.conf"),
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
{obj.Addr.fromstring("_default_:443")},
True, True, None)
]
return vh_truth
class CentOS6Tests(util.ApacheTest):
"""Tests for CentOS 6"""
def setUp(self): # pylint: disable=arguments-differ
test_dir = "centos6_apache/apache"
config_root = "centos6_apache/apache/httpd"
vhost_root = "centos6_apache/apache/httpd/conf.d"
super().setUp(test_dir=test_dir,
config_root=config_root,
vhost_root=vhost_root)
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
version=(2, 2, 15), os_info="centos")
self.vh_truth = get_vh_truth(
self.temp_dir, "centos6_apache/apache")
def test_get_parser(self):
self.assertIsInstance(self.config.parser, override_centos.CentOSParser)
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
vhs = self.config.get_virtual_hosts()
self.assertEqual(len(vhs), 2)
found = 0
for vhost in vhs:
for centos_truth in self.vh_truth:
if vhost == centos_truth:
found += 1
break
else:
raise Exception("Missed: %s" % vhost) # pragma: no cover
self.assertEqual(found, 2)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_default(self, unused_mock_notify):
ssl_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
self.assertEqual(len(ssl_loadmods), 1)
# Make sure the LoadModule ssl_module is in ssl.conf (default)
self.assertIn("ssl.conf", ssl_loadmods[0])
# ...and that it's not inside of <IfModule>
self.assertNotIn("IfModule", ssl_loadmods[0])
# Get the example vhost
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.config.save()
post_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
# We should now have LoadModule ssl_module in root conf and ssl.conf
self.assertEqual(len(post_loadmods), 2)
for lm in post_loadmods:
# lm[:-7] removes "/arg[#]" from the path
arguments = self.config.parser.get_all_args(lm[:-7])
self.assertEqual(arguments, ["ssl_module", "modules/mod_ssl.so"])
# ...and both of them should be wrapped in <IfModule !mod_ssl.c>
# lm[:-17] strips off /directive/arg[1] from the path.
ifmod_args = self.config.parser.get_all_args(lm[:-17])
self.assertIn("!mod_ssl.c", ifmod_args)
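As a reading aid for the slice offsets used above, here is a hypothetical Augeas match path shaped like the one the comments describe (real values come from parser.find_dir):

# Hypothetical path ending in .../IfModule/directive/arg[1]
lm = "/files/etc/httpd/conf.d/ssl.conf/IfModule/directive/arg[1]"
print(lm[:-7])   # drops "/arg[1]"            -> the LoadModule directive node
print(lm[:-17])  # drops "/directive/arg[1]"  -> the enclosing <IfModule> node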
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_multiple(self, unused_mock_notify):
sslmod_args = ["ssl_module", "modules/mod_ssl.so"]
# Adds another LoadModule to the main httpd.conf in addition to ssl.conf
self.config.parser.add_dir(self.config.parser.loc["default"], "LoadModule",
sslmod_args)
self.config.save()
pre_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
# LoadModules are not within IfModule blocks
self.assertIs(any("ifmodule" in m.lower() for m in pre_loadmods), False)
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
post_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
for mod in post_loadmods:
with self.subTest(mod=mod):
# pylint: disable=no-member
self.assertIs(self.config.parser.not_modssl_ifmodule(mod), True)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_rootconf_exists(self, unused_mock_notify):
sslmod_args = ["ssl_module", "modules/mod_ssl.so"]
rootconf_ifmod = self.config.parser.get_ifmod(
parser.get_aug_path(self.config.parser.loc["default"]),
"!mod_ssl.c", beginning=True)
self.config.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", sslmod_args)
self.config.save()
# Get the example vhost
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.config.save()
root_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module",
start=parser.get_aug_path(self.config.parser.loc["default"]),
exclude=False)
mods = [lm for lm in root_loadmods if self.config.parser.loc["default"] in lm]
self.assertEqual(len(mods), 1)
# [:-7] removes "/arg[#]" from the path
self.assertEqual(
self.config.parser.get_all_args(mods[0][:-7]),
sslmod_args)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_neg_loadmod_already_on_path(self, unused_mock_notify):
loadmod_args = ["ssl_module", "modules/mod_ssl.so"]
ifmod = self.config.parser.get_ifmod(
self.vh_truth[1].path, "!mod_ssl.c", beginning=True)
self.config.parser.add_dir(ifmod[:-1], "LoadModule", loadmod_args)
self.config.parser.add_dir(self.vh_truth[1].path, "LoadModule", loadmod_args)
self.config.save()
pre_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
self.assertEqual(len(pre_loadmods), 2)
# The ssl.conf now has two LoadModule directives, one inside of
# !mod_ssl.c IfModule
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.config.save()
# Ensure that the additional LoadModule wasn't written into the IfModule
post_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", start=self.vh_truth[1].path, exclude=False)
self.assertEqual(len(post_loadmods), 1)
def test_loadmod_non_duplicate(self):
# A LoadModule for modules/mod_ssl.so already exists in ssl.conf
sslmod_args = ["ssl_module", "modules/mod_somethingelse.so"]
rootconf_ifmod = self.config.parser.get_ifmod(
parser.get_aug_path(self.config.parser.loc["default"]),
"!mod_ssl.c", beginning=True)
self.config.parser.add_dir(rootconf_ifmod[:-1], "LoadModule", sslmod_args)
self.config.save()
self.config.assoc["test.example.com"] = self.vh_truth[0]
pre_matches = self.config.parser.find_dir("LoadModule",
"ssl_module", exclude=False)
self.assertRaises(MisconfigurationError, self.config.deploy_cert,
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
post_matches = self.config.parser.find_dir("LoadModule",
"ssl_module", exclude=False)
# Make sure that none was changed
self.assertEqual(pre_matches, post_matches)
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
def test_loadmod_not_found(self, unused_mock_notify):
# Remove all existing LoadModule ssl_module... directives
orig_loadmods = self.config.parser.find_dir("LoadModule",
"ssl_module",
exclude=False)
for mod in orig_loadmods:
noarg_path = mod.rpartition("/")[0]
self.config.parser.aug.remove(noarg_path)
self.config.save()
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
post_loadmods = self.config.parser.find_dir("LoadModule",
"ssl_module",
exclude=False)
self.assertEqual(post_loadmods, [])
def test_no_ifmod_search_false(self):
#pylint: disable=no-member
self.assertIs(self.config.parser.not_modssl_ifmodule(
"/path/does/not/include/ifmod"
), False)
self.assertIs(self.config.parser.not_modssl_ifmodule(
""
), False)
self.assertIs(self.config.parser.not_modssl_ifmodule(
"/path/includes/IfModule/but/no/arguments"
), False)
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -1,16 +1,12 @@
"""Test for certbot_apache._internal.configurator for Centos overrides"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import obj
from certbot_apache._internal import override_centos
from certbot_apache._internal import obj
import util
@@ -88,10 +84,8 @@ class FedoraRestartTest(util.ApacheTest):
['systemctl', 'restart', 'httpd'])
class MultipleVhostsTestCentOS(util.ApacheTest):
"""Multiple vhost tests for CentOS / RHEL family of distros"""
_multiprocess_can_split_ = True
class UseCorrectApacheExecutableTest(util.ApacheTest):
"""Make sure the various CentOS/RHEL versions use the right httpd executable"""
def setUp(self): # pylint: disable=arguments-differ
test_dir = "centos7_apache/apache"
@@ -101,6 +95,55 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
config_root=config_root,
vhost_root=vhost_root)
@mock.patch("certbot.util.get_os_info")
def test_old_centos_rhel_and_fedora(self, mock_get_os_info):
for os_info in [("centos", "7"), ("rhel", "7"), ("fedora", "28"), ("scientific", "6")]:
mock_get_os_info.return_value = os_info
config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
os_info="centos")
self.assertEqual(config.options.ctl, "apachectl")
self.assertEqual(config.options.bin, "httpd")
self.assertEqual(config.options.version_cmd, ["apachectl", "-v"])
self.assertEqual(config.options.restart_cmd, ["apachectl", "graceful"])
self.assertEqual(config.options.restart_cmd_alt, ["apachectl", "restart"])
self.assertEqual(config.options.conftest_cmd, ["apachectl", "configtest"])
self.assertEqual(config.options.get_defines_cmd, ["apachectl", "-t", "-D", "DUMP_RUN_CFG"])
self.assertEqual(config.options.get_includes_cmd, ["apachectl", "-t", "-D", "DUMP_INCLUDES"])
self.assertEqual(config.options.get_modules_cmd, ["apachectl", "-t", "-D", "DUMP_MODULES"])
@mock.patch("certbot.util.get_os_info")
def test_new_rhel_derived(self, mock_get_os_info):
for os_info in [("centos", "9"), ("rhel", "9"), ("oracle", "9")]:
mock_get_os_info.return_value = os_info
config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
os_info=os_info[0])
self.assertEqual(config.options.ctl, "apachectl")
self.assertEqual(config.options.bin, "httpd")
self.assertEqual(config.options.version_cmd, ["httpd", "-v"])
self.assertEqual(config.options.restart_cmd, ["apachectl", "graceful"])
self.assertEqual(config.options.restart_cmd_alt, ["apachectl", "restart"])
self.assertEqual(config.options.conftest_cmd, ["apachectl", "configtest"])
self.assertEqual(config.options.get_defines_cmd, ["httpd", "-t", "-D", "DUMP_RUN_CFG"])
self.assertEqual(config.options.get_includes_cmd, ["httpd", "-t", "-D", "DUMP_INCLUDES"])
self.assertEqual(config.options.get_modules_cmd, ["httpd", "-t", "-D", "DUMP_MODULES"])
class MultipleVhostsTestCentOS(util.ApacheTest):
"""Multiple vhost tests for CentOS / RHEL family of distros"""
_multiprocess_can_split_ = True
@mock.patch("certbot.util.get_os_info")
def setUp(self, mock_get_os_info): # pylint: disable=arguments-differ
test_dir = "centos7_apache/apache"
config_root = "centos7_apache/apache/httpd"
vhost_root = "centos7_apache/apache/httpd/conf.d"
super().setUp(test_dir=test_dir,
config_root=config_root,
vhost_root=vhost_root)
mock_get_os_info.return_value = ("centos", "9")
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
os_info="centos")
@@ -124,9 +167,9 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
)
def mock_get_cfg(command):
"""Mock httpd process stdout"""
if command == ['apachectl', '-t', '-D', 'DUMP_RUN_CFG']:
if command == ['httpd', '-t', '-D', 'DUMP_RUN_CFG']:
return define_val
elif command == ['apachectl', '-t', '-D', 'DUMP_MODULES']:
elif command == ['httpd', '-t', '-D', 'DUMP_MODULES']:
return mod_val
return ""
mock_get.side_effect = mock_get_cfg
@@ -135,7 +178,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the CentOS httpd constants
mock_osi.return_value = ("centos", "7")
mock_osi.return_value = ("centos", "9")
self.config.parser.update_runtime_variables()
self.assertEqual(mock_get.call_count, 3)
@@ -170,7 +213,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the CentOS httpd constants
mock_osi.return_value = ("centos", "7")
mock_osi.return_value = ("centos", "9")
self.config.parser.update_runtime_variables()
self.assertIn("mock_define", self.config.parser.variables)

View File

@@ -1,11 +1,7 @@
"""Test for certbot_apache._internal.configurator implementations of reverter"""
import shutil
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
import util

View File

@@ -5,11 +5,7 @@ import shutil
import socket
import tempfile
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from acme import challenges
from certbot import achallenges
@@ -443,18 +439,6 @@ class MultipleVhostsTest(util.ApacheTest):
"SSLCertificateChainFile", "two/cert_chain.pem",
self.vh_truth[1].path))
def test_is_name_vhost(self):
addr = obj.Addr.fromstring("*:80")
self.assertIs(self.config.is_name_vhost(addr), True)
self.config.version = (2, 2)
self.assertIs(self.config.is_name_vhost(addr), False)
def test_add_name_vhost(self):
self.config.add_name_vhost(obj.Addr.fromstring("*:443"))
self.config.add_name_vhost(obj.Addr.fromstring("*:80"))
self.assertTrue(self.config.parser.find_dir("NameVirtualHost", "*:443", exclude=False))
self.assertTrue(self.config.parser.find_dir("NameVirtualHost", "*:80"))
def test_add_listen_80(self):
mock_find = mock.Mock()
mock_add_dir = mock.Mock()
@@ -642,9 +626,6 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertIs(ssl_vhost.ssl, True)
self.assertIs(ssl_vhost.enabled, False)
self.assertEqual(self.config.is_name_vhost(self.vh_truth[0]),
self.config.is_name_vhost(ssl_vhost))
self.assertEqual(len(self.config.vhosts), 13)
def test_clean_vhost_ssl(self):
@@ -721,21 +702,6 @@ class MultipleVhostsTest(util.ApacheTest):
# pylint: disable=protected-access
self.assertIs(self.config._get_ssl_vhost_path("example_path").endswith(".conf"), True)
def test_add_name_vhost_if_necessary(self):
# pylint: disable=protected-access
self.config.add_name_vhost = mock.Mock()
self.config.version = (2, 2)
self.config._add_name_vhost_if_necessary(self.vh_truth[0])
self.assertIs(self.config.add_name_vhost.called, True)
new_addrs = set()
for addr in self.vh_truth[0].addrs:
new_addrs.add(obj.Addr(("_default_", addr.get_port(),)))
self.vh_truth[0].addrs = new_addrs
self.config._add_name_vhost_if_necessary(self.vh_truth[0])
self.assertEqual(self.config.add_name_vhost.call_count, 2)
@mock.patch("certbot_apache._internal.configurator.http_01.ApacheHttp01.perform")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
def test_perform(self, mock_restart, mock_http_perform):
@@ -946,20 +912,6 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(len(stapling_cache_aug_path), 1)
@mock.patch("certbot.util.exe_exists")
def test_ocsp_unsupported_apache_version(self, mock_exe):
mock_exe.return_value = True
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
self.config.choose_vhost("certbot.demo")
self.assertRaises(errors.PluginError,
self.config.enhance, "certbot.demo", "staple-ocsp")
def test_get_http_vhost_third_filter(self):
ssl_vh = obj.VirtualHost(
"fp", "ap", {obj.Addr(("*", "443"))},
@@ -1137,7 +1089,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.parser.modules["rewrite_module"] = None
self.config.parser.update_runtime_variables = mock.Mock()
mock_exe.return_value = True
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
self.config.get_version = mock.Mock(return_value=(2, 4, 0))
ssl_vhost = self.config.choose_vhost("certbot.demo")
@@ -1567,9 +1519,6 @@ class MultiVhostsTest(util.ApacheTest):
self.assertIs(ssl_vhost.ssl, True)
self.assertIs(ssl_vhost.enabled, False)
self.assertEqual(self.config.is_name_vhost(self.vh_truth[1]),
self.config.is_name_vhost(ssl_vhost))
mock_path = "certbot_apache._internal.configurator.ApacheConfigurator._get_new_vh_path"
with mock.patch(mock_path) as mock_getpath:
mock_getpath.return_value = None

View File

@@ -1,11 +1,7 @@
"""Test for certbot_apache._internal.configurator for Debian overrides"""
import shutil
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.compat import os

View File

@@ -1,10 +1,6 @@
"""Test certbot_apache._internal.display_ops."""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.display import util as display_util

View File

@@ -1,10 +1,6 @@
"""Tests for DualParserNode implementation"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser

View File

@@ -1,10 +1,6 @@
"""Test for certbot_apache._internal.entrypoint for override class resolution"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot_apache._internal import configurator
from certbot_apache._internal import entrypoint

View File

@@ -1,10 +1,6 @@
"""Test for certbot_apache._internal.configurator for Fedora 29+ overrides"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.compat import filesystem

View File

@@ -1,10 +1,6 @@
"""Test for certbot_apache._internal.configurator for Gentoo overrides"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.compat import filesystem

View File

@@ -2,11 +2,7 @@
import unittest
import errno
from typing import List
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from acme import challenges
from certbot import achallenges
@@ -53,15 +49,6 @@ class ApacheHttp01Test(util.ApacheTest):
def test_empty_perform(self):
self.assertEqual(len(self.http.perform()), 0)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_2(self, mock_enmod):
self.config.version = (2, 2)
del self.config.parser.modules["authz_host_module"]
del self.config.parser.modules["mod_authz_host.c"]
enmod_calls = self.common_enable_modules_test(mock_enmod)
self.assertEqual(enmod_calls[0][0][0], "authz_host")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_4(self, mock_enmod):
del self.config.parser.modules["authz_core_module"]
@@ -143,21 +130,12 @@ class ApacheHttp01Test(util.ApacheTest):
self.config.config.http01_port = 12345
self.assertRaises(errors.PluginError, self.http.perform)
def test_perform_1_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=1, minor_version=2)
def test_perform_1_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=1, minor_version=4)
def test_perform_2_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=2, minor_version=2)
def test_perform_2_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=2, minor_version=4)
def test_perform_3_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=3, minor_version=2)
def test_perform_3_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=3, minor_version=4)
@@ -230,10 +208,7 @@ class ApacheHttp01Test(util.ApacheTest):
self.assertIn("RewriteRule", pre_conf_contents)
self.assertIn(self.http.challenge_dir, post_conf_contents)
if self.config.version < (2, 4):
self.assertIn("Allow from all", post_conf_contents)
else:
self.assertIn("Require all granted", post_conf_contents)
self.assertIn("Require all granted", post_conf_contents)
def _test_challenge_file(self, achall):
name = os.path.join(self.http.challenge_dir, achall.chall.encode("token"))

View File

@@ -1,11 +1,7 @@
"""Tests for certbot_apache._internal.parser."""
import shutil
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot import errors
from certbot.compat import os
@@ -370,15 +366,6 @@ class ParserInitTest(util.ApacheTest):
ApacheParser, os.path.relpath(self.config_path), self.config,
"/dummy/vhostpath", version=(2, 4, 22))
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_unparseable(self, mock_cfg):
from certbot_apache._internal.parser import ApacheParser
mock_cfg.return_value = ('Define: TEST')
self.assertRaises(
errors.PluginError,
ApacheParser, os.path.relpath(self.config_path), self.config,
"/dummy/vhostpath", version=(2, 2, 22))
def test_root_normalized(self):
from certbot_apache._internal.parser import ApacheParser

View File

@@ -1,10 +1,6 @@
"""Tests for ApacheConfigurator for AugeasParserNode classes"""
import unittest
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
import util

View File

@@ -1,9 +0,0 @@
This directory holds Apache 2.0 module-specific configuration files;
any files in this directory which have the ".conf" extension will be
processed as Apache configuration files.
Files are processed in alphabetical order, so if using configuration
directives which depend on, say, mod_perl being loaded, ensure that
these are placed in a filename later in the sort order than "perl.conf".

View File

@@ -1,222 +0,0 @@
#
# This is the Apache server configuration file providing SSL support.
# It contains the configuration directives to instruct the server how to
# serve pages over an https connection. For detailed information about these
# directives see <URL:http://httpd.apache.org/docs/2.2/mod/mod_ssl.html>
#
# Do NOT simply read the instructions in here without understanding
# what they do. They're here only as hints or reminders. If you are unsure
# consult the online docs. You have been warned.
#
LoadModule ssl_module modules/mod_ssl.so
#
# When we also provide SSL we have to listen to the
# the HTTPS port in addition.
#
Listen 443
##
## SSL Global Context
##
## All SSL configuration in this context applies both to
## the main server and all SSL-enabled virtual hosts.
##
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog builtin
# Inter-Process Session Cache:
# Configure the SSL Session Cache: First the mechanism
# to use and second the expiring timeout (in seconds).
SSLSessionCache shmcb:/var/cache/mod_ssl/scache(512000)
SSLSessionCacheTimeout 300
# Semaphore:
# Configure the path to the mutual exclusion semaphore the
# SSL engine uses internally for inter-process synchronization.
SSLMutex default
# Pseudo Random Number Generator (PRNG):
# Configure one or more sources to seed the PRNG of the
# SSL library. The seed data should be of good random quality.
# WARNING! On some platforms /dev/random blocks if not enough entropy
# is available. This means you then cannot use the /dev/random device
# because it would lead to very long connection times (as long as
# it requires to make more entropy available). But usually those
# platforms additionally provide a /dev/urandom device which doesn't
# block. So, if available, use this one instead. Read the mod_ssl User
# Manual for more details.
SSLRandomSeed startup file:/dev/urandom 256
SSLRandomSeed connect builtin
#SSLRandomSeed startup file:/dev/random 512
#SSLRandomSeed connect file:/dev/random 512
#SSLRandomSeed connect file:/dev/urandom 512
#
# Use "SSLCryptoDevice" to enable any supported hardware
# accelerators. Use "openssl engine -v" to list supported
# engine names. NOTE: If you enable an accelerator and the
# server does not start, consult the error logs and ensure
# your accelerator is functioning properly.
#
SSLCryptoDevice builtin
#SSLCryptoDevice ubsec
##
## SSL Virtual Host Context
##
<VirtualHost _default_:443>
# General setup for the virtual host, inherited from global configuration
#DocumentRoot "/var/www/html"
#ServerName www.example.com:443
# Use separate log files for the SSL virtual host; note that LogLevel
# is not inherited from httpd.conf.
ErrorLog logs/ssl_error_log
TransferLog logs/ssl_access_log
LogLevel warn
# SSL Engine Switch:
# Enable/Disable SSL for this virtual host.
SSLEngine on
# SSL Protocol support:
# List the enable protocol levels with which clients will be able to
# connect. Disable SSLv2 access by default:
SSLProtocol all -SSLv2
# SSL Cipher Suite:
# List the ciphers that the client is permitted to negotiate.
# See the mod_ssl documentation for a complete list.
SSLCipherSuite DEFAULT:!EXP:!SSLv2:!DES:!IDEA:!SEED:+3DES
# Server Certificate:
# Point SSLCertificateFile at a PEM encoded certificate. If
# the certificate is encrypted, then you will be prompted for a
# pass phrase. Note that a kill -HUP will prompt again. A new
# certificate can be generated using the genkey(1) command.
SSLCertificateFile /etc/pki/tls/certs/localhost.crt
# Server Private Key:
# If the key is not combined with the certificate, use this
# directive to point at the key file. Keep in mind that if
# you've both a RSA and a DSA private key you can configure
# both in parallel (to also allow the use of DSA ciphers, etc.)
SSLCertificateKeyFile /etc/pki/tls/private/localhost.key
# Server Certificate Chain:
# Point SSLCertificateChainFile at a file containing the
# concatenation of PEM encoded CA certificates which form the
# certificate chain for the server certificate. Alternatively
# the referenced file can be the same as SSLCertificateFile
# when the CA certificates are directly appended to the server
# certificate for convenience.
#SSLCertificateChainFile /etc/pki/tls/certs/server-chain.crt
# Certificate Authority (CA):
# Set the CA certificate verification path where to find CA
# certificates for client authentication or alternatively one
# huge file containing all of them (file must be PEM encoded)
#SSLCACertificateFile /etc/pki/tls/certs/ca-bundle.crt
# Client Authentication (Type):
# Client certificate verification type and depth. Types are
# none, optional, require and optional_no_ca. Depth is a
# number which specifies how deeply to verify the certificate
# issuer chain before deciding the certificate is not valid.
#SSLVerifyClient require
#SSLVerifyDepth 10
# Access Control:
# With SSLRequire you can do per-directory access control based
# on arbitrary complex boolean expressions containing server
# variable checks and other lookup directives. The syntax is a
# mixture between C and Perl. See the mod_ssl documentation
# for more details.
#<Location />
#SSLRequire ( %{SSL_CIPHER} !~ m/^(EXP|NULL)/ \
# and %{SSL_CLIENT_S_DN_O} eq "Snake Oil, Ltd." \
# and %{SSL_CLIENT_S_DN_OU} in {"Staff", "CA", "Dev"} \
# and %{TIME_WDAY} >= 1 and %{TIME_WDAY} <= 5 \
# and %{TIME_HOUR} >= 8 and %{TIME_HOUR} <= 20 ) \
# or %{REMOTE_ADDR} =~ m/^192\.76\.162\.[0-9]+$/
#</Location>
# SSL Engine Options:
# Set various options for the SSL engine.
# o FakeBasicAuth:
# Translate the client X.509 into a Basic Authorisation. This means that
# the standard Auth/DBMAuth methods can be used for access control. The
# user name is the `one line' version of the client's X.509 certificate.
# Note that no password is obtained from the user. Every entry in the user
# file needs this password: `xxj31ZMTZzkVA'.
# o ExportCertData:
# This exports two additional environment variables: SSL_CLIENT_CERT and
# SSL_SERVER_CERT. These contain the PEM-encoded certificates of the
# server (always existing) and the client (only existing when client
# authentication is used). This can be used to import the certificates
# into CGI scripts.
# o StdEnvVars:
# This exports the standard SSL/TLS related `SSL_*' environment variables.
# Per default this exportation is switched off for performance reasons,
# because the extraction step is an expensive operation and is usually
# useless for serving static content. So one usually enables the
# exportation for CGI and SSI requests only.
# o StrictRequire:
# This denies access when "SSLRequireSSL" or "SSLRequire" applied even
# under a "Satisfy any" situation, i.e. when it applies access is denied
# and no other module can change it.
# o OptRenegotiate:
# This enables optimized SSL connection renegotiation handling when SSL
# directives are used in per-directory context.
#SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire
<Files ~ "\.(cgi|shtml|phtml|php3?)$">
SSLOptions +StdEnvVars
</Files>
<Directory "/var/www/cgi-bin">
SSLOptions +StdEnvVars
</Directory>
# SSL Protocol Adjustments:
# The safe and default but still SSL/TLS standard compliant shutdown
# approach is that mod_ssl sends the close notify alert but doesn't wait for
# the close notify alert from client. When you need a different shutdown
# approach you can use one of the following variables:
# o ssl-unclean-shutdown:
# This forces an unclean shutdown when the connection is closed, i.e. no
# SSL close notify alert is sent or allowed to be received. This violates
# the SSL/TLS standard but is needed for some brain-dead browsers. Use
# this when you receive I/O errors because of the standard approach where
# mod_ssl sends the close notify alert.
# o ssl-accurate-shutdown:
# This forces an accurate shutdown when the connection is closed, i.e. an
# SSL close notify alert is sent and mod_ssl waits for the close notify
# alert of the client. This is 100% SSL/TLS standard compliant, but in
# practice often causes hanging connections with brain-dead browsers. Use
# this only for browsers where you know that their SSL implementation
# works correctly.
# Notice: Most problems of broken clients are also related to the HTTP
# keep-alive facility, so you usually additionally want to disable
# keep-alive for those clients, too. Use variable "nokeepalive" for this.
# Similarly, one has to force some clients to use HTTP/1.0 to workaround
# their broken HTTP/1.1 implementation. Use variables "downgrade-1.0" and
# "force-response-1.0" for this.
SetEnvIf User-Agent ".*MSIE.*" \
nokeepalive ssl-unclean-shutdown \
downgrade-1.0 force-response-1.0
# Per-Server Logging:
# The home of a custom SSL log file. Use this when you want a
# compact non-error SSL logfile on a virtual host basis.
CustomLog logs/ssl_request_log \
"%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"
</VirtualHost>

View File

@@ -1,7 +0,0 @@
<VirtualHost *:80>
ServerName test.example.com
ServerAdmin webmaster@dummy-host.example.com
DocumentRoot /var/www/htdocs
ErrorLog logs/dummy-host.example.com-error_log
CustomLog logs/dummy-host.example.com-access_log common
</VirtualHost>

View File

@@ -1,11 +0,0 @@
#
# This configuration file enables the default "Welcome"
# page if there is no default index page present for
# the root URL. To disable the Welcome page, comment
# out all the lines below.
#
<LocationMatch "^/+$">
Options -Indexes
ErrorDocument 403 /error/noindex.html
</LocationMatch>

File diff suppressed because it is too large

View File

@@ -4,11 +4,7 @@ import unittest
import augeas
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from unittest import mock
from certbot.compat import os
from certbot.plugins import common
@@ -123,6 +119,7 @@ def get_apache_configurator(
# Custom virtualhost path was requested
config.config.apache_vhost_root = conf_vhost_path
config.config.apache_ctl = config_class.OS_DEFAULTS.ctl
config.config.apache_bin = config_class.OS_DEFAULTS.bin
config.prepare()
return config

View File

@@ -33,20 +33,23 @@ def assert_elliptic_key(key: str, curve: Type[EllipticCurve]) -> None:
key = load_pem_private_key(data=privkey1, password=None, backend=default_backend())
assert isinstance(key, EllipticCurvePrivateKey)
assert isinstance(key.curve, curve)
assert isinstance(key, EllipticCurvePrivateKey), f"should be an EC key but was {type(key)}"
assert isinstance(key.curve, curve), f"should have curve {curve} but was {key.curve}"
def assert_rsa_key(key: str) -> None:
def assert_rsa_key(key: str, key_size: Optional[int] = None) -> None:
"""
Asserts that the key at the given path is an RSA key.
:param str key: path to key
:param int key_size: if provided, assert that the RSA key is of this size
"""
with open(key, 'rb') as file:
privkey1 = file.read()
key = load_pem_private_key(data=privkey1, password=None, backend=default_backend())
assert isinstance(key, RSAPrivateKey)
if key_size:
assert key_size == key.key_size
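A hypothetical call site showing both forms of the updated helper (paths made up for illustration):

# Type check only, as before.
assert_rsa_key('conf/archive/example.org/privkey1.pem')
# Type check plus modulus size, using the new optional parameter.
assert_rsa_key('conf/archive/example.org/privkey4.pem', key_size=4096)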
def assert_hook_execution(probe_path: str, probe_content: str) -> None:
@@ -122,7 +125,7 @@ def assert_equals_world_read_permissions(file1: str, file2: str) -> None:
mode_file1 = os.stat(file1).st_mode & 0o004
mode_file2 = os.stat(file2).st_mode & 0o004
else:
everybody = win32security.ConvertStringSidToSid(EVERYBODY_SID)
everybody = win32security.ConvertStringSidToSid(EVERYBODY_SID) # pylint: disable=used-before-assignment
security1 = win32security.GetFileSecurity(file1, win32security.DACL_SECURITY_INFORMATION)
dacl1 = security1.GetSecurityDescriptorDacl()
@@ -132,7 +135,7 @@ def assert_equals_world_read_permissions(file1: str, file2: str) -> None:
'TrusteeType': win32security.TRUSTEE_IS_USER,
'Identifier': everybody,
})
mode_file1 = mode_file1 & ntsecuritycon.FILE_GENERIC_READ
mode_file1 = mode_file1 & ntsecuritycon.FILE_GENERIC_READ # pylint: disable=used-before-assignment
security2 = win32security.GetFileSecurity(file2, win32security.DACL_SECURITY_INFORMATION)
dacl2 = security2.GetSecurityDescriptorDacl()

View File

@@ -17,8 +17,8 @@ class IntegrationTestsContext:
self.request = request
if hasattr(request.config, 'workerinput'): # Worker node
self.worker_id = request.config.workerinput['workerid'] # type: ignore[attr-defined]
acme_xdist = request.config.workerinput['acme_xdist'] # type: ignore[attr-defined]
self.worker_id = request.config.workerinput['workerid']
acme_xdist = request.config.workerinput['acme_xdist']
else: # Primary node
self.worker_id = 'primary'
acme_xdist = request.config.acme_xdist # type: ignore[attr-defined]
@@ -29,8 +29,8 @@ class IntegrationTestsContext:
self.http_01_port = acme_xdist['http_port'][self.worker_id]
self.other_port = acme_xdist['other_port'][self.worker_id]
# The challtestsrv REST API, which exposes endpoints to register new DNS entries,
# is listening on challtestsrv_port.
self.challtestsrv_port = acme_xdist['challtestsrv_port']
# is listening on challtestsrv_url.
self.challtestsrv_url = acme_xdist['challtestsrv_url']
self.workspace = tempfile.mkdtemp()
self.config_dir = os.path.join(self.workspace, 'conf')
@@ -44,17 +44,17 @@ class IntegrationTestsContext:
"assert not os.environ.get('CERTBOT_DOMAIN').startswith('fail'); "
"data = {{'host':'_acme-challenge.{{0}}.'.format(os.environ.get('CERTBOT_DOMAIN')),"
"'value':os.environ.get('CERTBOT_VALIDATION')}}; "
"request = requests.post('http://localhost:{1}/set-txt', data=json.dumps(data)); "
"request = requests.post('{1}/set-txt', data=json.dumps(data)); "
"request.raise_for_status(); "
'"'
).format(sys.executable, self.challtestsrv_port)
).format(sys.executable, self.challtestsrv_url)
self.manual_dns_cleanup_hook = (
'{0} -c "import os; import requests; import json; '
"data = {{'host':'_acme-challenge.{{0}}.'.format(os.environ.get('CERTBOT_DOMAIN'))}}; "
"request = requests.post('http://localhost:{1}/clear-txt', data=json.dumps(data)); "
"request = requests.post('{1}/clear-txt', data=json.dumps(data)); "
"request.raise_for_status(); "
'"'
).format(sys.executable, self.challtestsrv_port)
).format(sys.executable, self.challtestsrv_url)
def cleanup(self) -> None:
"""Cleanup the integration test context."""

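Spelled out, the --manual-auth-hook one-liner built above amounts to roughly the following, shown here for the Pebble case whose challtestsrv URL is http://localhost:8055 according to the constants later in this diff:

# Rough equivalent of the generated auth hook; the base URL now comes from
# challtestsrv_url instead of a hard-coded localhost port.
import json
import os

import requests

data = {
    'host': '_acme-challenge.{0}.'.format(os.environ.get('CERTBOT_DOMAIN')),
    'value': os.environ.get('CERTBOT_VALIDATION'),
}
request = requests.post('http://localhost:8055/set-txt', data=json.dumps(data))
request.raise_for_status()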
View File

@@ -8,6 +8,7 @@ import subprocess
import time
from typing import Iterable
from typing import Generator
from typing import Tuple
from typing import Type
from cryptography.hazmat.primitives.asymmetric.ec import EllipticCurve
@@ -77,7 +78,15 @@ def test_registration_override(context: IntegrationTestsContext) -> None:
context.certbot(['register', '--email', 'ex1@domain.org,ex2@domain.org'])
context.certbot(['update_account', '--email', 'example@domain.org'])
stdout1, _ = context.certbot(['show_account'])
context.certbot(['update_account', '--email', 'ex1@domain.org,ex2@domain.org'])
stdout2, _ = context.certbot(['show_account'])
# https://github.com/letsencrypt/boulder/issues/6144
if context.acme_server != 'boulder-v2':
assert 'example@domain.org' in stdout1, "New email should be present"
assert 'example@domain.org' not in stdout2, "Old email should not be present"
assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
def test_prepare_plugins(context: IntegrationTestsContext) -> None:
@@ -103,7 +112,7 @@ def test_http_01(context: IntegrationTestsContext) -> None:
assert_hook_execution(context.hook_probe, 'deploy')
assert_saved_renew_hook(context.config_dir, certname)
assert_saved_lineage_option(context.config_dir, certname, 'key_type', 'rsa')
assert_saved_lineage_option(context.config_dir, certname, 'key_type', 'ecdsa')
def test_manual_http_auth(context: IntegrationTestsContext) -> None:
@@ -306,23 +315,22 @@ def test_graceful_renew_it_is_time(context: IntegrationTestsContext) -> None:
def test_renew_with_changed_private_key_complexity(context: IntegrationTestsContext) -> None:
"""Test proper renew with updated private key complexity."""
certname = context.get_domain('renew')
context.certbot(['-d', certname, '--rsa-key-size', '4096'])
context.certbot(['-d', certname, '--key-type', 'rsa', '--rsa-key-size', '4096'])
key1 = join(context.config_dir, 'archive', certname, 'privkey1.pem')
assert os.stat(key1).st_size > 3000 # 4096-bit keys take more than 3000 bytes
assert_rsa_key(key1, 4096)
assert_cert_count_for_lineage(context.config_dir, certname, 1)
context.certbot(['renew'])
assert_cert_count_for_lineage(context.config_dir, certname, 2)
key2 = join(context.config_dir, 'archive', certname, 'privkey2.pem')
assert os.stat(key2).st_size > 3000
assert_rsa_key(key2, 4096)
context.certbot(['renew', '--rsa-key-size', '2048'])
assert_cert_count_for_lineage(context.config_dir, certname, 3)
key3 = join(context.config_dir, 'archive', certname, 'privkey3.pem')
assert os.stat(key3).st_size < 1800 # 2048-bit keys take less than 1800 bytes
assert_rsa_key(key3, 2048)
def test_renew_ignoring_directory_hooks(context: IntegrationTestsContext) -> None:
@@ -428,41 +436,88 @@ def test_reuse_key(context: IntegrationTestsContext) -> None:
with open(join(context.config_dir, 'archive/{0}/privkey1.pem').format(certname), 'r') as file:
privkey1 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert1.pem').format(certname), 'r') as file:
cert1 = file.read()
with open(join(context.config_dir, 'archive/{0}/privkey2.pem').format(certname), 'r') as file:
privkey2 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert2.pem').format(certname), 'r') as file:
cert2 = file.read()
assert privkey1 == privkey2
context.certbot(['--cert-name', certname, '--domains', certname, '--force-renewal'])
with open(join(context.config_dir, 'archive/{0}/privkey3.pem').format(certname), 'r') as file:
privkey3 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert3.pem').format(certname), 'r') as file:
cert3 = file.read()
assert privkey2 != privkey3
context.certbot(['--cert-name', certname, '--domains', certname,
'--reuse-key','--force-renewal'])
context.certbot(['renew', '--cert-name', certname, '--no-reuse-key', '--force-renewal'])
context.certbot(['renew', '--cert-name', certname, '--force-renewal'])
with open(join(context.config_dir, 'archive/{0}/privkey4.pem').format(certname), 'r') as file:
privkey4 = file.read()
context.certbot(['renew', '--cert-name', certname, '--no-reuse-key', '--force-renewal'])
with open(join(context.config_dir, 'archive/{0}/privkey5.pem').format(certname), 'r') as file:
privkey5 = file.read()
context.certbot(['renew', '--cert-name', certname, '--force-renewal'])
with open(join(context.config_dir, 'archive/{0}/privkey6.pem').format(certname), 'r') as file:
privkey6 = file.read()
assert privkey3 == privkey4
assert privkey4 != privkey5
assert privkey5 != privkey6
with open(join(context.config_dir, 'archive/{0}/cert1.pem').format(certname), 'r') as file:
cert1 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert2.pem').format(certname), 'r') as file:
cert2 = file.read()
with open(join(context.config_dir, 'archive/{0}/cert3.pem').format(certname), 'r') as file:
cert3 = file.read()
assert len({cert1, cert2, cert3}) == 3
def test_new_key(context: IntegrationTestsContext) -> None:
"""Tests --new-key and its interactions with --reuse-key"""
def private_key(generation: int) -> Tuple[str, str]:
pk_path = join(context.config_dir, f'archive/{certname}/privkey{generation}.pem')
with open(pk_path, 'r') as file:
return file.read(), pk_path
certname = context.get_domain('newkey')
context.certbot(['--domains', certname, '--reuse-key',
'--key-type', 'ecdsa', '--elliptic-curve', 'secp384r1'])
privkey1, _ = private_key(1)
# renew: --new-key should replace the key, but keep reuse_key and the key type + params
context.certbot(['renew', '--cert-name', certname, '--new-key'])
privkey2, privkey2_path = private_key(2)
assert privkey1 != privkey2
assert_saved_lineage_option(context.config_dir, certname, 'reuse_key', 'True')
assert_elliptic_key(privkey2_path, SECP384R1)
# certonly: it should replace the key but the elliptic curve will change
context.certbot(['certonly', '-d', certname, '--reuse-key', '--new-key'])
privkey3, privkey3_path = private_key(3)
assert privkey2 != privkey3
assert_saved_lineage_option(context.config_dir, certname, 'reuse_key', 'True')
assert_elliptic_key(privkey3_path, SECP256R1)
# certonly: it should be possible to change the key type and keep reuse_key
context.certbot(['certonly', '-d', certname, '--reuse-key', '--new-key', '--key-type', 'rsa',
'--rsa-key-size', '4096', '--cert-name', certname])
privkey4, privkey4_path = private_key(4)
assert privkey3 != privkey4
assert_saved_lineage_option(context.config_dir, certname, 'reuse_key', 'True')
assert_rsa_key(privkey4_path, 4096)
# certonly: it should not be possible to change a key parameter without --new-key
with pytest.raises(subprocess.CalledProcessError) as error:
context.certbot(['certonly', '-d', certname, '--key-type', 'rsa', '--reuse-key',
'--rsa-key-size', '2048'])
assert 'Unable to change the --rsa-key-size' in error.value.stderr
# certonly: not specifying --key-type should keep the existing key type (non-interactively).
context.certbot(['certonly', '-d', certname, '--no-reuse-key'])
privkey5, privkey5_path = private_key(5)
assert_rsa_key(privkey5_path, 2048)
assert privkey4 != privkey5
def test_incorrect_key_type(context: IntegrationTestsContext) -> None:
with pytest.raises(subprocess.CalledProcessError):
context.certbot(['--key-type="failwhale"'])
@@ -490,24 +545,24 @@ def test_ecdsa(context: IntegrationTestsContext) -> None:
def test_default_key_type(context: IntegrationTestsContext) -> None:
"""Test default key type is RSA"""
"""Test default key type is ECDSA"""
certname = context.get_domain('renew')
context.certbot([
'certonly',
'--cert-name', certname, '-d', certname
])
filename = join(context.config_dir, 'archive/{0}/privkey1.pem').format(certname)
assert_rsa_key(filename)
assert_elliptic_key(filename, SECP256R1)
def test_default_curve_type(context: IntegrationTestsContext) -> None:
"""test that the curve used when not specifying any is secp256r1"""
def test_default_rsa_size(context: IntegrationTestsContext) -> None:
"""test that the RSA key size used when not specifying any is 2048"""
certname = context.get_domain('renew')
context.certbot([
'--key-type', 'ecdsa', '--cert-name', certname, '-d', certname
'--key-type', 'rsa', '--cert-name', certname, '-d', certname
])
key1 = join(context.config_dir, 'archive/{0}/privkey1.pem'.format(certname))
assert_elliptic_key(key1, SECP256R1)
assert_rsa_key(key1, 2048)
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
@@ -558,7 +613,6 @@ def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:
# to the lineage key type, Certbot should keep the lineage key type. The curve will still
# change to the default value, in order to stay consistent with the behavior of certonly.
context.certbot(['certonly', '--force-renewal', '-d', certname])
assert_cert_count_for_lineage(context.config_dir, certname, 3)
key3 = join(context.config_dir, 'archive', certname, 'privkey3.pem')
assert 200 < os.stat(key3).st_size < 250 # ec keys of 256 bits are ~225 bytes
assert_elliptic_key(key3, SECP256R1)
@@ -572,14 +626,12 @@ def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:
context.certbot(['certonly', '--force-renewal', '-d', certname,
'--key-type', 'rsa', '--cert-name', certname])
assert_cert_count_for_lineage(context.config_dir, certname, 4)
key4 = join(context.config_dir, 'archive', certname, 'privkey4.pem')
assert_rsa_key(key4)
# We expect the previous behavior of requiring both --cert-name and
# --key-type to be set not to apply to the renew subcommand.
context.certbot(['renew', '--force-renewal', '--key-type', 'ecdsa'])
assert_cert_count_for_lineage(context.config_dir, certname, 5)
key5 = join(context.config_dir, 'archive', certname, 'privkey5.pem')
assert 200 < os.stat(key5).st_size < 250 # ec keys of 256 bits are ~225 bytes
assert_elliptic_key(key5, SECP256R1)

View File

@@ -9,6 +9,7 @@ import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.nginx_tests import nginx_config as config
from certbot_integration_tests.utils import certbot_call
from certbot_integration_tests.utils import constants
from certbot_integration_tests.utils import misc
@@ -28,7 +29,7 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
self.nginx_config_path = os.path.join(self.nginx_root, 'nginx.conf')
self.nginx_config: str
default_server = request.param['default_server'] # type: ignore[attr-defined]
default_server = request.param['default_server']
self.process = self._start_nginx(default_server)
def cleanup(self) -> None:
@@ -65,4 +66,4 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
def _stop_nginx(self) -> None:
assert self.process.poll() is None
self.process.terminate()
self.process.wait()
self.process.wait(constants.MAX_SUBPROCESS_WAIT)

View File

@@ -21,7 +21,7 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
self.request = request
if hasattr(request.config, 'workerinput'): # Worker node
self._dns_xdist = request.config.workerinput['dns_xdist'] # type: ignore[attr-defined]
self._dns_xdist = request.config.workerinput['dns_xdist']
else: # Primary node
self._dns_xdist = request.config.dns_xdist # type: ignore[attr-defined]

View File

@@ -18,6 +18,7 @@ from typing import Dict
from typing import List
from typing import Mapping
from typing import Optional
from typing import Tuple
from typing import Type
import requests
@@ -63,6 +64,7 @@ class ACMEServer:
self._stdout = sys.stdout if stdout else open(os.devnull, 'w') # pylint: disable=consider-using-with
self._dns_server = dns_server
self._http_01_port = http_01_port
self._preterminate_cmds_args: List[Tuple[Tuple[Any, ...], Dict[str, Any]]] = []
if http_01_port != DEFAULT_HTTP_01_PORT:
if self._acme_type != 'pebble' or self._proxy:
raise ValueError('setting http_01_port is not currently supported '
@@ -85,6 +87,7 @@ class ACMEServer:
"""Stop the test stack, and clean its resources"""
print('=> Tear down the test infrastructure...')
try:
self._run_preterminate_cmds()
for process in self._processes:
try:
process.terminate()
@@ -94,17 +97,7 @@ class ACMEServer:
if e.errno != errno.ESRCH:
raise
for process in self._processes:
process.wait()
if os.path.exists(os.path.join(self._workspace, 'boulder')):
# Boulder docker generates build artifacts owned by root with 0o744 permissions.
# If we started the acme server from a normal user that has access to the Docker
# daemon, this user will not be able to delete these artifacts from the host.
# We need to do it through a Docker container.
process = self._launch_process(['docker', 'run', '--rm', '-v',
'{0}:/workspace'.format(self._workspace),
'alpine', 'rm', '-rf', '/workspace/boulder'])
process.wait()
process.wait(MAX_SUBPROCESS_WAIT)
finally:
if os.path.exists(self._workspace):
shutil.rmtree(self._workspace)
@@ -122,27 +115,20 @@ class ACMEServer:
def _construct_acme_xdist(self, acme_server: str, nodes: List[str]) -> None:
"""Generate and return the acme_xdist dict"""
acme_xdist = {'acme_server': acme_server, 'challtestsrv_port': CHALLTESTSRV_PORT}
acme_xdist: Dict[str, Any] = {'acme_server': acme_server}
# Directory and ACME port are set implicitly in the docker-compose.yml
# files of Boulder/Pebble.
if acme_server == 'pebble':
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
else: # boulder
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL
acme_xdist['challtestsrv_url'] = BOULDER_V2_CHALLTESTSRV_URL
acme_xdist['http_port'] = {
node: port for (node, port) in # pylint: disable=unnecessary-comprehension
zip(nodes, range(5200, 5200 + len(nodes)))
}
acme_xdist['https_port'] = {
node: port for (node, port) in # pylint: disable=unnecessary-comprehension
zip(nodes, range(5100, 5100 + len(nodes)))
}
acme_xdist['other_port'] = {
node: port for (node, port) in # pylint: disable=unnecessary-comprehension
zip(nodes, range(5300, 5300 + len(nodes)))
}
acme_xdist['http_port'] = dict(zip(nodes, range(5200, 5200 + len(nodes))))
acme_xdist['https_port'] = dict(zip(nodes, range(5100, 5100 + len(nodes))))
acme_xdist['other_port'] = dict(zip(nodes, range(5300, 5300 + len(nodes))))
self.acme_xdist = acme_xdist
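The port maps themselves are unchanged; dict(zip(...)) is simply a terser spelling of the removed comprehensions. With two hypothetical xdist workers:

nodes = ['gw0', 'gw1']  # hypothetical pytest-xdist worker ids
http_ports = dict(zip(nodes, range(5200, 5200 + len(nodes))))
# -> {'gw0': 5200, 'gw1': 5201}, identical to the old dict comprehension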
@@ -182,7 +168,7 @@ class ACMEServer:
# Wait for the ACME CA server to be up.
print('=> Waiting for pebble instance to respond...')
misc.check_until_timeout(self.acme_xdist['directory_url']) # type: ignore[arg-type]
misc.check_until_timeout(self.acme_xdist['directory_url'])
print('=> Finished pebble instance deployment.')
@@ -194,7 +180,7 @@ class ACMEServer:
# Load Boulder from git, that includes a docker-compose.yml ready for production.
process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
'--single-branch', '--depth=1', instance_path])
process.wait()
process.wait(MAX_SUBPROCESS_WAIT)
# Allow Boulder to ignore usual limit rate policies, useful for tests.
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
@@ -209,6 +195,17 @@ class ACMEServer:
with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'w') as f:
f.write(json.dumps(config, indent=2, separators=(',', ': ')))
# This command needs to be run before we try and terminate running processes because
# docker-compose up doesn't always respond to SIGTERM. See
# https://github.com/certbot/certbot/pull/9435.
self._register_preterminate_cmd(['docker-compose', 'down'], cwd=instance_path)
# Boulder docker generates build artifacts owned by root with 0o744 permissions.
# If we started the acme server from a normal user that has access to the Docker
# daemon, this user will not be able to delete these artifacts from the host.
# We need to do it through a Docker container.
self._register_preterminate_cmd(['docker', 'run', '--rm', '-v',
'{0}:/workspace'.format(self._workspace), 'alpine', 'rm',
'-rf', '/workspace/boulder'])
try:
# Launch the Boulder server
self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
@@ -216,12 +213,14 @@ class ACMEServer:
# Wait for the ACME CA server to be up.
print('=> Waiting for boulder instance to respond...')
misc.check_until_timeout(
self.acme_xdist['directory_url'], attempts=300) # type: ignore[arg-type]
self.acme_xdist['directory_url'], attempts=300)
if not self._dns_server:
# Configure challtestsrv to answer any A record request with the IP of the Docker host.
response = requests.post('http://localhost:{0}/set-default-ipv4'.format(
CHALLTESTSRV_PORT), json={'ip': '10.77.77.1'}
response = requests.post(
f'{BOULDER_V2_CHALLTESTSRV_URL}/set-default-ipv4',
json={'ip': '10.77.77.1'},
timeout=10
)
response.raise_for_status()
except BaseException:
@@ -230,7 +229,7 @@ class ACMEServer:
process = self._launch_process([
'docker-compose', 'logs'], cwd=instance_path, force_stderr=True
)
process.wait()
process.wait(MAX_SUBPROCESS_WAIT)
raise
print('=> Finished boulder instance deployment.')
@@ -259,6 +258,17 @@ class ACMEServer:
self._processes.append(process)
return process
def _register_preterminate_cmd(self, *args: Any, **kwargs: Any) -> None:
self._preterminate_cmds_args.append((args, kwargs))
def _run_preterminate_cmds(self) -> None:
for args, kwargs in self._preterminate_cmds_args:
process = self._launch_process(*args, **kwargs)
process.wait(MAX_SUBPROCESS_WAIT)
# It's unlikely to matter, but let's clear the list of cleanup commands
# once they've been run.
self._preterminate_cmds_args.clear()
def main() -> None:
# pylint: disable=missing-function-docstring

View File

@@ -2,8 +2,11 @@
DEFAULT_HTTP_01_PORT = 5002
TLS_ALPN_01_PORT = 5001
CHALLTESTSRV_PORT = 8055
BOULDER_V2_CHALLTESTSRV_URL = f'http://10.77.77.77:{CHALLTESTSRV_PORT}'
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'
PEBBLE_CHALLTESTSRV_URL = f'http://localhost:{CHALLTESTSRV_PORT}'
MOCK_OCSP_SERVER_PORT = 4002
PEBBLE_ALTERNATE_ROOTS = 2
MAX_SUBPROCESS_WAIT = 120

View File

@@ -17,6 +17,8 @@ from typing import Type
from pkg_resources import resource_filename
from certbot_integration_tests.utils import constants
BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.16"
BIND_BIND_ADDRESS = ("127.0.0.1", 45953)
@@ -67,8 +69,8 @@ class DNSServer:
if self.process:
try:
self.process.terminate()
self.process.wait()
except BaseException as e:
self.process.wait(constants.MAX_SUBPROCESS_WAIT)
except BaseException as e: # pylint: disable=broad-except
print("BIND9 did not stop cleanly: {}".format(e), file=sys.stderr)
shutil.rmtree(self.bind_root, ignore_errors=True)

View File

@@ -4,8 +4,8 @@ or outside during setup/teardown of the integration tests environment.
"""
import contextlib
import errno
import functools
import http.server as SimpleHTTPServer
import multiprocessing
import os
import re
import shutil
@@ -13,6 +13,7 @@ import socketserver
import stat
import sys
import tempfile
import threading
import time
import warnings
from typing import Generator
@@ -64,9 +65,9 @@ def check_until_timeout(url: str, attempts: int = 30) -> None:
for _ in range(attempts):
time.sleep(1)
try:
if requests.get(url, verify=False).status_code == 200:
if requests.get(url, verify=False, timeout=10).status_code == 200:
return
except requests.exceptions.ConnectionError:
except requests.exceptions.RequestException:
pass
raise ValueError('Error, url did not respond after {0} attempts: {1}'.format(attempts, url))
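For context, the ACME fixtures earlier in this diff call this helper with the directory URLs defined in constants.py:

# Usage as seen in the Pebble and Boulder setup paths above.
check_until_timeout('https://localhost:14000/dir')                    # Pebble
check_until_timeout('http://localhost:4001/directory', attempts=300)  # Boulder needs longer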
@@ -80,10 +81,6 @@ class GracefulTCPServer(socketserver.TCPServer):
allow_reuse_address = True
def _run_server(port: int) -> None:
GracefulTCPServer(('', port), SimpleHTTPServer.SimpleHTTPRequestHandler).serve_forever()
@contextlib.contextmanager
def create_http_server(port: int) -> Generator[str, None, None]:
"""
@@ -93,30 +90,20 @@ def create_http_server(port: int) -> Generator[str, None, None]:
:param int port: the TCP port to use
:return str: the temporary webroot attached to this server
"""
current_cwd = os.getcwd()
webroot = tempfile.mkdtemp()
process = multiprocessing.Process(target=_run_server, args=(port,))
try:
# SimpleHTTPServer is designed to serve files from the current working directory at the
# time it starts. So we temporarily change the cwd to our crafted webroot before launch.
with tempfile.TemporaryDirectory() as webroot:
# Setting the directory argument of SimpleHTTPRequestHandler causes
# files to be served from that directory.
handler = functools.partial(SimpleHTTPServer.SimpleHTTPRequestHandler, directory=webroot)
server = GracefulTCPServer(('', port), handler)
thread = threading.Thread(target=server.serve_forever)
thread.start()
try:
os.chdir(webroot)
process.start()
check_until_timeout('http://localhost:{0}/'.format(port))
yield webroot
finally:
os.chdir(current_cwd)
check_until_timeout('http://localhost:{0}/'.format(port))
yield webroot
finally:
try:
if process.is_alive():
process.terminate()
process.join() # Block until process is effectively terminated
finally:
shutil.rmtree(webroot)
server.shutdown()
thread.join()
server.server_close()
def list_renewal_hooks_dirs(config_dir: str) -> List[str]:
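The heart of this rewrite is that SimpleHTTPRequestHandler can be bound to a directory via functools.partial (Python 3.7+) instead of chdir-ing the whole process; a minimal self-contained sketch, with the path and port chosen only for illustration:

import functools
import http.server
import socketserver

# Serve files from an explicit directory, independent of the process cwd.
handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory='/tmp/webroot')
with socketserver.TCPServer(('', 5002), handler) as httpd:
    httpd.handle_request()  # handle a single request, just to demonstrate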
@@ -344,7 +331,9 @@ def get_acme_issuers(context: IntegrationTestsContext) -> List[Certificate]:
issuers = []
for i in range(PEBBLE_ALTERNATE_ROOTS + 1):
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediates/{}'.format(i), verify=False)
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediates/{}'.format(i),
verify=False,
timeout=10)
issuers.append(load_pem_x509_certificate(request.content, default_backend()))
return issuers

View File

@@ -11,7 +11,7 @@ import requests
from certbot_integration_tests.utils.constants import DEFAULT_HTTP_01_PORT
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT
PEBBLE_VERSION = 'v2.3.0'
PEBBLE_VERSION = 'v2.3.1'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')
@@ -31,7 +31,7 @@ def _fetch_asset(asset: str, suffix: str) -> str:
if not os.path.exists(asset_path):
asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
.format(PEBBLE_VERSION, asset, suffix))
response = requests.get(asset_url)
response = requests.get(asset_url, timeout=30)
response.raise_for_status()
with open(asset_path, 'wb') as file_h:
file_h.write(response.content)

View File

@@ -23,10 +23,12 @@ from certbot_integration_tests.utils.misc import GracefulTCPServer
class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
# pylint: disable=missing-function-docstring
def do_POST(self) -> None:
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediate-keys/0', verify=False)
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediate-keys/0',
verify=False, timeout=10)
issuer_key = serialization.load_pem_private_key(request.content, None, default_backend())
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediates/0', verify=False)
request = requests.get(PEBBLE_MANAGEMENT_URL + '/intermediates/0',
verify=False, timeout=10)
issuer_cert = x509.load_pem_x509_certificate(request.content, default_backend())
content_len = int(self.headers.get('Content-Length'))
@@ -34,7 +36,7 @@ class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
ocsp_request = ocsp.load_der_ocsp_request(self.rfile.read(content_len))
response = requests.get('{0}/cert-status-by-serial/{1}'.format(
PEBBLE_MANAGEMENT_URL, str(hex(ocsp_request.serial_number)).replace('0x', '')),
verify=False
verify=False, timeout=10
)
if not response.ok:


@@ -21,7 +21,7 @@ def _create_proxy(mapping: Mapping[str, str]) -> Type[BaseHTTPServer.BaseHTTPReq
headers = {key.lower(): value for key, value in self.headers.items()}
backend = [backend for pattern, backend in mapping.items()
if re.match(pattern, headers['host'])][0]
response = requests.get(backend + self.path, headers=headers)
response = requests.get(backend + self.path, headers=headers, timeout=10)
self.send_response(response.status_code)
for key, value in response.headers.items():
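The change above only adds a request timeout, but the handler it sits in is easy to lose in the diff: the test proxy picks a backend by matching the incoming Host header against a regex-to-URL mapping and then relays the upstream response. A rough self-contained sketch of that idea; the mapping, the port, and the response-relaying tail are illustrative assumptions, not the project's code:

import http.server
import re
import socketserver

import requests

# Example mapping only: route hosts starting with "acme." to one backend and
# everything else to another. The real tests build this mapping dynamically.
MAPPING = {r'^acme\.': 'http://127.0.0.1:14000', r'.*': 'http://127.0.0.1:5002'}


class _ProxyHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self) -> None:
        headers = {key.lower(): value for key, value in self.headers.items()}
        backend = [backend for pattern, backend in MAPPING.items()
                   if re.match(pattern, headers['host'])][0]
        # timeout= keeps a dead backend from hanging the proxy thread.
        response = requests.get(backend + self.path, headers=headers, timeout=10)
        self.send_response(response.status_code)
        # Relay headers, skipping the ones that no longer match the decoded body.
        for key, value in response.headers.items():
            if key.lower() not in ('transfer-encoding', 'content-encoding',
                                   'content-length', 'connection'):
                self.send_header(key, value)
        self.send_header('Content-Length', str(len(response.content)))
        self.end_headers()
        self.wfile.write(response.content)


if __name__ == '__main__':
    socketserver.TCPServer(('', 8080), _ProxyHandler).serve_forever()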


@@ -15,7 +15,6 @@ if parse_version(setuptools_version) < parse_version(min_setuptools_version):
install_requires = [
'coverage',
'cryptography',
'docker-compose',
'pyopenssl',
'pytest',
'pytest-cov',
@@ -51,6 +50,7 @@ setup(
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],


@@ -33,7 +33,6 @@ from acme import messages
from certbot import achallenges
from certbot import errors as le_errors
from certbot._internal.display import obj as display_obj
from certbot.display import util as display_util
from certbot.tests import acme_util
DESCRIPTION = """
@@ -339,7 +338,7 @@ def setup_logging(args: argparse.Namespace) -> None:
def setup_display() -> None:
""""Prepares a display utility instance for the Certbot plugins """
displayer = display_util.NoninteractiveDisplay(sys.stdout)
displayer = display_obj.NoninteractiveDisplay(sys.stdout)
display_obj.set_display(displayer)


@@ -15,6 +15,9 @@ from acme import errors as acme_errors
logger = logging.getLogger(__name__)
_VALIDATION_TIMEOUT = 10
class Validator:
"""Collection of functions to test a live webserver's configuration"""
@@ -43,9 +46,12 @@ class Validator:
"""Test whether webserver redirects to secure connection."""
url = "http://{0}:{1}".format(name, port)
if headers:
response = requests.get(url, headers=headers, allow_redirects=False)
response = requests.get(url, headers=headers,
allow_redirects=False,
timeout=_VALIDATION_TIMEOUT)
else:
response = requests.get(url, allow_redirects=False)
response = requests.get(url, allow_redirects=False,
timeout=_VALIDATION_TIMEOUT)
redirect_location = response.headers.get("location", "")
# We're checking that the redirect we added behaves correctly.
@@ -65,15 +71,19 @@ class Validator:
"""Test whether webserver redirects."""
url = "http://{0}:{1}".format(name, port)
if headers:
response = requests.get(url, headers=headers, allow_redirects=False)
response = requests.get(url, headers=headers,
allow_redirects=False,
timeout=_VALIDATION_TIMEOUT)
else:
response = requests.get(url, allow_redirects=False)
response = requests.get(url, allow_redirects=False,
timeout=_VALIDATION_TIMEOUT)
return response.status_code in range(300, 309)
def hsts(self, name: str) -> bool:
"""Test for HTTP Strict Transport Security header"""
headers = requests.get("https://" + name).headers
headers = requests.get("https://" + name,
timeout=_VALIDATION_TIMEOUT).headers
hsts_header = headers.get("strict-transport-security")
if not hsts_header:
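All of the Validator probes above now pass an explicit 10-second timeout. A condensed sketch of the redirect and HSTS checks with the timeouts applied (plain functions for brevity; in the real class these are methods, and only header presence is checked for HSTS here):

from typing import Mapping, Optional

import requests

_VALIDATION_TIMEOUT = 10


def any_redirect(name: str, port: int,
                 headers: Optional[Mapping[str, str]] = None) -> bool:
    """Return True if http://name:port answers with a 3xx redirect."""
    url = "http://{0}:{1}".format(name, port)
    response = requests.get(url, headers=headers, allow_redirects=False,
                            timeout=_VALIDATION_TIMEOUT)
    return response.status_code in range(300, 309)


def hsts(name: str) -> bool:
    """Return True if https://name sends a Strict-Transport-Security header."""
    headers = requests.get("https://" + name,
                           timeout=_VALIDATION_TIMEOUT).headers
    return headers.get("strict-transport-security") is not None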


@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup
version = '1.25.0.dev0'
version = '2.3.0.dev0'
install_requires = [
'certbot',
@@ -29,6 +29,7 @@ setup(
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],


@@ -39,7 +39,7 @@ The Token needed by Certbot requires ``Zone:DNS:Edit`` permissions for only the
zones you need certificates for.
Using Cloudflare Tokens also requires at least version 2.3.1 of the ``cloudflare``
python module. If the version that automatically installed with this plugin is
Python module. If the version that automatically installed with this plugin is
older than that, and you can't upgrade it on your system, you'll have to stick to
the Global key.
@@ -77,6 +77,18 @@ file. This warning will be emitted each time Certbot uses the credentials file,
including for renewal, and cannot be silenced except by addressing the issue
(e.g., by using a command like ``chmod 600`` to restrict access to the file).
.. note::
Please note that the ``cloudflare`` Python module used by the plugin has
additional methods of providing credentials to the module, e.g. environment
variables or the ``cloudflare.cfg`` configuration file. These methods are not
supported by Certbot. If any of those additional methods of providing
credentials is being used, they must provide the same credentials (i.e.,
email and API key *or* an API token) as the credentials file provided to
Certbot. If there is a discrepancy, the ``cloudflare`` Python module will
raise an error. Also note that the credentials provided to Certbot will take
precedence over any other method of providing credentials to the ``cloudflare``
Python module.
Examples
--------


@@ -82,8 +82,9 @@ class Authenticator(dns_common.DNSAuthenticator):
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
if self.credentials.conf('api-token'):
return _CloudflareClient(None, self.credentials.conf('api-token'))
return _CloudflareClient(self.credentials.conf('email'), self.credentials.conf('api-key'))
return _CloudflareClient(api_token = self.credentials.conf('api-token'))
return _CloudflareClient(email = self.credentials.conf('email'),
api_key = self.credentials.conf('api-key'))
class _CloudflareClient:
@@ -91,8 +92,19 @@ class _CloudflareClient:
Encapsulates all communication with the Cloudflare API.
"""
def __init__(self, email: Optional[str], api_key: str) -> None:
self.cf = CloudFlare.CloudFlare(email, api_key)
def __init__(self, email: Optional[str] = None, api_key: Optional[str] = None,
api_token: Optional[str] = None) -> None:
if email:
# If an email was specified, we're using an email/key combination and not a token.
# We can't use named arguments in this case, as it would break compatibility with
# the Cloudflare library since version 2.10.1, as the `token` argument was used for
# tokens and keys alike and the `key` argument did not exist in earlier versions.
self.cf = CloudFlare.CloudFlare(email, api_key)
else:
# If no email was specified, we're using just a token. Let's use the named argument
# for simplicity, which is compatible with all (current) versions of the Cloudflare
# library.
self.cf = CloudFlare.CloudFlare(token=api_token)
def add_txt_record(self, domain: str, record_name: str, record_content: str,
record_ttl: int) -> None:
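Read together, the two hunks above switch the authenticator to keyword arguments and teach _CloudflareClient to build the CloudFlare client differently for token versus email/key credentials, keeping positional arguments in the email/key case for compatibility with cloudflare releases older than 2.10.1 (where key= did not exist and token= served for both). A condensed sketch of that dispatch as a standalone helper; make_cloudflare_client is an illustrative name, not part of the plugin:

from typing import Optional

import CloudFlare


def make_cloudflare_client(email: Optional[str] = None,
                           api_key: Optional[str] = None,
                           api_token: Optional[str] = None) -> CloudFlare.CloudFlare:
    if email:
        # email + global API key: positional arguments keep compatibility with
        # older cloudflare releases, where the second parameter accepted keys
        # and tokens alike and key= was not available.
        return CloudFlare.CloudFlare(email, api_key)
    # API token only: the named argument works across current library versions.
    return CloudFlare.CloudFlare(token=api_token)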


@@ -4,7 +4,7 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '1.25.0.dev0'
version = '2.3.0.dev0'
install_requires = [
'cloudflare>=1.5.1',
@@ -51,6 +51,7 @@ setup(
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',


@@ -1,12 +1,9 @@
"""Tests for certbot_dns_cloudflare._internal.dns_cloudflare."""
import unittest
from unittest import mock
import CloudFlare
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import os


@@ -1,190 +0,0 @@
Copyright 2015 Electronic Frontier Foundation and others
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Apache License
Version 2.0, January 2004
http://www.apache.org/licenses/
TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION
1. Definitions.
"License" shall mean the terms and conditions for use, reproduction,
and distribution as defined by Sections 1 through 9 of this document.
"Licensor" shall mean the copyright owner or entity authorized by
the copyright owner that is granting the License.
"Legal Entity" shall mean the union of the acting entity and all
other entities that control, are controlled by, or are under common
control with that entity. For the purposes of this definition,
"control" means (i) the power, direct or indirect, to cause the
direction or management of such entity, whether by contract or
otherwise, or (ii) ownership of fifty percent (50%) or more of the
outstanding shares, or (iii) beneficial ownership of such entity.
"You" (or "Your") shall mean an individual or Legal Entity
exercising permissions granted by this License.
"Source" form shall mean the preferred form for making modifications,
including but not limited to software source code, documentation
source, and configuration files.
"Object" form shall mean any form resulting from mechanical
transformation or translation of a Source form, including but
not limited to compiled object code, generated documentation,
and conversions to other media types.
"Work" shall mean the work of authorship, whether in Source or
Object form, made available under the License, as indicated by a
copyright notice that is included in or attached to the work
(an example is provided in the Appendix below).
"Derivative Works" shall mean any work, whether in Source or Object
form, that is based on (or derived from) the Work and for which the
editorial revisions, annotations, elaborations, or other modifications
represent, as a whole, an original work of authorship. For the purposes
of this License, Derivative Works shall not include works that remain
separable from, or merely link (or bind by name) to the interfaces of,
the Work and Derivative Works thereof.
"Contribution" shall mean any work of authorship, including
the original version of the Work and any modifications or additions
to that Work or Derivative Works thereof, that is intentionally
submitted to Licensor for inclusion in the Work by the copyright owner
or by an individual or Legal Entity authorized to submit on behalf of
the copyright owner. For the purposes of this definition, "submitted"
means any form of electronic, verbal, or written communication sent
to the Licensor or its representatives, including but not limited to
communication on electronic mailing lists, source code control systems,
and issue tracking systems that are managed by, or on behalf of, the
Licensor for the purpose of discussing and improving the Work, but
excluding communication that is conspicuously marked or otherwise
designated in writing by the copyright owner as "Not a Contribution."
"Contributor" shall mean Licensor and any individual or Legal Entity
on behalf of whom a Contribution has been received by Licensor and
subsequently incorporated within the Work.
2. Grant of Copyright License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
copyright license to reproduce, prepare Derivative Works of,
publicly display, publicly perform, sublicense, and distribute the
Work and such Derivative Works in Source or Object form.
3. Grant of Patent License. Subject to the terms and conditions of
this License, each Contributor hereby grants to You a perpetual,
worldwide, non-exclusive, no-charge, royalty-free, irrevocable
(except as stated in this section) patent license to make, have made,
use, offer to sell, sell, import, and otherwise transfer the Work,
where such license applies only to those patent claims licensable
by such Contributor that are necessarily infringed by their
Contribution(s) alone or by combination of their Contribution(s)
with the Work to which such Contribution(s) was submitted. If You
institute patent litigation against any entity (including a
cross-claim or counterclaim in a lawsuit) alleging that the Work
or a Contribution incorporated within the Work constitutes direct
or contributory patent infringement, then any patent licenses
granted to You under this License for that Work shall terminate
as of the date such litigation is filed.
4. Redistribution. You may reproduce and distribute copies of the
Work or Derivative Works thereof in any medium, with or without
modifications, and in Source or Object form, provided that You
meet the following conditions:
(a) You must give any other recipients of the Work or
Derivative Works a copy of this License; and
(b) You must cause any modified files to carry prominent notices
stating that You changed the files; and
(c) You must retain, in the Source form of any Derivative Works
that You distribute, all copyright, patent, trademark, and
attribution notices from the Source form of the Work,
excluding those notices that do not pertain to any part of
the Derivative Works; and
(d) If the Work includes a "NOTICE" text file as part of its
distribution, then any Derivative Works that You distribute must
include a readable copy of the attribution notices contained
within such NOTICE file, excluding those notices that do not
pertain to any part of the Derivative Works, in at least one
of the following places: within a NOTICE text file distributed
as part of the Derivative Works; within the Source form or
documentation, if provided along with the Derivative Works; or,
within a display generated by the Derivative Works, if and
wherever such third-party notices normally appear. The contents
of the NOTICE file are for informational purposes only and
do not modify the License. You may add Your own attribution
notices within Derivative Works that You distribute, alongside
or as an addendum to the NOTICE text from the Work, provided
that such additional attribution notices cannot be construed
as modifying the License.
You may add Your own copyright statement to Your modifications and
may provide additional or different license terms and conditions
for use, reproduction, or distribution of Your modifications, or
for any such Derivative Works as a whole, provided Your use,
reproduction, and distribution of the Work otherwise complies with
the conditions stated in this License.
5. Submission of Contributions. Unless You explicitly state otherwise,
any Contribution intentionally submitted for inclusion in the Work
by You to the Licensor shall be under the terms and conditions of
this License, without any additional terms or conditions.
Notwithstanding the above, nothing herein shall supersede or modify
the terms of any separate license agreement you may have executed
with Licensor regarding such Contributions.
6. Trademarks. This License does not grant permission to use the trade
names, trademarks, service marks, or product names of the Licensor,
except as required for reasonable and customary use in describing the
origin of the Work and reproducing the content of the NOTICE file.
7. Disclaimer of Warranty. Unless required by applicable law or
agreed to in writing, Licensor provides the Work (and each
Contributor provides its Contributions) on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
implied, including, without limitation, any warranties or conditions
of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
PARTICULAR PURPOSE. You are solely responsible for determining the
appropriateness of using or redistributing the Work and assume any
risks associated with Your exercise of permissions under this License.
8. Limitation of Liability. In no event and under no legal theory,
whether in tort (including negligence), contract, or otherwise,
unless required by applicable law (such as deliberate and grossly
negligent acts) or agreed to in writing, shall any Contributor be
liable to You for damages, including any direct, indirect, special,
incidental, or consequential damages of any character arising as a
result of this License or out of the use or inability to use the
Work (including but not limited to damages for loss of goodwill,
work stoppage, computer failure or malfunction, or any and all
other commercial damages or losses), even if such Contributor
has been advised of the possibility of such damages.
9. Accepting Warranty or Additional Liability. While redistributing
the Work or Derivative Works thereof, You may choose to offer,
and charge a fee for, acceptance of support, warranty, indemnity,
or other liability obligations and/or rights consistent with this
License. However, in accepting such obligations, You may act only
on Your own behalf and on Your sole responsibility, not on behalf
of any other Contributor, and only if You agree to indemnify,
defend, and hold each Contributor harmless for any liability
incurred by, or claims asserted against, such Contributor by reason
of your accepting any such warranty or additional liability.
END OF TERMS AND CONDITIONS


@@ -1,7 +0,0 @@
include LICENSE.txt
include README.rst
recursive-include docs *
recursive-include tests *
include certbot_dns_cloudxns/py.typed
global-exclude __pycache__
global-exclude *.py[cod]


@@ -1 +0,0 @@
CloudXNS DNS Authenticator plugin for Certbot


@@ -1,90 +0,0 @@
"""
The `~certbot_dns_cloudxns.dns_cloudxns` plugin automates the process of
completing a ``dns-01`` challenge (`~acme.challenges.DNS01`) by creating, and
subsequently removing, TXT records using the CloudXNS API.
.. note::
The plugin is not installed by default. It can be installed by heading to
`certbot.eff.org <https://certbot.eff.org/instructions#wildcard>`_, choosing your system and
selecting the Wildcard tab.
Named Arguments
---------------
======================================== =====================================
``--dns-cloudxns-credentials`` CloudXNS credentials_ INI file.
(Required)
``--dns-cloudxns-propagation-seconds`` The number of seconds to wait for DNS
to propagate before asking the ACME
server to verify the DNS record.
(Default: 30)
======================================== =====================================
Credentials
-----------
Use of this plugin requires a configuration file containing CloudXNS API
credentials, obtained from your CloudXNS
`API page <https://www.cloudxns.net/en/AccountManage/apimanage.html>`_.
.. code-block:: ini
:name: credentials.ini
:caption: Example credentials file:
# CloudXNS API credentials used by Certbot
dns_cloudxns_api_key = 1234567890abcdef1234567890abcdef
dns_cloudxns_secret_key = 1122334455667788
The path to this file can be provided interactively or using the
``--dns-cloudxns-credentials`` command-line argument. Certbot records the path
to this file for use during renewal, but does not store the file's contents.
.. caution::
You should protect these API credentials as you would the password to your
CloudXNS account. Users who can read this file can use these credentials to
issue arbitrary API calls on your behalf. Users who can cause Certbot to run
using these credentials can complete a ``dns-01`` challenge to acquire new
certificates or revoke existing certificates for associated domains, even if
those domains aren't being managed by this server.
Certbot will emit a warning if it detects that the credentials file can be
accessed by other users on your system. The warning reads "Unsafe permissions
on credentials configuration file", followed by the path to the credentials
file. This warning will be emitted each time Certbot uses the credentials file,
including for renewal, and cannot be silenced except by addressing the issue
(e.g., by using a command like ``chmod 600`` to restrict access to the file).
Examples
--------
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``
certbot certonly \\
--dns-cloudxns \\
--dns-cloudxns-credentials ~/.secrets/certbot/cloudxns.ini \\
-d example.com
.. code-block:: bash
:caption: To acquire a single certificate for both ``example.com`` and
``www.example.com``
certbot certonly \\
--dns-cloudxns \\
--dns-cloudxns-credentials ~/.secrets/certbot/cloudxns.ini \\
-d example.com \\
-d www.example.com
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``, waiting 60 seconds
for DNS propagation
certbot certonly \\
--dns-cloudxns \\
--dns-cloudxns-credentials ~/.secrets/certbot/cloudxns.ini \\
--dns-cloudxns-propagation-seconds 60 \\
-d example.com
"""


@@ -1 +0,0 @@
"""Internal implementation of `~certbot_dns_cloudxns.dns_cloudxns` plugin."""


@@ -1,93 +0,0 @@
"""DNS Authenticator for CloudXNS DNS."""
import logging
from typing import Any
from typing import Callable
from typing import Optional
from lexicon.providers import cloudxns
from requests import HTTPError
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
ACCOUNT_URL = 'https://www.cloudxns.net/en/AccountManage/apimanage.html'
class Authenticator(dns_common.DNSAuthenticator):
"""DNS Authenticator for CloudXNS DNS
This Authenticator uses the CloudXNS DNS API to fulfill a dns-01 challenge.
"""
description = 'Obtain certificates using a DNS TXT record (if you are using CloudXNS for DNS).'
ttl = 60
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
default_propagation_seconds: int = 30) -> None:
super().add_parser_arguments(add, default_propagation_seconds)
add('credentials', help='CloudXNS credentials INI file.')
def more_info(self) -> str:
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the CloudXNS API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'CloudXNS credentials INI file',
{
'api-key': 'API key for CloudXNS account, obtained from {0}'.format(ACCOUNT_URL),
'secret-key': 'Secret key for CloudXNS account, obtained from {0}'
.format(ACCOUNT_URL)
}
)
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_cloudxns_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_cloudxns_client().del_txt_record(domain, validation_name, validation)
def _get_cloudxns_client(self) -> "_CloudXNSLexiconClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _CloudXNSLexiconClient(self.credentials.conf('api-key'),
self.credentials.conf('secret-key'),
self.ttl)
class _CloudXNSLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the CloudXNS via Lexicon.
"""
def __init__(self, api_key: str, secret_key: str, ttl: int) -> None:
super().__init__()
config = dns_common_lexicon.build_lexicon_config('cloudxns', {
'ttl': ttl,
}, {
'auth_username': api_key,
'auth_token': secret_key,
})
self.provider = cloudxns.Provider(config)
def _handle_http_error(self, e: HTTPError, domain_name: str) -> Optional[errors.PluginError]:
hint = None
if str(e).startswith('400 Client Error:'):
hint = 'Are your API key and Secret key values correct?'
hint_disp = f' ({hint})' if hint else ''
return errors.PluginError(f'Error determining zone identifier for {domain_name}: '
f'{e}.{hint_disp}')


@@ -1 +0,0 @@
/_build/


@@ -1,20 +0,0 @@
# Minimal makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
SPHINXPROJ = certbot-dns-cloudxns
SOURCEDIR = .
BUILDDIR = _build
# Put it first so that "make" without argument is like "make help".
help:
@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
.PHONY: help Makefile
# Catch-all target: route all unknown targets to Sphinx using the new
# "make mode" option. $(O) is meant as a shortcut for $(SPHINXOPTS).
%: Makefile
@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)


@@ -1,5 +0,0 @@
=================
API Documentation
=================
Certbot plugins implement the Certbot plugins API, and do not otherwise have an external API.


@@ -1,181 +0,0 @@
# -*- coding: utf-8 -*-
#
# certbot-dns-cloudxns documentation build configuration file, created by
# sphinx-quickstart on Wed May 10 16:05:50 2017.
#
# This file is execfile()d with the current directory set to its
# containing dir.
#
# Note that not all possible configuration values are present in this
# autogenerated file.
#
# All configuration values have a default; values that are commented out
# serve to show the default.
# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
# -- General configuration ------------------------------------------------
# If your documentation needs a minimal Sphinx version, state it here.
#
needs_sphinx = '1.0'
# Add any Sphinx extension module names here, as strings. They can be
# extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
# ones.
extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
source_suffix = '.rst'
# The master toctree document.
master_doc = 'index'
# General information about the project.
project = u'certbot-dns-cloudxns'
copyright = u'2017, Certbot Project'
author = u'Certbot Project'
# The version info for the project you're documenting, acts as replacement for
# |version| and |release|, also used in various other places throughout the
# built documents.
#
# The short X.Y version.
version = u'0'
# The full version, including alpha/beta/rc tags.
release = u'0'
# The language for content autogenerated by Sphinx. Refer to documentation
# for a list of supported languages.
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = 'en'
# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
default_role = 'py:obj'
# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
# -- Options for HTML output ----------------------------------------------
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
# documentation.
#
# html_theme_options = {}
# Add any paths that contain custom static files (such as style sheets) here,
# relative to this directory. They are copied after the builtin static files,
# so a file named "default.css" will overwrite the builtin "default.css".
#html_static_path = ['_static']
# -- Options for HTMLHelp output ------------------------------------------
# Output file base name for HTML help builder.
htmlhelp_basename = 'certbot-dns-cloudxnsdoc'
# -- Options for LaTeX output ---------------------------------------------
latex_elements = {
# The paper size ('letterpaper' or 'a4paper').
#
# 'papersize': 'letterpaper',
# The font size ('10pt', '11pt' or '12pt').
#
# 'pointsize': '10pt',
# Additional stuff for the LaTeX preamble.
#
# 'preamble': '',
# Latex figure (float) alignment
#
# 'figure_align': 'htbp',
}
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title,
# author, documentclass [howto, manual, or own class]).
latex_documents = [
(master_doc, 'certbot-dns-cloudxns.tex', u'certbot-dns-cloudxns Documentation',
u'Certbot Project', 'manual'),
]
# -- Options for manual page output ---------------------------------------
# One entry per manual page. List of tuples
# (source start file, name, description, authors, manual section).
man_pages = [
(master_doc, 'certbot-dns-cloudxns', u'certbot-dns-cloudxns Documentation',
[author], 1)
]
# -- Options for Texinfo output -------------------------------------------
# Grouping the document tree into Texinfo files. List of tuples
# (source start file, target name, title, author,
# dir menu entry, description, category)
texinfo_documents = [
(master_doc, 'certbot-dns-cloudxns', u'certbot-dns-cloudxns Documentation',
author, 'certbot-dns-cloudxns', 'One line description of project.',
'Miscellaneous'),
]
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}


@@ -1,28 +0,0 @@
.. certbot-dns-cloudxns documentation master file, created by
sphinx-quickstart on Wed May 10 16:05:50 2017.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Welcome to certbot-dns-cloudxns's documentation!
================================================
.. toctree::
:maxdepth: 2
:caption: Contents:
.. automodule:: certbot_dns_cloudxns
:members:
.. toctree::
:maxdepth: 1
api
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`


@@ -1,36 +0,0 @@
@ECHO OFF
pushd %~dp0
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set SOURCEDIR=.
set BUILDDIR=_build
set SPHINXPROJ=certbot-dns-cloudxns
if "%1" == "" goto help
%SPHINXBUILD% >NUL 2>NUL
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.https://www.sphinx-doc.org/
exit /b 1
)
%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
goto end
:help
%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
:end
popd


@@ -1,15 +0,0 @@
# readthedocs.org gives no way to change the install command to "pip
# install -e certbot-dns-cloudxns[docs]" (that would in turn install documentation
# dependencies), but it allows to specify a requirements.txt file at
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
# Although ReadTheDocs certainly doesn't need to install the project
# in --editable mode (-e), just "pip install certbot-dns-cloudxns[docs]" does not work as
# expected and "pip install -e certbot-dns-cloudxns[docs]" must be used instead
# We also pin our dependencies for increased stability.
-c ../tools/requirements.txt
-e acme
-e certbot
-e certbot-dns-cloudxns[docs]


@@ -1,73 +0,0 @@
import os
import sys
from setuptools import find_packages
from setuptools import setup
version = '1.25.0.dev0'
install_requires = [
'dns-lexicon>=3.2.1',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
# https://github.com/certbot/certbot/issues/8761 for more info.
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
'sphinx_rtd_theme',
]
setup(
name='certbot-dns-cloudxns',
version=version,
description="CloudXNS DNS Authenticator plugin for Certbot",
url='https://github.com/certbot/certbot',
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
'Topic :: System :: Networking',
'Topic :: System :: Systems Administration',
'Topic :: Utilities',
],
packages=find_packages(),
include_package_data=True,
install_requires=install_requires,
extras_require={
'docs': docs_extras,
},
entry_points={
'certbot.plugins': [
'dns-cloudxns = certbot_dns_cloudxns._internal.dns_cloudxns:Authenticator',
],
},
)

Some files were not shown because too many files have changed in this diff.