Compare commits

...

212 Commits

Author SHA1 Message Date
Will Greenberg
a2f86b5514 Migrate master branch to main
We're a few years behind the curve on this one, but using "master" as a
programming term is a callous practice that explicitly uses the
historical institution of slavery as a cheap, racist metaphor. Switch to
using "main", as it's the new default in git and GitHub.
2024-09-26 14:51:10 -07:00
Brad Warren
4b51e3004c remove certbot_dns_route53.authenticator (#10014)
This is another and very minor piece of https://github.com/certbot/certbot/issues/9988.

We've done nothing to warn/migrate installations using the old `certbot-route53:auth` plugin name, and installations like that still exist according to https://gist.github.com/bmw/aceb69020dceee50ba827ec17b22e08a. We could try to warn/migrate these users in a future release or decide it's niche enough that we'll just let it break, but I think it's easy enough to keep the simple shim around.

This PR just moves the code raising a deprecation warning into `_internal` as part of cleaning up all deprecation warnings I found in https://github.com/certbot/certbot/issues/9988. I manually tested this with a Certbot config using the `certbot-route53:auth` plugin name and renewal worked just fine.
2024-09-18 14:07:35 -07:00
ohemorange
018800c5cc specify channel in weekly mm message (#10013) 2024-09-16 12:31:52 -07:00
Brad Warren
2eb4154169 allow manually triggering GH actions (#10015) 2024-09-16 12:16:51 -07:00
Brad Warren
becc2c3fee Remove deprecated --dns-route53-propagation-seconds (#10010)
* remove dns-route53-prop-secs

* document design difference
2024-09-13 12:14:49 -07:00
ldlb
cb5382d4d5 Remove deprecated features:--manual-public-ip-logging-ok (#9991)
* Remove parameter '--manual-public-ip-logging-ok'

* Update changelog with removal of '--manual-public-ip-logging-ok' flag
2024-09-12 07:21:55 -07:00
ohemorange
6975e32998 Fix weekly mattermost notifier (#10009) 2024-09-11 11:11:47 -07:00
Brad Warren
62962357c5 add parenthesis (#10008) 2024-09-10 13:06:48 -07:00
ohemorange
343b540970 Use new mattermost action workflow (#10007) 2024-09-10 12:53:21 -07:00
ohemorange
089b7efacd Update syntax for mattermost webhooks (#10006) 2024-09-10 12:16:53 -07:00
Brad Warren
1584b0b58c add macos qol suggestions (#9995) 2024-09-09 12:34:00 -07:00
Brad Warren
141b15077c Update changelog for 3.0 and remove update_symlinks and {csr,key}_dir (#10004)
* update changelog to 3.0

we did a similar thing in https://github.com/certbot/certbot/pull/9461

* remove update_symlinks

* remove {csr,key}_dir
2024-09-09 12:31:25 -07:00
Brad Warren
ee2c4844b9 fix centos9 test (#9999) 2024-09-05 16:14:10 -07:00
Shubham Sharma
181813b9b2 add mijn.host (#10002) 2024-09-05 08:56:03 -07:00
Alexandre Detiste
43d0652b0d remove six leftovers (#9996) 2024-08-30 11:38:44 -07:00
Adrien Ferrand
80e68bec26 Update dependencies (27-08-2024) (#9993)
Update dependencies & proactively defends against major bump to Josepy 2+

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2024-08-28 07:22:22 -07:00
Brad Warren
7b2b2b1685 switch from gpg2 to gpg (#9985)
The `gnupg` package from Homebrew only installs a `gpg` binary, not a `gpg2` binary. I had previously worked around this by manually creating an alias, but I think we can do better.

GPG version 1 is ancient and [hasn't seen a release since 2006](https://gnupg.org/download/release_notes.html). Additionally, `gpg` has referred to GPG 2 in Ubuntu since at least 20.04, which is the oldest non-EOL'd version as of this writing, so I think this change is safe to make.
2024-08-19 15:24:39 -07:00
Will Greenberg
c3c587001f Update python version to 3.12 and base to core24 in snaps (#9983)
Fixes #9872, originally merged in #9956.

To upgrade to Python 3.12, as 3.8 is reaching EOL, we need to upgrade the core snap that certbot is based on. The latest version is core24, so we're going with that for longevity. We will want to notify third-party snaps to make these changes as well. They can release their snaps at a version higher than certbot's, and their users will not be upgraded until the matching (or greater) version of certbot is released. They should do this because otherwise including these changes will break their plugins.

Key documents for this migration are https://snapcraft.io/docs/migrate-core22 and https://snapcraft.io/docs/migrate-core24. The discussion at https://forum.snapcraft.io/t/upgrading-classic-snap-to-core24-using-snapcraft-8-3-causes-python-3-12-errors-at-runtime/ is also relevant to understanding some changes, which may become unnecessary in future versions of snapcraft.


* Migrate primary certbot snap to core24 and python 3.12

* Migrate plugin snaps to core24 and python 3.12

* Migrate to core24 in build_remote

* Run snap tests using python 3.12

* Unstage pyvenv.cfg and set PYTHONPATH

---------

Co-authored-by: Erica Portnoy <ebportnoy@gmail.com>
Co-authored-by: Erica Portnoy <erica@eff.org>
2024-08-08 16:24:11 -07:00
Will Greenberg
281b724996 clarify docs (#9984)
Authored-by: Brad Warren <bmw@eff.org>
2024-08-08 16:16:28 -07:00
Will Greenberg
3d5714f499 dns_server: update BIND9 docker image (#9973)
The 9.16 image isn't published anymore
2024-07-30 22:13:48 +00:00
Will Greenberg
ba9f1939ab Merge pull request #9963 from certbot/test-no-centos7
remove centos7 test
2024-07-03 11:14:07 -07:00
Brad Warren
481c8c0600 remove centos7 test 2024-07-03 09:48:55 -07:00
OmniTroid
35b177a1a0 seperate->separate (#9954) 2024-06-21 06:35:42 -07:00
Will Greenberg
95976762ac certbot-compatibility-test: fix breaking tests (#9955)
Recently our test environments were upgraded to use Docker 26, which
enabled ipv6 loopback by default in containers. This caused tests to
start failing due to an nginx test config which was the sole listener
for ipv6.

This simply removes that ipv6 listen directive from the config and from the
archived version we use for testing.
2024-06-20 11:37:28 -07:00
Will Greenberg
bf64e7f4e4 Merge pull request #9953 from certbot/candidate-2.11.0
Candidate 2.11.0
2024-06-05 20:13:22 -07:00
Will Greenberg
9213154e44 Bump version to 2.12.0 2024-06-05 14:34:41 -07:00
Will Greenberg
810d50eb3d Add contents to certbot/CHANGELOG.md for next version 2024-06-05 14:34:41 -07:00
Will Greenberg
99a4129cd4 Remove built packages from git 2024-06-05 14:34:41 -07:00
Will Greenberg
8db8fcf26c Release 2.11.0 2024-06-05 14:34:40 -07:00
Will Greenberg
6d8fec7760 Update changelog for 2.11.0 release 2024-06-05 14:34:02 -07:00
Will Greenberg
4f3af45f5c Merge pull request #9952 from certbot/test-snap-config-nits
suggest snap_config nits
2024-06-05 10:33:26 -07:00
Brad Warren
8ebd8ea9fb suggest snap_config nits 2024-06-04 14:32:34 -07:00
Brad Warren
83d8fbbd75 Merge pull request #9950 from certbot/test-update-deps
update dependencies
2024-06-04 12:58:38 -07:00
Will Greenberg
0c49ab462f snap_config: oops kwargs are important i guess 2024-06-04 10:37:28 -07:00
Will Greenberg
35091d878f snap_config: switch to newer HttpAdapter interface 2024-06-03 18:13:31 -07:00
Brad Warren
c31f53a225 run tools/pinning/current/repin.sh 2024-05-31 10:10:46 -07:00
Brad Warren
d2a13c55f2 pin back mypy (#9939)
While working on https://github.com/certbot/certbot/issues/9938, I updated our dependencies, which updated mypy and introduced new errors that mypy wanted me to fix. I think this makes the regularly necessary process of updating our dependencies too tedious; we should instead pin the linters that do this to a specific version and update them manually as desired. We already do this with pylint in the lines above my changes in this PR for the same reason.
2024-05-30 11:21:32 -07:00
Will Greenberg
de1ce7340f Merge pull request #9937 from ionos-cloud/docs_add_ionos_certbot_plugin
add IONOS Cloud DNS plugin to the documentation
2024-05-23 10:37:17 -07:00
Will Greenberg
929f9e944f Merge pull request #9944 from lukhnos/maintain-checklist-order
Ensure _scrub_checklist_input honors indices order (#9943)
2024-05-22 14:55:40 -07:00
Lukhnos Liu
6c422774d5 Ensure _scrub_checklist_input honors indices order (#9943)
This fixes a bug where, when a user requests a cert interactively, the CSR's
SANs are not listed in the order that the user has in mind. This is because,
during input validation, the _scrub_checklist_input method does not produce
the list of tags (which represent the domain names the user has requested a
cert for) in the order of the given indices. As a result, the CN of the
resulting cert, as well as the directory name used to store the certs, may
not always be what the user expected, namely the first item chosen from the
interactive prompt.
2024-05-22 15:50:02 -04:00
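
A minimal sketch of the ordering fix described in the commit message above, assuming a simplified signature (certbot's private `_scrub_checklist_input` method takes different arguments): the returned tags follow the order of the user's selected indices, so the first selection becomes the CN and the lineage directory name.

```
from typing import List, Sequence

def scrub_checklist_input(indices: Sequence[int], tags: Sequence[str]) -> List[str]:
    # Preserve the order the user chose the items in, not the checklist's
    # own order; invalid indices are silently dropped in this sketch.
    return [tags[i] for i in indices if 0 <= i < len(tags)]

# The first selected domain stays first:
print(scrub_checklist_input([2, 0], ["a.example.org", "b.example.org", "c.example.org"]))
# -> ['c.example.org', 'a.example.org']
```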
Brad Warren
443ec2200f pin back cloudflare (#9940)
* pin back cloudflare

* update readme
2024-05-16 09:18:21 -07:00
zak905
38cbeb560c add IONOS Cloud DNS plugin to the documentation 2024-05-07 12:08:39 +02:00
Will Greenberg
873f979a25 Replace boulder tests with pebble (#9918)
Pebble 2.5.1 supports OCSP stapling, so we can finally replace all boulder tests/harnesses with the much simpler pebble setup.

Closes #9898

* Remove unused `--acme-server` argument

Since this argument is never set and always defaults to 'pebble', just
remove it to simplify assumptions about which test server's being used.

* Remove boulder option from integration tests

Now that pebble supports all of our test cases, we can move off of
the much more complicated boulder test harness.

* pebble_artifacts: bump to latest pebble release

* pebble_artifacts: fix download path

* certbot-ci: unzip pebble assets

* CI: rip out windows tests/jobs

* tox.ini: rm outdated Windows comment

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* ci: rm redundant integration test

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* acme_server: raise error if proxy and http-01 port are both set

* acme_server: rm vestigial preterimate commands stuff

---------

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2024-05-02 12:24:00 -07:00
Will Greenberg
2a41402f2a Merge pull request #9919 from certbot/unpin-poetry-tox
Unpin poetry and use tox >= v4
2024-04-10 11:54:31 -07:00
Brad Warren
6ecf3782ac document the github-releases credential (#9925) 2024-04-04 07:36:44 -07:00
Brad Warren
d1347fce9a Merge pull request #9927 from certbot/candidate-2.10.0
Candidate 2.10.0
2024-04-03 16:43:00 -07:00
Will Greenberg
9412ce9f05 Bump version to 2.11.0 2024-04-02 14:20:25 -07:00
Will Greenberg
fabe7bbc78 Add contents to certbot/CHANGELOG.md for next version 2024-04-02 14:20:25 -07:00
Will Greenberg
1e34fb8b51 Remove built packages from git 2024-04-02 14:20:25 -07:00
Will Greenberg
4d7d0d6d04 Release 2.10.0 2024-04-02 14:20:24 -07:00
Will Greenberg
cf77b3c3fa Update changelog for 2.10.0 release 2024-04-02 14:20:00 -07:00
Will Greenberg
a7674bd45a Merge pull request #9926 from certbot/docker-compose-v2
Switch to using docker compose v2
2024-04-02 14:16:45 -07:00
Will Greenberg
cdeac7a745 Remove CHANGELOG entry, update contributing docs 2024-04-02 13:47:56 -07:00
Will Greenberg
50b2097d38 conftest: use docker compose ls to test 2024-04-02 13:46:38 -07:00
Will Greenberg
30e7f23360 Switch to using docker compose v2
Azure recently dropped the `docker-compose` standalone executable (aka
docker-compose v1), and since it's not receiving updates anymore, let's
get with the times and update to v2 as well.
2024-04-02 12:36:29 -07:00
Brad Warren
248455a92b add back package signing (#9913)
* add packages to git commit

* rename deploy stage

* rename deploy jobs

* set up github releases

* remove v

* tweak release script

* remove publishing windows installer

* update changelog
2024-04-01 10:59:55 -07:00
Erica Portnoy
cca30ace31 actually completely unpin poetry 2024-03-29 12:03:04 -07:00
Erica Portnoy
90348bde4e allowlist the apache conf test farm test 2024-03-29 11:24:51 -07:00
Erica Portnoy
54dd12cd57 update azure environment passing to new required but undocumented format 2024-03-28 18:05:47 -07:00
Erica Portnoy
4e6934a4b6 new tox requires local scripts to be explicitly allowlisted 2024-03-28 17:36:02 -07:00
Erica Portnoy
57bb4e40b7 remove accidentally checked in file 2024-03-28 15:52:11 -07:00
Erica Portnoy
7f885292f9 why does this fix it????? 2024-03-28 15:49:20 -07:00
Erica Portnoy
8978e4dbff remove useless editable-legacy flag 2024-03-28 15:42:41 -07:00
Erica Portnoy
920b717c45 update poetry version using urllib3 workaround 2024-03-28 15:34:24 -07:00
Erica Portnoy
54b7b1883e use legacy editable mode for oldest tests 2024-03-27 16:27:42 -07:00
Erica Portnoy
87ab76fc7d allowlist apache conf test external 2024-03-27 15:44:28 -07:00
Erica Portnoy
4925f71933 work around undocumented lack of ability to reference multi-named envs 2024-03-27 15:11:21 -07:00
Erica Portnoy
39fda1d44d make minor changes to support tox v4 based on https://tox.wiki/en/latest/upgrading.html 2024-03-27 14:16:40 -07:00
Erica Portnoy
c8a1e30981 change tox pin to >= 4 and rerun pinning script 2024-03-27 14:05:59 -07:00
Brad Warren
7abf143394 update centos9stream ami (#9914) 2024-03-20 13:18:36 -07:00
ohemorange
f4e031f505 Add troubleshooting instructions to the finish_release script for snapcraft credential expiry. (#9896) 2024-02-08 21:31:36 +00:00
Brad Warren
2844fdd74a Merge pull request #9895 from certbot/candidate-2.9.0
Candidate 2.9.0
2024-02-08 13:05:29 -08:00
Erica Portnoy
3b183961a9 Bump version to 2.10.0 2024-02-08 11:46:08 -08:00
Erica Portnoy
76411ecca7 Add contents to certbot/CHANGELOG.md for next version 2024-02-08 11:46:08 -08:00
Erica Portnoy
725c64d581 Release 2.9.0 2024-02-08 11:46:07 -08:00
Erica Portnoy
99ae4ac5ef Update changelog for 2.9.0 release 2024-02-08 11:45:17 -08:00
Brad Warren
b8b759f1d2 update dependencies (#9893)
Fixes https://github.com/certbot/certbot/issues/9892 and https://github.com/certbot/certbot/security/dependabot

Upgrading the base docker image has been done in previous PRs like https://github.com/certbot/certbot/pull/9415. Doing this was needed because the [newer versions of `cryptography` need a newer version of rust](https://dev.azure.com/certbot/certbot/_build/results?buildId=7451&view=logs&j=fdd3565a-f3c6-5154-eca9-9ae03666f7bd&t=5dbd9851-46a4-524f-73a8-4028241afcde&l=475).

I ran the full test suite on this branch which you can see in the GitHub status checks below. The boulder tests should fail as they're to be fixed by https://github.com/certbot/certbot/pull/9889 but everything else should pass.
2024-02-07 17:55:30 -08:00
Brad Warren
8b5a017b05 use our own boulder rate limit file (#9889)
* use our own rate limit file

* clarify path
2024-02-07 17:33:07 -08:00
ohemorange
b7ef536ec3 Use the legacy snapcraft build until #9890 is fixed (#9891) 2024-02-07 16:29:08 -08:00
Simon Stier
282df74ee9 add 3rd party certbot-dns-stackit to the docs (#9885) 2024-02-02 08:38:55 -08:00
Alexis
0a565815f9 Docs: Reset requirements.txt path (#9877)
* Reset requirements.txt path

* Add requirements.txt path

* Test config path

* Change docs path

* Amend paths for successful builds

* Place copyright for epub

- Will amend copyright parameter at a later date
2024-02-01 08:27:45 -08:00
ohemorange
d33bbf35c2 Make reconfigure use staging server (#9870)
* Make reconfigure use staging server

* lint and imports

* Unset the account if it's been set in preparation for a dry run

* Add unit tests for checking we switch to staging and don't accidentally modify anything else

* add docstring

* Add test to make sure a requested new account id is saved

* update changelog

* set noninteractive mode for dry run

* error when account or server is set by the user

* switch to checking for changed values in account and server

* recommend using renew instead of certonly for forbidden fields

* change link to renew-reconfiguration
2024-01-26 12:09:20 -08:00
Brad Warren
714a0b348d offer poetry verbosity (#9881) 2024-01-24 16:15:26 -08:00
Alexis
7ca1b8f286 Merge pull request #9876 from certbot/zoraconpatch-yaml-error
Fix YAML Errors in "Formats" section
2024-01-18 10:45:09 -08:00
zoracon
be40e377d9 Move YAML file back and amend paths 2024-01-17 14:51:37 -08:00
zoracon
01cf4bae75 Amend YAML error on reeadthedocs yaml files 2024-01-17 14:46:12 -08:00
Will Greenberg
ef949f9149 Merge pull request #9858 from certbot/zoracon-patch-readthedocs-test
Move .readthedocs.yaml
2024-01-16 14:03:25 -08:00
ohemorange
926d0c7e0f Fix mypy joinpath errors (#9871)
* Fix mypy joinpath errors

* update changelog
2024-01-05 16:35:37 -08:00
Brad Warren
9d8eb6ccfd Add Python 3.12 support (#9852)
* add py312 support

* sed -i "s/\( *'Pro.*3\.1\)1\(',\)/\11\2\n\12\2/" */setup.py

* update pytest.ini comment

* upgrade macos version

* fixup changelog
2023-12-13 10:02:38 -08:00
Alexis
585f70e700 Create .readthedocs.yaml
Test moving config file in attempt to solve build errors
2023-12-07 18:52:05 -08:00
Alexis
21e24264f4 Bump Hardcoded RSA Default in API (#9855)
Rectifies: https://github.com/certbot/certbot/security/advisories/GHSA-pcq2-mjvr-m4jj
2023-12-06 13:00:55 -08:00
Brad Warren
cf78ad3a3d Merge pull request #9853 from certbot/candidate-2.8.0
Candidate 2.8.0
2023-12-05 16:48:55 -08:00
Will Greenberg
dccb92d57f Bump version to 2.9.0 2023-12-05 11:14:39 -08:00
Will Greenberg
f9d31faadc Add contents to certbot/CHANGELOG.md for next version 2023-12-05 11:14:39 -08:00
Will Greenberg
e9225d1cc2 Release 2.8.0 2023-12-05 11:14:38 -08:00
Will Greenberg
3dd1f0eea9 Update changelog for 2.8.0 release 2023-12-05 11:13:52 -08:00
Brad Warren
917e3aba6b add pkg_resources changelog (#9851) 2023-12-05 10:33:49 -08:00
Brad Warren
3833255980 update dependencies (#9848) 2023-12-05 10:33:31 -08:00
Francesco Colista
619654f317 Add support for Alpine Linux (#9834)
Signed-off-by: Francesco Colista <fcolista@alpinelinux.org>
2023-11-22 13:53:31 +01:00
Brad Warren
76f9a33e45 Upgrade the pinned version of pylint (#9839)
* upgrade pylint

* fix upgraded pylint

* downgrade pyopenssl

* remove unneeded ignores

* stop using text

* update sphinx-rtd-theme
2023-11-15 09:52:37 +01:00
Adrien Ferrand
5f67bb99a8 Full cleanup of pkg_resources (#9797)
Fixes #9606

This PR removes some elements that were related to the pkg_resources dependency and its deprecation.
2023-11-13 15:50:32 -08:00
Will Greenberg
d8392bf394 Merge pull request #9832 from certbot/candidate-2.7.4
Update master from 2.7.4 release
2023-11-01 11:36:29 -07:00
Brad Warren
6a89fcbc56 Merge branch 'master' into candidate-2.7.4 2023-11-01 07:50:54 -07:00
Brad Warren
2adaacab82 Bump version to 2.8.0 2023-11-01 06:24:20 -07:00
Brad Warren
2ae810c45a Add contents to certbot/CHANGELOG.md for next version 2023-11-01 06:24:19 -07:00
Brad Warren
b62133e3e1 Release 2.7.4 2023-11-01 06:24:18 -07:00
Brad Warren
a92bb44ff9 Update changelog for 2.7.4 release 2023-11-01 06:23:12 -07:00
Brad Warren
9650c25968 Fix change detection on mutable values (#9829) (#9830)
* handle mutable values

* add unit test

* add changelog entry

* fix typo

(cherry picked from commit c3c29afdca)
2023-11-01 00:10:11 +00:00
Brad Warren
c3c29afdca Fix change detection on mutable values (#9829)
* handle mutable values

* add unit test

* add changelog entry

* fix typo
2023-10-31 16:28:16 -07:00
Brad Warren
dca4ddd3d8 Prep for 2.7.4 (#9823)
* Set the delegated field in Lexicon config to bypass subdomain resolution (#9821)

The Lexicon-based DNS plugins use a mechanism to determine which segment of the input domain is actually the DNS zone in which the DNS-01 challenge has to be initiated (e.g. `subdomain.domain.com` or `domain.com` for the input `subdomain.domain.com`): they recursively try to configure Lexicon and initiate authentication from the most specific to the most generic domain segment, and select the first segment where Lexicon stops erroring out.

This mechanism broke with #9746 because the plugins now call the Lexicon client instead of the underlying providers, and the client makes its own guess about the actual domain requested. Typically, for `subdomain.domain.com` it will try to authenticate against `domain.com`, so the mechanism above does not work anymore.

This PR fixes the issue by using the `delegated` field in the Lexicon config each time the plugin needs it. This field is designed for exactly this kind of purpose: it instructs Lexicon which domain is the actual DNS zone instead of letting it guess.

I tested the change with one of my OVH accounts. The expected behavior is re-established and the plugin is able to test `subdomain.domain.com` and then `domain.com` as before.

Fixes #9791
Fixes #9818

(cherry picked from commit cf4f07d17e)

* add changelog entry for 9821 (#9822)

(cherry picked from commit 7bb85f8440)

---------

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2023-10-30 10:34:30 -07:00
Brad Warren
7bb85f8440 add changelog entry for 9821 (#9822) 2023-10-28 00:04:11 +02:00
Adrien Ferrand
cf4f07d17e Set the delegated field in Lexicon config to bypass subdomain resolution (#9821)
The Lexicon-based DNS plugins use a mechanism to determine which segment of the input domain is actually the DNS zone in which the DNS-01 challenge has to be initiated (e.g. `subdomain.domain.com` or `domain.com` for the input `subdomain.domain.com`): they recursively try to configure Lexicon and initiate authentication from the most specific to the most generic domain segment, and select the first segment where Lexicon stops erroring out.

This mechanism broke with #9746 because the plugins now call the Lexicon client instead of the underlying providers, and the client makes its own guess about the actual domain requested. Typically, for `subdomain.domain.com` it will try to authenticate against `domain.com`, so the mechanism above does not work anymore.

This PR fixes the issue by using the `delegated` field in the Lexicon config each time the plugin needs it. This field is designed for exactly this kind of purpose: it instructs Lexicon which domain is the actual DNS zone instead of letting it guess.

I tested the change with one of my OVH accounts. The expected behavior is re-established and the plugin is able to test `subdomain.domain.com` and then `domain.com` as before.

Fixes #9791
Fixes #9818
2023-10-27 10:04:40 -07:00
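
A hedged sketch of the zone-guessing loop described above; `authenticate` is a hypothetical callable standing in for a Lexicon client whose config has the `delegated` field set to the candidate zone, so Lexicon no longer guesses the zone itself.

```
from typing import Callable

def guess_zone(domain: str, authenticate: Callable[[str, str], None]) -> str:
    # Try candidate zones from most specific to most generic, e.g.
    # "subdomain.domain.com" then "domain.com", and return the first one
    # the provider accepts (i.e. where Lexicon stops erroring out).
    labels = domain.split(".")
    for i in range(len(labels) - 1):
        candidate = ".".join(labels[i:])
        try:
            authenticate(candidate, domain)  # client configured with delegated=candidate
            return candidate
        except Exception:
            continue
    raise RuntimeError(f"Unable to determine the DNS zone for {domain}")
```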
Will Greenberg
36c78b3717 Merge pull request #9819 from certbot/candidate-2.7.3
Update master from 2.7.3 release
2023-10-26 14:01:31 -07:00
Will Greenberg
bf5475fa74 Merge pull request #9820 from certbot/2.7.3-update
Update 2.7.x from 2.7.3 release
2023-10-26 14:00:37 -07:00
Brad Warren
9bfc9dda5c Merge branch 'master' into candidate-2.7.3 2023-10-25 08:27:20 -07:00
Brad Warren
e904bd4e29 Bump version to 2.8.0 2023-10-24 13:43:22 -07:00
Brad Warren
d140a7df52 Add contents to certbot/CHANGELOG.md for next version 2023-10-24 13:43:22 -07:00
Brad Warren
bd550c09c2 Release 2.7.3 2023-10-24 13:43:20 -07:00
Brad Warren
01405a8fa6 Update changelog for 2.7.3 release 2023-10-24 13:42:05 -07:00
Brad Warren
5bf833fe28 2.7.3 prep (#9817)
* Update changelog for 2.7.2 release

* Release 2.7.2

* helpful: Add an edge case for arguments w/ contained spaces (#9813)

Fixes #9811

(cherry picked from commit 3ae9d7e03a)

* fixes #9805 (#9816)

(cherry picked from commit d1577280ad)

---------

Co-authored-by: Will Greenberg <willg@eff.org>
2023-10-24 12:49:04 -07:00
Brad Warren
d1577280ad fixes #9805 (#9816) 2023-10-24 12:27:19 -07:00
Will Greenberg
3ae9d7e03a helpful: Add an edge case for arguments w/ contained spaces (#9813)
Fixes #9811
2023-10-24 08:26:00 -07:00
Will Greenberg
5594ac20e0 Merge pull request #9809 from certbot/candidate-2.7.2
Candidate 2.7.2
2023-10-19 17:49:02 -07:00
Brad Warren
7f6000f1d4 Merge branch 'master' into candidate-2.7.2 2023-10-19 17:35:05 -07:00
Will Greenberg
1863c66179 Bump version to 2.8.0 2023-10-19 15:34:19 -07:00
Will Greenberg
185c20c71b Add contents to certbot/CHANGELOG.md for next version 2023-10-19 15:34:19 -07:00
Will Greenberg
a1b773cbdc Release 2.7.2 2023-10-19 15:34:18 -07:00
Will Greenberg
937eaef621 Update changelog for 2.7.2 release 2023-10-19 15:33:34 -07:00
Brad Warren
e40741955f Prep for 2.7.2 (#9808)
* helpful: fix handling of abbreviated ConfigArgparse arguments (#9796)

* helpful: fix handling of abbreviated ConfigArgparse arguments

ConfigArgparse allows for "abbreviated" arguments, i.e. just the prefix
of an argument, but it doesn't set the argument sources in these cases.
This commit checks for those cases and sets the sources appropriately.

* failing to find an action raises an error instead of logging

* Update changelog

* Add handling for short arguments, fix equals sign handling

These were silently being dropped before, possibly leading to instances
of `NamespaceConfig.set_by_user()` returning false negatives.

(cherry picked from commit 11e17ef77b)

* Fix finish_release.py (#9800)

* response is value

* rename vars

(cherry picked from commit a96fb4b6ce)

* Merge pull request #9762 from certbot/docs/yaml-config

Add YAML files for Readthedocs requirements

(cherry picked from commit 44046c70c3)

* Update Lexicon requirements to stabilize certbot-dns-ovh behavior (#9802)

* Update minimum Lexicon version required for certbot-dns-ovh

* Add types

* FIx mypy

* Fix lint

* Fix BOTH lint and mypy

(cherry picked from commit 5cf5f36f19)

* simplify code (#9807)

(cherry picked from commit 6f7b5ab1cd)

* Include linting fixes from 8a95c03

---------

Co-authored-by: Will Greenberg <willg@eff.org>
Co-authored-by: Alexis <alexis@eff.org>
Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2023-10-19 11:27:21 -07:00
Brad Warren
6f7b5ab1cd simplify code (#9807) 2023-10-18 14:32:07 -07:00
Adrien Ferrand
5cf5f36f19 Update Lexicon requirements to stabilize certbot-dns-ovh behavior (#9802)
* Update minimum Lexicon version required for certbot-dns-ovh

* Add types

* FIx mypy

* Fix lint

* Fix BOTH lint and mypy
2023-10-18 13:19:26 -07:00
Brad Warren
a96fb4b6ce Fix finish_release.py (#9800)
* response is value

* rename vars
2023-10-16 17:54:24 -07:00
Will Greenberg
11e17ef77b helpful: fix handling of abbreviated ConfigArgparse arguments (#9796)
* helpful: fix handling of abbreviated ConfigArgparse arguments

ConfigArgparse allows for "abbreviated" arguments, i.e. just the prefix
of an argument, but it doesn't set the argument sources in these cases.
This commit checks for those cases and sets the sources appropriately.

* failing to find an action raises an error instead of logging

* Update changelog

* Add handling for short arguments, fix equals sign handling

These were silently being dropped before, possibly leading to instances
of `NamespaceConfig.set_by_user()` returning false negatives.
2023-10-13 12:02:01 -07:00
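
A small sketch of the abbreviation behavior the commit message refers to, using plain argparse (which ConfigArgParse builds on); `--preferred-challenges` is just an example flag here. An unambiguous prefix is accepted, but ConfigArgParse does not record the argument's source in that case, which is what the fix above compensates for.

```
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("--preferred-challenges")

# An unambiguous prefix of the flag is accepted (allow_abbrev defaults to True):
args = parser.parse_args(["--preferred-chal", "dns"])
print(args.preferred_challenges)  # -> dns
```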
Adrien Ferrand
8a95c030e6 Drop Python 3.7 support (#9792)
* Drop Python 3.7 support

* Fix lint and test

* Check for venv generation

* Update requirements

* Update oldest constaints and compatibility tests runtime
2023-10-13 06:57:42 -07:00
Brad Warren
d9d825ac50 Merge pull request #9794 from certbot/candidate-2.7.1
Update master from 2.7.1 release
2023-10-11 16:37:57 -07:00
Brad Warren
07b1b0d8b2 Merge pull request #9795 from certbot/candidate-2.7.1-2.7.x
Update 2.7.x from 2.7.1 release
2023-10-11 16:37:48 -07:00
Brad Warren
beec975379 Merge branch 'master' into candidate-2.7.1 2023-10-10 08:50:31 -07:00
Mattias Ellert
01d129dfca Adapt to Python 3.12.0rc2 (#9764)
The warning message changed from "datetime.utcfromtimestamp() is deprecated"
to "datetime.datetime.utcfromtimestamp() is deprecated"
2023-10-10 16:02:24 +02:00
Brad Warren
8bf21cad25 Bump version to 2.8.0 2023-10-10 06:40:53 -07:00
Brad Warren
dcac5ed8f0 Add contents to certbot/CHANGELOG.md for next version 2023-10-10 06:40:53 -07:00
Brad Warren
228e3f2a8d Release 2.7.1 2023-10-10 06:40:52 -07:00
Brad Warren
6624e0b65c Update changelog for 2.7.1 release 2023-10-10 06:39:19 -07:00
Brad Warren
21113d17c7 Prep for 2.7.1 (#9790)
* Bump setup.py's ConfigArgParse version (#9784)

I neglected to do this during #9678, so it looks like some pip installs
are failing to get the minimum required version.

(cherry picked from commit 02efc8c5ca)

* Fix dnsimple typo (#9787)

Fixes https://github.com/certbot/certbot/issues/9786.

(cherry picked from commit 4e60a0d03a)

* update pinned dependencies (#9788)

This fixes the security alerts those with access can see at https://github.com/certbot/certbot/security/dependabot.

(cherry picked from commit 5849ff73fb)

* update changelog for configargparse (#9789)

I'd like to do a bug fix release for https://github.com/certbot/certbot/issues/9786. If we're doing one, I figure we may as well flag this change from https://github.com/certbot/certbot/pull/9784 too.

(cherry picked from commit 61773be971)

---------

Co-authored-by: Will Greenberg <willg@eff.org>
2023-10-06 18:59:26 +00:00
Brad Warren
61773be971 update changelog for configargparse (#9789)
I'd like to do a bug fix release for https://github.com/certbot/certbot/issues/9786. If we're doing one, I figure we may as well flag this change from https://github.com/certbot/certbot/pull/9784 too.
2023-10-06 11:39:19 -07:00
Brad Warren
5849ff73fb update pinned dependencies (#9788)
This fixes the security alerts those with access can see at https://github.com/certbot/certbot/security/dependabot.
2023-10-06 11:39:08 -07:00
Brad Warren
4e60a0d03a Fix dnsimple typo (#9787)
Fixes https://github.com/certbot/certbot/issues/9786.
2023-10-05 13:15:30 -07:00
Alexis
44046c70c3 Merge pull request #9762 from certbot/docs/yaml-config
Add YAML files for Readthedocs requirements
2023-10-05 09:24:02 -07:00
Will Greenberg
02efc8c5ca Bump setup.py's ConfigArgParse version (#9784)
I neglected to do this during #9678, so it looks like some pip installs
are failing to get the minimum required version.
2023-10-04 16:22:13 -07:00
Brad Warren
0862e05754 Merge pull request #9780 from certbot/candidate-2.7.0
Candidate 2.7.0
2023-10-03 12:46:06 -07:00
Will Greenberg
08d1979bcb Bump version to 2.8.0 2023-10-03 11:22:04 -07:00
Will Greenberg
6c66764f25 Add contents to certbot/CHANGELOG.md for next version 2023-10-03 11:22:04 -07:00
Will Greenberg
c4642c2dfe Release 2.7.0 2023-10-03 11:22:02 -07:00
Will Greenberg
bcb7f371e3 Update changelog for 2.7.0 release 2023-10-03 11:21:15 -07:00
Adrien Ferrand
732a3ac962 Refactor Lexicon-based DNS plugins (#9746)
* Refactor Lexicon-based DNS plugins and upgrade minimal version of Lexicon

* Relax filterwarning to comply with envs where boto3 is not installed

* Update pinned dependencies

* Use our previous method to deprecate part of modules

* Safe import internally

* Add changelog

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2023-09-25 15:15:04 -07:00
Alexis
694c758db7 Swap out with updated AMI image IDs (#9770)
- Add comments for other OS
2023-09-20 13:03:53 -07:00
zoracon
f5cb0a156b Remove duplicate file
- was in the incorrect directory
2023-09-20 12:58:36 -07:00
zoracon
4178e8ffc4 Merge branch 'master' of https://github.com/certbot/certbot into docs/yaml-config 2023-09-20 12:55:59 -07:00
zoracon
a3353b5c42 Revert "Swap out with updated AMI image IDs"
This reverts commit 24c8825d22.
2023-09-20 12:55:48 -07:00
zoracon
24c8825d22 Swap out with updated AMI image IDs
- Add comments for other OS
2023-09-20 12:46:33 -07:00
Adrien Ferrand
23f9dfc655 Migrate pkg_resources usages to importlib.metadata (#9749)
* Migrate entrypoint logic from pkg_resources to importlib.metadata

* Usage of importlib_metadata up to Python 3.9 to align API behavior to Python 3.10

---------

Co-authored-by: Adrien Ferrand <adrien.ferrand@amadeus.com>
Co-authored-by: Adrien Ferrand <adrien.ferrand@arteris.com>
2023-09-12 08:18:57 -07:00
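
A hedged sketch of the entry point lookup after this migration; `certbot.plugins` is certbot's plugin entry point group, and the version check mirrors the commit's note about using the importlib_metadata backport below Python 3.10 to align API behavior.

```
import sys

if sys.version_info >= (3, 10):
    from importlib.metadata import entry_points
else:
    # Backport aligning entry_points(group=...) with the Python 3.10 API.
    from importlib_metadata import entry_points

# Old style: pkg_resources.iter_entry_points("certbot.plugins")
for entry_point in entry_points(group="certbot.plugins"):
    plugin_class = entry_point.load()
    print(entry_point.name, plugin_class)
```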
Adrien Ferrand
cc359dab46 Migrate pkg_resources usages to importlib.resources (#9748)
* Migrate pkg_resources API related to resources to importlib_resources

* Fix lint and mypy + pin lexicon

* Update filterwarnings

* Update oldest tests requirements

* Update pinned dependencies

* Fix for modern versions of python

* Fix assets load in nginx integration tests

* Fix a warning

* Isolate static generation from importlib.resource into a private function

---------

Co-authored-by: Adrien Ferrand <adrien.ferrand@amadeus.com>
2023-09-07 11:38:44 -07:00
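
A hedged sketch of the resource-loading side of the migration; the package and asset names below are placeholders rather than certbot's actual files, and the backport import mirrors the approach taken for importlib.metadata above.

```
import sys

if sys.version_info >= (3, 9):
    from importlib import resources
else:
    import importlib_resources as resources  # backport with the same files() API

# Old style: pkg_resources.resource_string("certbot_nginx", "options-ssl-nginx.conf")
asset = resources.files("certbot_nginx").joinpath("options-ssl-nginx.conf")  # placeholder names
print(asset.read_text())
```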
zoracon
89902e26bf Add YAML files for Readthedocs requirements 2023-08-31 16:06:47 -07:00
Paulo Dias
b1978ff188 dns-google: fix condition to don't use private dns zones (#9744)
* dns-google: fix condition to don't use private dns zones

* update MD

* Fix condition

* fix condition

* update testdata

* fix identation

* update tests

* update changelog

* Update dns_google.py

* add test for split horizon dns google

* add dnsName to managed zones
2023-08-27 01:19:38 +02:00
Brad Warren
579b39dce1 Fix docs (#9755)
* update quickstart and remove os import

* simplify theme use

* list sphinx_rtd_theme as extension

Our docs builds failed last night, presumably because #9754 updated `sphinx_rtd_theme`, which changed something we haven't identified.

Looking into it, our usage of this project was very unconventional. The code comment I deleted in this PR pointed to https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally, which gives simple instructions to put the following in your `conf.py` file:
```
extensions = [
    ...
    'sphinx_rtd_theme',
]

html_theme = "sphinx_rtd_theme"
```
I did this instead of the more complicated logic we were using and all builds passed locally. I also triggered a build on readthedocs with these changes which also passed.
2023-08-25 12:22:14 -07:00
Brad Warren
9b4b99f3e8 Update dependencies (#9754)
This takes care of the dependabot alerts those with access can see at https://github.com/certbot/certbot/security/dependabot.

Pinning back `cython` is needed because without it, our full test suite will fail when trying to build `pyyaml` on ARM systems.
2023-08-24 17:05:54 -07:00
Alexis
3e84b94308 Merge pull request #9739 from certbot/CI/workflow-patch-forks
Skip Mattermost Job for Forked Repos
2023-07-24 13:12:28 -07:00
Alexis
2cb2cb0575 Update merged.yaml 2023-07-24 12:11:40 -07:00
Mattias Ellert
ddd4b31b1c Mock in Python 3.12 finds more errors in mock syntax. (#9734) 2023-07-21 16:44:48 -07:00
Will Greenberg
68d812e6dd Add pytz as a dependency for integration tests (#9737) 2023-07-19 13:10:35 -07:00
Mattias Ellert
6effedc2f4 Do not call deprecated datetime.utcnow() and datetime.utcfromtimestamp() (#9735)
* Do not call deprecated datetime.utcnow() and datetime.utcfromtimestamp()

* Ignore DeprecationWarnings from importing dependencies

$ python3 -Wdefault
Python 3.12.0b4 (main, Jul 12 2023, 00:00:00) [GCC 13.1.1 20230614 (Red Hat 13.1.1-4)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pkg_resources
/usr/lib/python3.12/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API
  warnings.warn("pkg_resources is deprecated as an API", DeprecationWarning)
>>> import pytz
/usr/lib/python3.12/site-packages/pytz/tzinfo.py:27: DeprecationWarning: datetime.utcfromtimestamp() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.fromtimestamp(timestamp, datetime.UTC).
  _epoch = datetime.utcfromtimestamp(0)

* Used pytz.UTC consistently for clarity
2023-07-18 15:44:25 -07:00
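
For reference, a short sketch of the timezone-aware replacements that the deprecation warning quoted above suggests:

```
from datetime import datetime, timezone

# Deprecated (naive, implicitly UTC):
#   datetime.utcnow()
#   datetime.utcfromtimestamp(0)
# Timezone-aware equivalents:
now = datetime.now(timezone.utc)
epoch = datetime.fromtimestamp(0, timezone.utc)
print(now.isoformat(), epoch.isoformat())
```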
Zachary Ware
c31d3a2cfd Update PyPI links (#9733)
Switch from the legacy pypi.python.org/pypi/ to the canonical pypi.org/project/
2023-07-15 15:58:00 -07:00
Brad Warren
e6572e695b deprecate python 3.7 support (#9730) 2023-07-10 08:06:04 +10:00
Brad Warren
a7674548ab Fix snap builds (#9729)
* release script change

* fix setup.py

* match setup.py logic
2023-07-07 13:14:05 +10:00
Michael Cassaniti
436b7fbe28 post renewal hook: Add RENEWED_DOMAINS and FAILED_DOMAINS as environment variables (#9724)
* renewal hook: Add RENEWED_DOMAINS and FAILED_DOMAINS as environment variables

* renewal hook: Updated documentation

* renewal hook: Updated CHANGELOG

* renew post hook: Add limit on variable sizes
2023-07-06 06:56:31 -07:00
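
A minimal sketch of a post-renewal hook consuming the new variables; the variable names come from the commit message, while the space-separated format and everything else here is an assumption for illustration.

```
#!/usr/bin/env python3
# Hypothetical --post-hook script; RENEWED_DOMAINS and FAILED_DOMAINS are the
# environment variables added by this change (format assumed space-separated).
import os

renewed = os.environ.get("RENEWED_DOMAINS", "").split()
failed = os.environ.get("FAILED_DOMAINS", "").split()

if failed:
    print("Renewal failed for: " + ", ".join(failed))
if renewed:
    print("Reloading services for: " + ", ".join(renewed))
```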
alexzorin
d0e11c81b1 Repin dependencies to fix security alerts (#9717)
* repin current

* repin oldest

* csr must have version set to zero

* only set PIP_USE_PEP517 for macOS

* experiment with brew update git failure workaround
2023-07-05 06:40:02 -07:00
Leon G
4fc4d536c1 Add LooseVersion class with risky comparison, deprecate parse_loose_version (#9646)
* Replace parse_loose_version with LooseVersion

* Fix LooseVersion docstring

* Strengthen LooseVersion comparison

* Update changelog
2023-06-21 07:57:50 -07:00
alexzorin
b1e5efac3c disco: print the name of the plugin if it fails to load (#9719) 2023-06-16 08:26:15 -07:00
alexzorin
539d48d438 letstest: replace buster with bullseye (#9718) 2023-06-12 06:56:53 -07:00
Alex Gaynor
ae6268ea3c Remove workaround that's not relevant since py2 isn't supported (#9716) 2023-06-11 06:44:58 +10:00
Charles Hong
2d8a274eb5 Update using.rst (#9714)
Add a link to the third-party DNS authentication plugin using SOLIDserver
2023-06-08 18:40:58 +10:00
Remi Rampin
ff8afe827b Update GitHub repo location letsencrypt -> certbot (#9713)
* Update GitHub repo location letsencrypt -> certbot

* Revert changes to CHANGELOG
2023-06-08 10:27:28 +10:00
Will Greenberg
468f4749b8 Revert change to NamespaceConfig's constructor (#9709)
* Revert change to NamespaceConfig's constructor

NamespaceConfig's argument sources dict is now set with a method,
and raises a runtime error if one isn't set when set_by_user() is
called.

* Actually update CHANGELOG to reflect the set_by_user changes

* linter appeasement

* configuration: update docs, add test

This test ensures that calling `set_by_user` without an initialized
sources dict raises a RuntimeError.
2023-06-07 15:16:14 -07:00
Will Greenberg
a5d223d1e5 Replace (most) global state in cli/__init__.py (#9678)
* Rewrite helpful_test to appease the linter

* Use public interface to access argparse sources dict

* HelpfulParser builds ArgumentSources dict, stores it in NamespaceConfig

After arguments/config files/user prompted input have been parsed, we
build a mapping of Namespace options to an ArgumentSource value. These
generally come from argparse's builtin "source_to_settings" dict, but
we also add a source value representing dynamic values set at runtime.

This dict is then passed to NamespaceConfig, which can then be queried
directly or via the "set_by_user" method, which replaces the global
"set_by_cli" and "option_was_set" functions.

* Use NamespaceConfig.set_by_user instead of set_by_cli/option_was_set

This involves passing the NamespaceConfig around to more functions
than before, but removes the need for most of the global state shenanigans
needed by set_by_cli and friends.

* Set runtime config values on the NamespaceConfig object

This'll correctly mark them as being "runtime" values in the
ArgumentSources dict

* Bump oldest configargparse version

We need a version that has get_source_to_settings_dict()

* Add more cli unit tests, use ArgumentSource.DEFAULT by default

One of the tests revealed that ConfigArgParse's source dict excludes
arguments it considers unimportant/irrelevant. We now mark all arguments
as having a DEFAULT source by default, and update them otherwise.

* Mark more argument sources as RUNTIME

* Removes some redundant helpful_test.py, moves one to cli_test.py

We were already testing most of these cases in cli_test.py, only
with a more complete HelpfulArgumentParser setup. And since the hsts/no-hsts
test was manually performing the kind of argument adding that cli
already does out of the box, I figured the cli tests were a more natural
place for it.

* appease the linter

* Various fixups from review

* Add windows compatability fix

* Add test ensuring relevant_values behaves properly

* Build sources dict in a more predictable manner

The dict is now built in a defined order: first defaults, then config
files, then env vars, then command line args. This way we eliminate the
possibility of undefined behavior if configargparse puts an arg's entry
in multiple source dicts.

* remove superfluous update to sources dict

* remove duplicate constant defines, resolve circular import situation
2023-05-30 17:12:51 -07:00
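
A hedged sketch of the ArgumentSources pattern described in the two commits above; `DEFAULT` and `RUNTIME` are named in the commit messages, while the other enum members and the method signatures are simplified stand-ins for certbot's actual code.

```
from enum import Enum, auto
from typing import Dict, Optional

class ArgumentSource(Enum):
    DEFAULT = auto()       # named in the commit message
    CONFIG_FILE = auto()   # assumed member
    ENV_VAR = auto()       # assumed member
    COMMAND_LINE = auto()  # assumed member
    RUNTIME = auto()       # named in the commit message

class NamespaceConfig:
    def __init__(self) -> None:
        self._sources: Optional[Dict[str, ArgumentSource]] = None

    def set_argument_sources(self, sources: Dict[str, ArgumentSource]) -> None:
        # Filled by the parser after merging defaults, config files,
        # environment variables, and command line arguments, in that order.
        self._sources = sources

    def set_by_user(self, name: str) -> bool:
        if self._sources is None:
            raise RuntimeError("argument sources were never set")
        source = self._sources.get(name, ArgumentSource.DEFAULT)
        return source not in (ArgumentSource.DEFAULT, ArgumentSource.RUNTIME)
```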
Alexis
b5661e84e8 Update README.rst (#9693)
* Update README.rst

Updating with newer info about keys and server support and removing redundant wording

* Adjust from feedback
2023-05-23 10:58:40 +10:00
alexzorin
aa270b37a2 docs: add "Choosing dependency versions" to contributing.rst (#9681)
* docs: add "Choosing dependency versions" to contributing.rst

* change a word
2023-05-12 07:52:02 +10:00
Brad Warren
35209d921d bump stale limit (#9691) 2023-05-09 17:06:47 -07:00
Brad Warren
0ac8e10c85 Merge pull request #9692 from certbot/candidate-2.6.0
Release Certbot 2.6.0
2023-05-09 15:52:33 -07:00
Erica Portnoy
36bfddbf4e Bump version to 2.7.0 2023-05-09 12:45:29 -07:00
Erica Portnoy
721c4665e6 Add contents to certbot/CHANGELOG.md for next version 2023-05-09 12:45:29 -07:00
Erica Portnoy
013621d04e Release 2.6.0 2023-05-09 12:45:28 -07:00
Erica Portnoy
e0e2bfe13a Update changelog for 2.6.0 release 2023-05-09 12:44:36 -07:00
alexzorin
d2e2a92cdd update farm tests (#9687)
* letstest: -ubuntu18.04 +centos9stream +debian11

* letstest: username for centos 9 stream is ec2-user

This is mentioned on https://centos.org/download/aws-images/

* ensure mod_ssl is installed

in centos 9 stream, apache has to be restarted after mod_ssl is
installed, or the snakeoil certificates will not be present and
apache won't start.

this also removes installing nghttp2, as the relevant bug has long
been fixed.
2023-05-08 14:37:14 -07:00
alexzorin
6e52695faa dns-rfc2136: add test coverage for PR #9672 (#9684)
* dns-rfc2136: add test coverage for PR #9672

* fix compatibility with oldest dnspython

* rename test to be more descriptive

Co-authored-by: ohemorange <ebportnoy@gmail.com>

---------

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2023-05-08 14:34:40 -07:00
Brad Warren
5b5a2efdc9 squelch warnings (#9689) 2023-05-04 10:42:49 -07:00
✨ Q (it/its) ✨
8a0b0f63de Support unknown ACME challenge types (#9680)
This is, to my knowledge, an entirely inconsequential PR to add support for entirely novel challenge types.

Presently, in the [`challb_to_achall` function](399b932a86/certbot/certbot/_internal/auth_handler.py (L367)), an error is thrown if the challenge type is not of a type known to certbot. This check is mostly pointless, as an authenticator would not request a challenge unknown to it. It does, however, forbid any plugins from supporting entirely novel challenges not of the key authorisation form.

* support unknown ACME challenge types

* add to changelog

* update tests

---------

Co-authored-by: Brad Warren <bmw@eff.org>
2023-04-26 08:23:11 -07:00
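
A rough sketch of the idea: instead of raising an error for challenge types it does not recognize, `challb_to_achall` can fall back to a generic annotated challenge so plugins may handle novel types themselves. The class names below are simplified stand-ins, not certbot's actual types.

```
from dataclasses import dataclass
from typing import Any

@dataclass
class AnnotatedChallenge:  # stand-in for certbot's annotated challenge
    challb: Any
    domain: str

@dataclass
class KeyAuthorizationAnnotatedChallenge(AnnotatedChallenge):
    account_key: Any

KEY_AUTHORIZATION_TYPES = {"http-01", "dns-01", "tls-alpn-01"}

def challb_to_achall(challb: Any, account_key: Any, domain: str) -> AnnotatedChallenge:
    if challb.typ in KEY_AUTHORIZATION_TYPES:
        return KeyAuthorizationAnnotatedChallenge(challb=challb, domain=domain,
                                                  account_key=account_key)
    # Previously this branch raised an error; now unknown types pass through.
    return AnnotatedChallenge(challb=challb, domain=domain)
```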
alexzorin
10fba2ee3f docs: clarify --dry-run documentation (#9683)
* remove pointless paragraph about --server and wildcards

* docs: update help text for --dry-run and --staging

* docs: update "Changing the ACME Server" for --dry-run

* add note about webserver reloads
2023-04-25 16:43:18 -07:00
alexzorin
67f14f177b ignore invalid plugin selection choices (#9665)
* plugins: ensure --installer/--authenticator is properly filtered

* fix windows failure in test
2023-04-25 11:27:32 +10:00
Phil Martin
f378ec4a0f Optionally sign initial SOA query (#9672)
* Optionally sign initial SOA query

Added configuration file option to enable signing of the initial SOA query when determining the authoritative nameserver for the zone. Default is disabled.

* Better handling of sign_query configuration and fix lint issues

* Update str casting to match 5503d12395

* Update certbot/CHANGELOG.md

Co-authored-by: alexzorin <alex@zorin.au>

* Update certbot/CHANGELOG.md

Co-authored-by: alexzorin <alex@zorin.au>

* Update dns_rfc2136.py

Updated with feedback from certbot/certbot#9672

---------

Co-authored-by: alexzorin <alex@zorin.au>
2023-04-25 11:25:57 +10:00
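
A hedged sketch, using dnspython directly, of what signing the initial SOA query looks like; the key name, secret, and server address are placeholders, and the real plugin wires this through the configuration option described above instead.

```
import dns.message
import dns.query
import dns.tsigkeyring

keyring = dns.tsigkeyring.from_text(
    {"keyname.example.": "bm90IGEgcmVhbCBzZWNyZXQ="}  # placeholder TSIG key
)
soa_query = dns.message.make_query("example.com.", "SOA")
soa_query.use_tsig(keyring, keyname="keyname.example.")  # sign the query
response = dns.query.udp(soa_query, "192.0.2.1", timeout=10)
print(response.rcode())
```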
Jawshua
b0d0a83277 google: use Application Default Credentials where available (#9670)
* google: use Application Default Credentials where available

* Updated custom role documentation
2023-04-22 07:58:18 +10:00
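
A hedged sketch of the credential fallback described above; `google.auth.default()` is the standard Application Default Credentials lookup, while how dns-google actually wires it in is simplified here.

```
from typing import Optional

import google.auth
from google.oauth2 import service_account

def load_credentials(credentials_path: Optional[str] = None):
    if credentials_path:
        # An explicit service account file still takes precedence.
        return service_account.Credentials.from_service_account_file(credentials_path)
    # Otherwise use Application Default Credentials: the GOOGLE_APPLICATION_CREDENTIALS
    # environment variable, gcloud user credentials, or the GCE metadata server.
    credentials, _project_id = google.auth.default()
    return credentials
```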
Will Greenberg
399b932a86 Merge pull request #9673 from certbot/types-dns-common-get
types: CredentialsConfiguration.conf can return None
2023-04-17 17:45:00 -07:00
Alex Zorin
b9ec3155f7 amend rtype 2023-04-18 08:14:11 +10:00
Alex Zorin
ef5f4cae04 fix cast formatting 2023-04-18 08:13:28 +10:00
Brad Warren
31094bc547 rewrite coverage tests (#9669)
In addition to the speed improvements in CI, the local speed improvements from both this PR and https://github.com/certbot/certbot/pull/9666, which this builds on, are even more significant. After it's been run once, so it's had a chance to set up the different virtual environments, `tox` locally now takes 39 seconds on my laptop when it used to take 137 seconds.
2023-04-17 13:01:00 -07:00
Niek Peeters
f41673982d validate lineage name (#9644)
Fixes #6127.

* Added lineage name validity check

* Verify lineage name validity before obtaining certificate

* Added linage name limitation to cli help

* Update documentation on certificate name

* Added lineage name validation to changelog

* Use filepath seperators to determine lineagename validity

* Add unittest for private choose_lineagename method

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2023-04-17 12:55:20 -07:00
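
A minimal sketch of the validity check described in the bullets above (the "filepath separators" heuristic); the function name and exact rules are assumptions, since the real check lives in a private certbot method.

```
import os

def is_valid_lineage_name(name: str) -> bool:
    # A lineage name becomes a directory name under the config dir, so an
    # empty name or one containing a path separator is rejected.
    if not name:
        return False
    separators = {os.sep} | ({os.altsep} if os.altsep else set())
    return not any(sep in name for sep in separators)

print(is_valid_lineage_name("example.org"))    # True
print(is_valid_lineage_name("../etc/passwd"))  # False
```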
Brad Warren
996cc20cd7 remove unused envrc (#9677) 2023-04-17 02:17:55 +00:00
Brad Warren
20ccf8c9c9 remove development dockerfile (#9676) 2023-04-17 12:14:25 +10:00
Alex Zorin
5503d12395 types: CredentialsConfiguration.conf can return None 2023-04-16 10:43:00 +10:00
Brad Warren
4740e20725 Rewrite tox config (#9666)
* rewrite tox config

* fix apacheconftest-with-pebble deps

* more fixes

* more fixes

* move comment up

* fix mock location

* bump cffi

* update oldest constraints

* Revert "fix mock location"

This reverts commit 561037bfad.

* fix apache test

* fix server cleanup

* fix some leaky sockets

* stop leaking sockets

* change less

* Update tox.ini

Co-authored-by: alexzorin <alex@zorin.id.au>

* Update tox.ini

Co-authored-by: alexzorin <alex@zorin.id.au>

* tweak contributing doc

---------

Co-authored-by: alexzorin <alex@zorin.id.au>
2023-04-16 10:30:59 +10:00
Brad Warren
dc05b4da7a Increase stale operations per run (#9668)
* increase operations per run

* update comment
2023-04-13 09:18:24 +10:00
Brad Warren
5149dfd96e Add some missing type libraries for mypy (#9657)
* add some missing types

* install pkg-config

* install pkg-config for docker too

* add pkg-config to plugins

* pkg-config when cryptography may need to be built

* deps cleanup

* more comments

* more tweaks
2023-04-09 11:49:08 +10:00
humanoid2050
9ee1eee219 Build with buildkit (#9628)
* generate multiarch images for non-architecture tags

* Update documentation related to multiarch Docker

* Remove qemu and switch to build via buildkit

* Move to multistage Dockerfile

* refactor docker script arg parsing and fix merge bugs

* removed unnecessary testing script and fixed function name

* improved quoting in shell scripts

---------

Co-authored-by: humanoid2050 <humanoid2050@monolith>
Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
Co-authored-by: humanoid2050 <humanoid2050@katana>
Co-authored-by: Brad Warren <bmw@eff.org>
2023-04-08 12:22:16 -07:00
Brad Warren
7a68b29140 update min cryptography (#9663) 2023-04-07 10:28:17 +10:00
234 changed files with 5030 additions and 3986 deletions

@@ -1,8 +1,8 @@
# Configuring Azure Pipelines with Certbot
Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:
* `.azure-pipelines/main.yml` is the main one, executed on PRs for master, and pushes to master,
* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for master.
* `.azure-pipelines/main.yml` is the main one, executed on PRs for main, and pushes to main,
* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for main.
Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common jobs configuration that can be reused in several pipelines.
@@ -64,7 +64,7 @@ Azure Pipeline needs RW on code, RO on metadata, RW on checks, commit statuses,
RW access here is required to allow update of the pipelines YAML files from Azure DevOps interface, and to
update the status of builds and PRs on GitHub side when Azure Pipelines are triggered.
Note however that no admin access is defined here: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the security context around this on GitHub.
protected branches, like main, and cannot modify the security context around this on GitHub.
Access can be defined for all or only selected repositories, which is nice.
```
@@ -91,11 +91,11 @@ grant permissions from Azure Pipelines to GitHub in order to setup a GitHub OAut
then are way too large (admin level on almost everything), while the classic approach does not add any more
permissions, and works perfectly well.__
- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, master in default branch.
- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, main in default branch.
- Click on YAML option for "Select a template"
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
"YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click "Save and run" button.
- Click "Save & queue", choose the main branch to build the first pipeline, and click "Save and run" button.
_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._

@@ -1,9 +1,9 @@
# We run the test suite on commits to master so codecov gets coverage data
# about the master branch and can use it to track coverage changes.
# We run the test suite on commits to main so codecov gets coverage data
# about the main branch and can use it to track coverage changes.
trigger:
- master
- main
pr:
- master
- main
- '*.x'
variables:

@@ -1,4 +1,4 @@
# Nightly pipeline running each day for master.
# Nightly pipeline running each day for main.
trigger: none
pr: none
schedules:
@@ -6,7 +6,7 @@ schedules:
displayName: Nightly build
branches:
include:
- master
- main
always: true
variables:
@@ -15,5 +15,5 @@ variables:
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/nightly-deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml

@@ -13,7 +13,5 @@ variables:
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
parameters:
snapReleaseChannel: beta
- template: templates/stages/release-deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml

@@ -72,3 +72,57 @@ jobs:
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store
# The credentials used in the following jobs are for the shared
# certbotbot account on Docker Hub. The credentials are stored
# in a service account which was created by following the
# instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. The authentication used when
# creating this service account was a personal access token
# rather than a password to bypass 2FA. When Brad set this up,
# Azure Pipelines failed to verify the credentials with an error
# like "access is forbidden with a JWT issued from a personal
# access token", but after saving them without verification, the
# access token worked when the pipeline actually ran. "Grant
# access to all pipelines" should also be checked on the service
# account. The access token can be deleted on Docker Hub if
# these credentials need to be revoked.
- job: publish_docker_by_arch
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
amd64:
DOCKER_ARCH: amd64
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- task: Docker@2
inputs:
command: login
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy_images.sh $(dockerTag) $DOCKER_ARCH
displayName: Deploy the Docker images by architecture
- job: publish_docker_multiarch
dependsOn: publish_docker_by_arch
pool:
vmImage: ubuntu-22.04
steps:
- task: Docker@2
inputs:
command: login
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy_manifests.sh $(dockerTag) all
displayName: Deploy the Docker multiarch manifests

@@ -4,55 +4,47 @@ jobs:
- name: IMAGE_NAME
value: ubuntu-22.04
- name: PYTHON_VERSION
value: 3.11
value: 3.12
- group: certbot-common
strategy:
matrix:
linux-py38:
PYTHON_VERSION: 3.8
TOXENV: py38
linux-py39:
PYTHON_VERSION: 3.9
TOXENV: py39
linux-py310:
PYTHON_VERSION: 3.10
TOXENV: py310
linux-boulder-v2-integration-certbot-oldest:
PYTHON_VERSION: 3.7
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v2-integration-nginx-oldest:
PYTHON_VERSION: 3.7
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py38-integration:
linux-py311:
PYTHON_VERSION: 3.11
TOXENV: py311
linux-isolated:
TOXENV: 'isolated-acme,isolated-certbot,isolated-apache,isolated-cloudflare,isolated-digitalocean,isolated-dnsimple,isolated-dnsmadeeasy,isolated-gehirn,isolated-google,isolated-linode,isolated-luadns,isolated-nsone,isolated-ovh,isolated-rfc2136,isolated-route53,isolated-sakuracloud,isolated-nginx'
linux-integration-certbot-oldest:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py39-integration:
TOXENV: integration-certbot-oldest
linux-integration-nginx-oldest:
PYTHON_VERSION: 3.8
TOXENV: integration-nginx-oldest
# python 3.8 integration tests are not run here because they're run as
# part of the standard test suite
linux-py39-integration:
PYTHON_VERSION: 3.9
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py310-integration:
linux-py310-integration:
PYTHON_VERSION: 3.10
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v2-py311-integration:
linux-py311-integration:
PYTHON_VERSION: 3.11
TOXENV: integration
ACME_SERVER: boulder-v2
linux-py312-integration:
PYTHON_VERSION: 3.12
TOXENV: integration
nginx-compat:
TOXENV: nginx_compat
linux-integration-rfc2136:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration-dns-rfc2136
docker-dev:
TOXENV: docker_dev
le-modification:
IMAGE_NAME: ubuntu-22.04
TOXENV: modification

@@ -4,12 +4,12 @@ jobs:
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
amd64:
DOCKER_ARCH: amd64
# The default timeout of 60 minutes is a little low for compiling
# cryptography on ARM architectures.
timeoutInMinutes: 180
@@ -32,84 +32,29 @@ jobs:
path: $(Build.ArtifactStagingDirectory)
artifact: docker_$(DOCKER_ARCH)
displayName: Store Docker artifact
- job: docker_run
- job: docker_test
dependsOn: docker_build
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
amd64:
DOCKER_ARCH: amd64
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_amd64
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- bash: |
set -ex
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
for DOCKER_IMAGE in ${DOCKER_IMAGES}
do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
done
set -e && tools/docker/test.sh $(dockerTag) $DOCKER_ARCH
displayName: Run integration tests for Docker images
- job: installer_build
pool:
vmImage: windows-2019
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.9
architecture: x64
addToPath: true
- script: |
python -m venv venv
venv\Scripts\python tools\pip_install.py -e windows-installer
displayName: Prepare Windows installer build environment
- script: |
venv\Scripts\construct-windows-installer
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
pool:
vmImage: $(imageName)
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.9
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: |
python -m venv venv
venv\Scripts\python tools\pip_install.py -e certbot-ci
env:
PIP_NO_BUILD_ISOLATION: no
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win_amd64.exe
displayName: Run windows installer integration tests
- script: |
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run certbot integration tests
- job: snaps_build
pool:
vmImage: ubuntu-22.04
@@ -131,7 +76,7 @@ jobs:
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
versionSpec: 3.12
addToPath: true
- task: DownloadSecureFile@1
name: credentials
@@ -162,7 +107,7 @@ jobs:
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
versionSpec: 3.12
addToPath: true
- script: |
set -e
@@ -182,7 +127,7 @@ jobs:
displayName: Install Certbot snap
- script: |
set -e
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
venv/bin/python -m tox run -e integration-external,apacheconftest-external-with-pebble
displayName: Run tox
- job: snap_dns_run
dependsOn: snaps_build
@@ -196,7 +141,7 @@ jobs:
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
versionSpec: 3.12
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:

View File

@@ -1,40 +1,29 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.11
PYTHON_VERSION: 3.12
strategy:
matrix:
macos-py37-cover:
macos-py38-cover:
IMAGE_NAME: macOS-12
PYTHON_VERSION: 3.7
PYTHON_VERSION: 3.8
TOXENV: cover
# As of pip 23.1.0, builds started failing on macOS unless this flag was set.
# See https://github.com/certbot/certbot/pull/9717#issuecomment-1610861794.
PIP_USE_PEP517: "true"
macos-cover:
IMAGE_NAME: macOS-12
IMAGE_NAME: macOS-13
TOXENV: cover
windows-py37:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.7
TOXENV: py37-win
windows-py39-cover:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: cover-win
windows-integration-certbot:
IMAGE_NAME: windows-2019
PYTHON_VERSION: 3.9
TOXENV: integration-certbot
linux-oldest-tests-1:
# See explanation under macos-py38-cover.
PIP_USE_PEP517: "true"
linux-oldest:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
linux-oldest-tests-2:
PYTHON_VERSION: 3.8
TOXENV: oldest
linux-py38:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: '{dns,nginx}-oldest'
linux-py37:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.7
TOXENV: py37
PYTHON_VERSION: 3.8
TOXENV: py38
linux-cover:
IMAGE_NAME: ubuntu-22.04
TOXENV: cover
@@ -43,12 +32,11 @@ jobs:
TOXENV: lint-posix
linux-mypy:
IMAGE_NAME: ubuntu-22.04
TOXENV: mypy-posix
TOXENV: mypy
linux-integration:
IMAGE_NAME: ubuntu-22.04
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-22.04
TOXENV: apache_compat

View File

@@ -1,67 +0,0 @@
parameters:
# We do not define acceptable values for this parameter here as it is passed
# through to ../jobs/snap-deploy-job.yml which does its own sanity checking.
- name: snapReleaseChannel
type: string
default: edge
stages:
- stage: Deploy
jobs:
- template: ../jobs/snap-deploy-job.yml
parameters:
snapReleaseChannel: ${{ parameters.snapReleaseChannel }}
# The credentials used in the following jobs are for the shared
# certbotbot account on Docker Hub. The credentials are stored
# in a service account which was created by following the
# instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. The authentication used when
# creating this service account was a personal access token
# rather than a password to bypass 2FA. When Brad set this up,
# Azure Pipelines failed to verify the credentials with an error
# like "access is forbidden with a JWT issued from a personal
# access token", but after saving them without verification, the
# access token worked when the pipeline actually ran. "Grant
# access to all pipelines" should also be checked on the service
# account. The access token can be deleted on Docker Hub if
# these credentials need to be revoked.
- job: publish_docker_by_arch
pool:
vmImage: ubuntu-22.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- task: Docker@2
inputs:
command: login
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy_by_arch.sh $(dockerTag) $DOCKER_ARCH
displayName: Deploy the Docker images by architecture
- job: publish_docker_multiarch
dependsOn: publish_docker_by_arch
pool:
vmImage: ubuntu-22.04
steps:
- task: Docker@2
inputs:
command: login
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy_multiarch.sh $(dockerTag)
displayName: Deploy the Docker multiarch manifests

View File

@@ -0,0 +1,6 @@
stages:
- stage: Deploy
jobs:
- template: ../jobs/common-deploy-jobs.yml
parameters:
snapReleaseChannel: edge

View File

@@ -0,0 +1,38 @@
stages:
- stage: Deploy
jobs:
- template: ../jobs/common-deploy-jobs.yml
parameters:
snapReleaseChannel: beta
- job: create_github_release
pool:
vmImage: ubuntu-22.04
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: changelog
path: '$(Pipeline.Workspace)'
- task: GitHubRelease@1
inputs:
# this "github-releases" credential is what azure pipelines calls a
# "service connection". it was created using the instructions at
# https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#github-service-connection
# with a fine-grained personal access token from github to limit
# the permissions given to azure pipelines. the connection on azure
# needs permissions for the "release" pipeline (and maybe the
# "full-test-suite" pipeline to simplify testing it). information
# on how to set up these permissions can be found at
# https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#secure-a-service-connection.
# the github token that is used needs "contents:write" and
# "workflows:write" permissions for the certbot repo
#
# as of writing this, the current token will expire on 3/15/2025.
# when recreating it, you may also want to create it using the
# shared "certbotbot" github account so the credentials aren't tied
# to any one dev's github account and their access to the certbot
# repo
gitHubConnection: github-releases
title: ${{ format('Certbot {0}', replace(variables['Build.SourceBranchName'], 'v', '')) }}
releaseNotesFilePath: '$(Pipeline.Workspace)/release_notes.md'
assets: '$(Build.SourcesDirectory)/packages/{*.tar.gz,SHA256SUMS*}'
addChangeLog: false

View File

@@ -1,9 +1,17 @@
# This does not include the dependencies needed to build cryptography. See
# https://cryptography.io/en/latest/installation/
steps:
# We run brew update because we've seen attempts to install an older version
# of a package fail. See
# https://github.com/actions/virtual-environments/issues/3165.
#
# We untap homebrew/core and homebrew/cask and unset HOMEBREW_NO_INSTALL_FROM_API (which
# is set by the CI macOS env) because GitHub has been having issues, making these jobs
# fail on git clones: https://github.com/orgs/Homebrew/discussions/4612.
- bash: |
set -e
unset HOMEBREW_NO_INSTALL_FROM_API
brew untap homebrew/core homebrew/cask
brew update
brew install augeas
condition: startswith(variables['IMAGE_NAME'], 'macOS')
@@ -12,14 +20,8 @@ steps:
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python3-dev \
gcc \
libaugeas0 \
libssl-dev \
libffi-dev \
ca-certificates \
nginx-light \
openssl
nginx-light
sudo systemctl stop nginx
sudo sysctl net.ipv4.ip_unprivileged_port_start=0
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
@@ -42,7 +44,7 @@ steps:
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
python3 -m tox
python3 -m tox run
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)

View File

@@ -1,5 +1,24 @@
[run]
omit = */setup.py
source =
acme
certbot
certbot-apache
certbot-dns-cloudflare
certbot-dns-digitalocean
certbot-dns-dnsimple
certbot-dns-dnsmadeeasy
certbot-dns-gehirn
certbot-dns-google
certbot-dns-linode
certbot-dns-luadns
certbot-dns-nsone
certbot-dns-ovh
certbot-dns-rfc2136
certbot-dns-route53
certbot-dns-sakuracloud
certbot-nginx
[report]
omit = */setup.py
show_missing = True

.envrc
View File

@@ -1,12 +0,0 @@
# This file is just a nicety for developers who use direnv. When you cd under
# the Certbot repo, Certbot's virtual environment will be automatically
# activated and then deactivated when you cd elsewhere. Developers have to have
# direnv set up and run `direnv allow` to allow this file to execute on their
# system. You can find more information at https://direnv.net/.
. venv/bin/activate
# direnv doesn't support modifying PS1 so we unset it to squelch the error
# it'll otherwise print about this being done in the activate script. See
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
# prompt to change like it normally does, see
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
unset PS1

View File

@@ -1,6 +1,6 @@
## Pull Request Checklist
- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due to our limited resources and the need to prioritize how we spend them.
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `main` section of `certbot/CHANGELOG.md` to include a description of the change being made.
- [ ] Add or update any documentation as needed to support the changes in this PR.
- [ ] Include your name in `AUTHORS.md` if you like.

View File

@@ -7,25 +7,15 @@ on:
jobs:
if_merged:
if: github.event.pull_request.merged == true
# Forked repos can not access Mattermost secret.
if: github.event.pull_request.merged == true && !github.event.pull_request.head.repo.fork
runs-on: ubuntu-latest
steps:
- name: Create Mattermost Message
# https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions#example-of-a-script-injection-attack
env:
NUMBER: ${{ github.event.number }}
PR_URL: https://github.com/${{ github.repository }}/pull/${{ github.event.number }}
REPO: ${{ github.repository }}
USER: ${{ github.actor }}
TITLE: ${{ github.event.pull_request.title }}
run: |
jq --null-input \
--arg number "$NUMBER" \
--arg pr_url "$PR_URL" \
--arg repo "$REPO" \
--arg user "$USER" \
--arg title "$TITLE" \
'{ "text": "[\($repo)] | [\($title) #\($number)](\($pr_url)) was merged into master by \($user)" }' > mattermost.json
- uses: mattermost/action-mattermost-notify@master
env:
- uses: mattermost/action-mattermost-notify@main
with:
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_MERGE_WEBHOOK }}
TEXT: >
[${{ github.repository }}] |
[${{ github.event.pull_request.title }}
#${{ github.event.number }}](https://github.com/${{ github.repository }}/pull/${{ github.event.number }})
was merged into main by ${{ github.actor }}

View File

@@ -4,6 +4,7 @@ on:
schedule:
# Every week on Thursday @ 13:00
- cron: "0 13 * * 4"
workflow_dispatch:
jobs:
send-mattermost-message:
runs-on: ubuntu-latest
@@ -11,15 +12,16 @@ jobs:
steps:
- name: Create Mattermost Message
run: |
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
MERGED_URL="https://github.com/pulls?q=merged%3A%3E${DATE}+org%3Acertbot"
UPDATED_URL="https://github.com/pulls?q=updated%3A%3E${DATE}+org%3Acertbot"
echo "{\"text\":\"## Updates Across Certbot Repos\n\n
- Certbot team members SHOULD look at: [link]($MERGED_URL)\n\n
- Certbot team members MAY also want to look at: [link]($UPDATED_URL)\n\n
- Want to Discuss something today? Place it [here](https://docs.google.com/document/d/17YMUbtC1yg6MfiTMwT8zVm9LmO-cuGVBom0qFn8XJBM/edit?usp=sharing) and we can meet today on Zoom.\n\n
- The key words SHOULD and MAY in this message are to be interpreted as described in [RFC 8147](https://www.rfc-editor.org/rfc/rfc8174). \"
}" > mattermost.json
- uses: mattermost/action-mattermost-notify@master
env:
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
echo "MERGED_URL=https://github.com/pulls?q=merged%3A%3E${DATE}+org%3Acertbot" >> $GITHUB_ENV
echo "UPDATED_URL=https://github.com/pulls?q=updated%3A%3E${DATE}+org%3Acertbot" >> $GITHUB_ENV
- uses: mattermost/action-mattermost-notify@main
with:
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_WEBHOOK_URL }}
MATTERMOST_CHANNEL: private-certbot
TEXT: |
## Updates Across Certbot Repos
- Certbot team members SHOULD look at: [link](${{ env.MERGED_URL }})
- Certbot team members MAY also want to look at: [link](${{ env.UPDATED_URL }})
- Want to Discuss something today? Place it [here](https://docs.google.com/document/d/17YMUbtC1yg6MfiTMwT8zVm9LmO-cuGVBom0qFn8XJBM/edit?usp=sharing) and we can meet today on Zoom.
- The key words SHOULD and MAY in this message are to be interpreted as described in [RFC 8174](https://www.rfc-editor.org/rfc/rfc8174).

View File

@@ -1,8 +1,9 @@
name: Update Stale Issues
on:
schedule:
# Run 24 minutes past the hour 4 times a day.
- cron: '24 */6 * * *'
# Run at 1:24 AM every night
- cron: '24 1 * * *'
workflow_dispatch:
permissions:
issues: write
jobs:
@@ -41,5 +42,7 @@ jobs:
should be reopened, please open a new issue with a link to this one and we'll
take a look.
# Limit the number of actions per hour, from 1-30. Default is 30
operations-per-run: 30
# Limit the number of actions per run. As of writing this, GitHub's
# rate limit is 1000 requests per hour so we're still a ways off. See
# https://docs.github.com/en/rest/overview/resources-in-the-rest-api?apiVersion=2022-11-28#rate-limits-for-requests-from-github-actions.
operations-per-run: 180

View File

@@ -69,7 +69,7 @@ ignored-modules=
# CERTBOT COMMENT
# This is needed for pylint to import linter_plugin.py since
# https://github.com/PyCQA/pylint/pull/3396.
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(next(pylint.config.find_default_config_files())))"
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use, and will cap the count on Windows to
@@ -266,8 +266,8 @@ valid-metaclass-classmethod-first-arg=cls
[EXCEPTIONS]
# Exceptions that will emit a warning when caught.
overgeneral-exceptions=BaseException,
Exception
overgeneral-exceptions=builtins.BaseException,
builtins.Exception
[FORMAT]
@@ -524,7 +524,7 @@ ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace,
# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis
ignored-modules=pkg_resources,confargparse,argparse
ignored-modules=confargparse,argparse
# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
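The init-hook change above swaps the removed pylint.config.PYLINTRC constant for pylint.config.find_default_config_files(), which yields the pylintrc paths pylint would discover on its own. A minimal sketch of what the one-liner does, spelled out, assuming a pylint release that provides that function (the version pinned by this comparison evidently does):

# Rough expansion of the new init-hook above; this is a sketch, not the exact
# code pylint runs.
import os
import sys

import pylint.config

# find_default_config_files() is a generator over the config files pylint
# would discover (str or pathlib.Path depending on the pylint version).
first_config = next(pylint.config.find_default_config_files())

# Put the directory containing the pylintrc on sys.path so the repository's
# linter_plugin.py can be imported by pylint.
sys.path.append(os.path.dirname(str(first_config)))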

View File

@@ -94,6 +94,7 @@ Authors
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francesco Colista](https://github.com/fcolista)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -123,6 +124,7 @@ Authors
* [James Balazs](https://github.com/jamesbalazs)
* [James Kasten](https://github.com/jdkasten)
* [Jason Grinblat](https://github.com/ptychomancer)
* [Jawshua](https://github.com/jawshua)
* [Jay Faulkner](https://github.com/jayofdoom)
* [J.C. Jones](https://github.com/jcjones)
* [Jeff Hodges](https://github.com/jmhodges)
@@ -153,6 +155,7 @@ Authors
* [LeCoyote](https://github.com/LeCoyote)
* [Lee Watson](https://github.com/TheReverend403)
* [Leo Famulari](https://github.com/lfam)
* [Leon G](https://github.com/LeonGr)
* [lf](https://github.com/lf-)
* [Liam Marshall](https://github.com/liamim)
* [Lior Sabag](https://github.com/liorsbg)
@@ -163,6 +166,7 @@ Authors
* [Luca Ebach](https://github.com/lucebac)
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Lukhnos Liu](https://github.com/lukhnos)
* [Maarten](https://github.com/mrtndwrd)
* [Mads Jensen](https://github.com/atombrella)
* [Maikel Martens](https://github.com/krukas)
@@ -207,6 +211,7 @@ Authors
* [Patrick Heppler](https://github.com/PatrickHeppler)
* [Paul Buonopane](https://github.com/Zenexer)
* [Paul Feitzinger](https://github.com/pfeyz)
* [Paulo Dias](https://github.com/paulojmdias)
* [Pavan Gupta](https://github.com/pavgup)
* [Pavel Pavlov](https://github.com/ghost355)
* [Peter Conrad](https://github.com/pconrad-fb)
@@ -220,12 +225,14 @@ Authors
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Preston Locke](https://github.com/Preston12321)
* [Q Misell](https://magicalcodewit.ch)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
* [Rémy Léone](https://github.com/sieben)
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Harman](https://github.com/warewolf)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Dailey](https://github.com/pahrohfit)

View File

@@ -1,21 +0,0 @@
# This Dockerfile builds an image for development.
FROM ubuntu:focal
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
WORKDIR /opt/certbot/src
COPY . .
RUN apt-get update && \
DEBIAN_FRONTEND=noninteractive apt-get install apache2 git python3-dev \
python3-venv gcc libaugeas0 libssl-dev libffi-dev ca-certificates \
openssl nginx-light -y --no-install-recommends && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv" python3 tools/venv.py
ENV PATH /opt/certbot/venv/bin:$PATH

acme/.readthedocs.yaml Normal file
View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: acme/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: acme/readthedocs.org.requirements.txt

View File

@@ -29,11 +29,9 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
from acme.crypto_util import SSLSocket
class _TestServer(socketserver.TCPServer):
def server_bind(self): # pylint: disable=missing-docstring
self.socket = SSLSocket(socket.socket(),
certs)
socketserver.TCPServer.server_bind(self)
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.socket = SSLSocket(self.socket, certs)
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
@@ -44,6 +42,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
if self.server_thread.is_alive():
# The thread may have already terminated.
self.server_thread.join() # pragma: no cover
self.server.server_close()
def _probe(self, name):
from acme.crypto_util import probe_sni

View File

@@ -34,7 +34,7 @@ class RFC3339FieldTest(unittest.TestCase):
"""Tests for acme.fields.RFC3339Field."""
def setUp(self):
self.decoded = datetime.datetime(2015, 3, 27, tzinfo=pytz.utc)
self.decoded = datetime.datetime(2015, 3, 27, tzinfo=pytz.UTC)
self.encoded = '2015-03-27T00:00:00Z'
def test_default_encoder(self):

View File

@@ -55,6 +55,7 @@ class HTTP01ServerTest(unittest.TestCase):
def tearDown(self):
self.server.shutdown()
self.thread.join()
self.server.server_close()
def test_index(self):
response = requests.get(
@@ -88,25 +89,25 @@ class HTTP01ServerTest(unittest.TestCase):
def test_timely_shutdown(self):
from acme.standalone import HTTP01Server
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
with HTTP01Server(('', 0), resources=set(), timeout=0.05) as server:
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
client = socket.socket()
client.connect(('localhost', server.socket.getsockname()[1]))
with socket.socket() as client:
client.connect(('localhost', server.socket.getsockname()[1]))
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
assert not is_hung, 'Server shutdown should not be hung'
assert not is_hung, 'Server shutdown should not be hung'
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
@@ -133,6 +134,7 @@ class TLSALPN01ServerTest(unittest.TestCase):
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
self.server.server_close()
# TODO: This is not implemented yet, see comments in standalone.py
# def test_certs(self):
@@ -214,6 +216,8 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
if prev_port:
assert prev_port == port
prev_port = port
for server in servers.servers:
server.server_close()
class HTTP01DualNetworkedServersTest(unittest.TestCase):
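Several hunks above add a server.server_close() call after shutdown() and join(): shutdown() only stops the serve_forever() loop, while server_close() releases the listening socket itself. A minimal sketch of that teardown pattern, using a plain socketserver.TCPServer as a stand-in rather than acme.standalone's servers:

# Minimal sketch of the shutdown/join/server_close pattern added above.
import socketserver
import threading
import unittest


class ServerLifecycleTest(unittest.TestCase):
    def setUp(self):
        self.server = socketserver.TCPServer(('', 0), socketserver.BaseRequestHandler)
        self.thread = threading.Thread(target=self.server.serve_forever)
        self.thread.start()

    def tearDown(self):
        self.server.shutdown()      # stop the serve_forever() loop
        self.thread.join()          # wait for the serving thread to exit
        self.server.server_close()  # release the listening socket itself

    def test_port_is_bound(self):
        assert self.server.socket.getsockname()[1] != 0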

View File

@@ -4,20 +4,25 @@
"""
import os
import sys
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from josepy.util import ComparableECKey
from OpenSSL import crypto
import pkg_resources
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
def load_vector(*names):
"""Load contents of a test vector."""
# luckily, resource_string opens file in binary mode
return pkg_resources.resource_string(
__name__, os.path.join('testdata', *names))
vector_ref = importlib_resources.files(__package__).joinpath('testdata', *names)
return vector_ref.read_bytes()
def _guess_loader(filename, loader_pem, loader_der):
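The load_vector() change above is one instance of the pkg_resources to importlib.resources migration that runs through this comparison. A minimal sketch of the same pattern, where "mypkg" and its testdata directory are hypothetical placeholders:

# Minimal sketch of the pkg_resources -> importlib.resources migration shown
# above; "mypkg" and its testdata/ directory are hypothetical.
import sys

if sys.version_info >= (3, 9):
    import importlib.resources as importlib_resources
else:  # the importlib_resources backport provides the same files() API
    import importlib_resources


def load_vector(*names):
    """Return the bytes of a test vector shipped inside the package."""
    ref = importlib_resources.files('mypkg').joinpath('testdata', *names)
    return ref.read_bytes()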

View File

@@ -12,7 +12,6 @@ from typing import List
from typing import Mapping
from typing import Optional
from typing import Set
from typing import Text
from typing import Tuple
from typing import Union
@@ -517,7 +516,7 @@ class ClientNetwork:
self.account = account
self.alg = alg
self.verify_ssl = verify_ssl
self._nonces: Set[Text] = set()
self._nonces: Set[str] = set()
self.user_agent = user_agent
self.session = requests.Session()
self._default_timeout = timeout

View File

@@ -136,27 +136,33 @@ class SSLSocket: # pylint: disable=too-few-public-methods
def accept(self) -> Tuple[FakeConnection, Any]: # pylint: disable=missing-function-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
# This log line is especially desirable because without it requests to
# our standalone TLSALPN server would not be logged.
logger.debug("Performing handshake with %s", addr)
try:
ssl_sock.do_handshake()
except SSL.Error as error:
# _pick_certificate_cb might have returned without
# creating SSL context (wrong server name)
raise socket.error(error)
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
return ssl_sock, addr
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
# This log line is especially desirable because without it requests to
# our standalone TLSALPN server would not be logged.
logger.debug("Performing handshake with %s", addr)
try:
ssl_sock.do_handshake()
except SSL.Error as error:
# _pick_certificate_cb might have returned without
# creating SSL context (wrong server name)
raise socket.error(error)
return ssl_sock, addr
except:
# If we encounter any error, close the new socket before reraising
# the exception.
sock.close()
raise
def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, # pylint: disable=too-many-arguments
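The accept() rewrite above moves all post-accept work inside a try block so the freshly accepted socket is closed before any exception propagates. A minimal sketch of that shape, independent of pyOpenSSL; wrap_connection here is a hypothetical stand-in for the handshake logic:

# Minimal sketch of the "close the accepted socket on any setup failure"
# shape used in the accept() rewrite above.
import socket


def accept_wrapped(listener: socket.socket, wrap_connection):
    sock, addr = listener.accept()
    try:
        return wrap_connection(sock), addr
    except:  # noqa: E722 - close on *any* failure, then re-raise
        sock.close()
        raise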

View File

@@ -34,7 +34,7 @@ class RFC3339Field(jose.Field):
Handles decoding/encoding between RFC3339 strings and aware (not
naive) `datetime.datetime` objects
(e.g. ``datetime.datetime.now(pytz.utc)``).
(e.g. ``datetime.datetime.now(pytz.UTC)``).
"""

View File

@@ -29,7 +29,7 @@ class Header(jose.Header):
class Signature(jose.Signature):
"""ACME-specific Signature. Uses ACME-specific Header for customer fields."""
__slots__ = jose.Signature._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access,no-member
__slots__ = jose.Signature._orig_slots # pylint: disable=protected-access,no-member
# TODO: decoder/encoder should accept cls? Otherwise, subclassing
# JSONObjectWithFields is tricky...
@@ -44,7 +44,7 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access
__slots__ = jose.JWS._orig_slots # pylint: disable=protected-access
@classmethod
# type: ignore[override] # pylint: disable=arguments-differ

View File

@@ -37,6 +37,7 @@ extensions = [
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode',
'sphinx_rtd_theme',
]
autodoc_member_order = 'bysource'
@@ -122,14 +123,7 @@ todo_include_todos = False
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -3,6 +3,6 @@ usage: jws [-h] [--compact] {sign,verify} ...
positional arguments:
{sign,verify}
optional arguments:
options:
-h, --help show this help message and exit
--compact
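The help-output excerpt above changes only because argparse renamed its default "optional arguments:" section to "options:" in Python 3.10. A minimal sketch reproducing the heading; the parser below is a simplified stand-in for the real jws CLI, not its actual wiring:

# Sketch showing why the heading changed: argparse renamed the section in
# Python 3.10.
import argparse

parser = argparse.ArgumentParser(prog='jws')
parser.add_argument('--compact', action='store_true')
subparsers = parser.add_subparsers(dest='command')
subparsers.add_parser('sign')
subparsers.add_parser('verify')

# On Python >= 3.10 this prints an "options:" section; on older versions the
# same section is titled "optional arguments:".
parser.print_help()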

View File

@@ -3,11 +3,13 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'cryptography>=2.5.0',
'josepy>=1.13.0',
'cryptography>=3.2.1',
# Josepy 2+ may introduce backward incompatible changes by dropping usage of
# deprecated PyOpenSSL APIs.
'josepy>=1.13.0, <2',
# pyOpenSSL 23.1.0 is a bad release: https://github.com/pyca/pyopenssl/issues/1199
'PyOpenSSL>=17.5.0,!=23.1.0',
'pyrfc3339',
@@ -22,6 +24,15 @@ docs_extras = [
]
test_extras = [
# In theory we could scope importlib_resources to env marker 'python_version<"3.9"'. But this
# makes the pinning mechanism emit warnings when running `poetry lock` because in the corner
# case of an extra dependency with env marker coming from a setup.py file, it generates the
# invalid requirement 'importlib_resource>=1.3.1;python<=3.9;extra=="test"'.
# To fix the issue, we do not pass the env marker. This is fine because:
# - importlib_resources can be applied to any Python version,
# - this is a "test" extra dependency for limited audience,
# - it does not change anything in the end for the generated requirement files.
'importlib_resources>=1.3.1',
'pytest',
'pytest-xdist',
'typing-extensions',
@@ -31,22 +42,22 @@ setup(
name='acme',
version=version,
description='ACME protocol implementation in Python',
url='https://github.com/letsencrypt/letsencrypt',
url='https://github.com/certbot/certbot',
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
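The comment block above explains why acme's test extra lists importlib_resources without an environment marker. For contrast, the marker form, which this comparison does use for certbot-apache's runtime dependency, looks like the sketch below; "example-pkg" is a hypothetical project name:

# Hedged sketch contrasting the two ways of declaring the backport;
# "example-pkg" is hypothetical.
from setuptools import setup

setup(
    name='example-pkg',
    version='0.1.0',
    # Runtime dependency scoped with an environment marker, as in
    # certbot-apache's setup.py later in this comparison:
    install_requires=[
        'importlib_resources>=1.3.1; python_version < "3.9"',
    ],
    extras_require={
        # Test extra left unscoped, as in acme's setup.py, to avoid the
        # lockfile warning described in the comment above.
        'test': ['importlib_resources>=1.3.1'],
    },
)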

View File

@@ -1,21 +1,28 @@
""" Utility functions for certbot-apache plugin """
import atexit
import binascii
import fnmatch
import logging
import re
import subprocess
import sys
from contextlib import ExitStack
from typing import Dict
from typing import Iterable
from typing import List
from typing import Optional
from typing import Tuple
import pkg_resources
from certbot import errors
from certbot import util
from certbot.compat import os
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
logger = logging.getLogger(__name__)
@@ -248,6 +255,8 @@ def find_ssl_apache_conf(prefix: str) -> str:
:return: the path to the TLS Apache config file
:rtype: str
"""
return pkg_resources.resource_filename(
"certbot_apache",
os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))
file_manager = ExitStack()
atexit.register(file_manager.close)
ref = (importlib_resources.files("certbot_apache").joinpath("_internal")
.joinpath("tls_configs").joinpath("{0}-options-ssl-apache.conf".format(prefix)))
return str(file_manager.enter_context(importlib_resources.as_file(ref)))
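This helper, the constants change that follows, and the integration-test utilities later in the comparison all use the same trick: park the importlib.resources.as_file() context manager in an ExitStack whose close() is registered with atexit, so the resource stays available as a real filesystem path for the life of the process. A minimal sketch of the pattern with hypothetical names:

# Minimal sketch of the ExitStack + atexit + as_file() pattern used above;
# "mypkg" and "data/config.conf" are hypothetical placeholders.
import atexit
import sys
from contextlib import ExitStack

if sys.version_info >= (3, 9):
    import importlib.resources as importlib_resources
else:
    import importlib_resources


def resource_path() -> str:
    """Return a filesystem path for a packaged resource, valid until exit."""
    file_manager = ExitStack()
    # Closing the stack on interpreter exit removes any temporary copy that
    # as_file() had to extract (e.g. when the package ships in a zip).
    atexit.register(file_manager.close)
    ref = importlib_resources.files('mypkg').joinpath('data').joinpath('config.conf')
    return str(file_manager.enter_context(importlib_resources.as_file(ref)))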

View File

@@ -1,10 +1,14 @@
"""Apache plugin constants."""
import atexit
import sys
from contextlib import ExitStack
from typing import Dict
from typing import List
import pkg_resources
from certbot.compat import os
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved
@@ -37,8 +41,15 @@ ALL_SSL_OPTIONS_HASHES: List[str] = [
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
AUGEAS_LENS_DIR = pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "augeas_lens"))
def _generate_augeas_lens_dir_static() -> str:
# This code ensures that the resource is accessible as a file for the lifetime of the
# current Python process, and will be automatically cleaned up on exit.
file_manager = ExitStack()
atexit.register(file_manager.close)
augeas_lens_dir_ref = importlib_resources.files("certbot_apache") / "_internal" / "augeas_lens"
return str(file_manager.enter_context(importlib_resources.as_file(augeas_lens_dir_ref)))
AUGEAS_LENS_DIR = _generate_augeas_lens_dir_static()
"""Path to the Augeas lens directory"""
REWRITE_HTTPS_ARGS: List[str] = [

View File

@@ -4,6 +4,7 @@ from typing import Type
from certbot import util
from certbot_apache._internal import configurator
from certbot_apache._internal import override_alpine
from certbot_apache._internal import override_arch
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_darwin
@@ -14,6 +15,7 @@ from certbot_apache._internal import override_suse
from certbot_apache._internal import override_void
OVERRIDE_CLASSES: Dict[str, Type[configurator.ApacheConfigurator]] = {
"alpine": override_alpine.AlpineConfigurator,
"arch": override_arch.ArchConfigurator,
"cloudlinux": override_centos.CentOSConfigurator,
"darwin": override_darwin.DarwinConfigurator,

View File

@@ -0,0 +1,19 @@
""" Distribution specific override class for Alpine Linux """
from certbot_apache._internal import configurator
from certbot_apache._internal.configurator import OsOptions
class AlpineConfigurator(configurator.ApacheConfigurator):
"""Alpine Linux specific ApacheConfigurator override class"""
OS_DEFAULTS = OsOptions(
server_root="/etc/apache2",
vhost_root="/etc/apache2/conf.d",
vhost_files="*.conf",
logs_root="/var/log/apache2",
ctl="apachectl",
version_cmd=['apachectl', '-v'],
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
challenge_location="/etc/apache2/conf.d",
)

View File

@@ -14,7 +14,7 @@ SCRIPT_DIRNAME = os.path.dirname(__file__)
def main() -> int:
args = sys.argv[1:]
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
with acme_server.ACMEServer([], False) as acme_xdist:
environ = os.environ.copy()
environ['SERVER'] = acme_xdist['directory_url']
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]

View File

@@ -128,11 +128,11 @@ class AutoHSTSTest(util.ApacheTest):
max_val
def test_autohsts_update_noop(self):
with mock.patch("time.time") as mock_time:
with mock.patch("certbot_apache._internal.configurator.time") as mock_time_module:
# Time mock is used to make sure that the execution does not
# continue when no autohsts entries exist in pluginstorage
self.config.update_autohsts(mock.MagicMock())
assert mock_time.called is False
assert not mock_time_module.time.called
def test_autohsts_make_permanent_noop(self):
self.config.storage.put = mock.MagicMock()
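The test fix above patches certbot_apache._internal.configurator.time, that is, the reference held by the module under test, instead of the global time.time. A small runnable sketch of why the target matters, using a throwaway module registered in sys.modules as a stand-in for the configurator:

# Sketch of "patch where it is looked up"; "mymodule" is a throwaway stand-in
# for a module that does `import time` at module level.
import sys
import time
import types
from unittest import mock

mymodule = types.ModuleType('mymodule')
mymodule.time = time
sys.modules['mymodule'] = mymodule

# Patching time.time globally would also intercept calls made by unrelated
# code; patching the module-under-test's own reference keeps the assertion
# scoped to that module, which is what the test fix above switches to.
with mock.patch('mymodule.time') as mock_time_module:
    assert not mock_time_module.time.called   # nothing has consulted it yet
    mymodule.time.time()                      # simulate the code under test
    assert mock_time_module.time.called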

View File

@@ -5,7 +5,6 @@ import shutil
import socket
import sys
import tempfile
import unittest
from unittest import mock
import pytest
@@ -1659,20 +1658,23 @@ class InstallSslOptionsConfTest(util.ApacheTest):
file has been manually edited by the user, and will refuse to update it.
This test ensures that all necessary hashes are present.
"""
import pkg_resources
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
from certbot_apache._internal.constants import ALL_SSL_OPTIONS_HASHES
tls_configs_dir = pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "tls_configs"))
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
if name.endswith('options-ssl-apache.conf')]
assert len(all_files) >= 1
for one_file in all_files:
file_hash = crypto_util.sha256sum(one_file)
assert file_hash in ALL_SSL_OPTIONS_HASHES, \
f"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 " \
f"hash of {one_file} when it is updated."
ref = importlib_resources.files("certbot_apache") / "_internal" / "tls_configs"
with importlib_resources.as_file(ref) as tls_configs_dir:
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
if name.endswith('options-ssl-apache.conf')]
assert len(all_files) >= 1
for one_file in all_files:
file_hash = crypto_util.sha256sum(one_file)
assert file_hash in ALL_SSL_OPTIONS_HASHES, \
f"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 " \
f"hash of {one_file} when it is updated."
def test_openssl_version(self):
self.config._openssl_version = None

View File

@@ -22,7 +22,7 @@ class ApacheTest(unittest.TestCase):
# pylint: disable=arguments-differ
self.temp_dir, self.config_dir, self.work_dir = common.dir_setup(
test_dir=test_dir,
pkg=__name__)
pkg=__package__)
self.config_path = os.path.join(self.temp_dir, config_root)
self.vhost_path = os.path.join(self.temp_dir, vhost_root)

View File

@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
# We specify the minimum acme and certbot version as the current plugin
@@ -9,6 +9,7 @@ install_requires = [
# https://github.com/certbot/certbot/issues/8761 for more info.
f'acme>={version}',
f'certbot>={version}',
'importlib_resources>=1.3.1; python_version < "3.9"',
'python-augeas',
'setuptools>=41.6.0',
]
@@ -25,11 +26,11 @@ setup(
name='certbot-apache',
version=version,
description="Apache plugin for Certbot",
url='https://github.com/letsencrypt/letsencrypt',
url='https://github.com/certbot/certbot',
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -38,11 +39,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -23,7 +23,6 @@ class IntegrationTestsContext:
self.worker_id = 'primary'
acme_xdist = request.config.acme_xdist # type: ignore[attr-defined]
self.acme_server = acme_xdist['acme_server']
self.directory_url = acme_xdist['directory_url']
self.tls_alpn_01_port = acme_xdist['https_port'][self.worker_id]
self.http_01_port = acme_xdist['http_port'][self.worker_id]

View File

@@ -7,7 +7,6 @@ import shutil
import subprocess
import time
from typing import Generator
from typing import Iterable
from typing import Tuple
from typing import Type
@@ -82,11 +81,9 @@ def test_registration_override(context: IntegrationTestsContext) -> None:
context.certbot(['update_account', '--email', 'ex1@domain.org,ex2@domain.org'])
stdout2, _ = context.certbot(['show_account'])
# https://github.com/letsencrypt/boulder/issues/6144
if context.acme_server != 'boulder-v2':
assert 'example@domain.org' in stdout1, "New email should be present"
assert 'example@domain.org' not in stdout2, "Old email should not be present"
assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
assert 'example@domain.org' in stdout1, "New email should be present"
assert 'example@domain.org' not in stdout2, "Old email should not be present"
assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
def test_prepare_plugins(context: IntegrationTestsContext) -> None:
@@ -566,19 +563,15 @@ def test_default_rsa_size(context: IntegrationTestsContext) -> None:
assert_rsa_key(key1, 2048)
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
@pytest.mark.parametrize('curve,curve_cls', [
# Curve name, Curve class, ACME servers to skip
('secp256r1', SECP256R1, []),
('secp384r1', SECP384R1, []),
('secp521r1', SECP521R1, ['boulder-v2'])]
('secp256r1', SECP256R1),
('secp384r1', SECP384R1),
('secp521r1', SECP521R1)]
)
def test_ecdsa_curves(context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
skip_servers: Iterable[str]) -> None:
def test_ecdsa_curves(context: IntegrationTestsContext, curve: str,
curve_cls: Type[EllipticCurve]) -> None:
"""Test issuance for each supported ECDSA curve"""
if context.acme_server in skip_servers:
pytest.skip('ACME server {} does not support ECDSA curve {}'
.format(context.acme_server, curve))
domain = context.get_domain('curve')
context.certbot([
'certonly',
@@ -640,9 +633,6 @@ def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:
def test_ocsp_must_staple(context: IntegrationTestsContext) -> None:
"""Test that OCSP Must-Staple is correctly set in the generated certificate."""
if context.acme_server == 'pebble':
pytest.skip('Pebble does not support OCSP Must-Staple.')
certname = context.get_domain('must-staple')
context.certbot(['auth', '--must-staple', '--domains', certname])
@@ -710,17 +700,14 @@ def test_revoke_and_unregister(context: IntegrationTestsContext) -> None:
assert cert3 in stdout
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
('secp256r1', SECP256R1, []),
('secp384r1', SECP384R1, []),
('secp521r1', SECP521R1, ['boulder-v2'])]
@pytest.mark.parametrize('curve,curve_cls', [
('secp256r1', SECP256R1),
('secp384r1', SECP384R1),
('secp521r1', SECP521R1)]
)
def test_revoke_ecdsa_cert_key(
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
skip_servers: Iterable[str]) -> None:
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
"""Test revoking a certificate """
if context.acme_server in skip_servers:
pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
cert: str = context.get_domain('curve')
context.certbot([
'certonly',
@@ -738,17 +725,14 @@ def test_revoke_ecdsa_cert_key(
assert stdout.count('INVALID: REVOKED') == 1, 'Expected {0} to be REVOKED'.format(cert)
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
('secp256r1', SECP256R1, []),
('secp384r1', SECP384R1, []),
('secp521r1', SECP521R1, ['boulder-v2'])]
@pytest.mark.parametrize('curve,curve_cls', [
('secp256r1', SECP256R1),
('secp384r1', SECP384R1),
('secp521r1', SECP521R1)]
)
def test_revoke_ecdsa_cert_key_delete(
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
skip_servers: Iterable[str]) -> None:
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
"""Test revoke and deletion for each supported curve type"""
if context.acme_server in skip_servers:
pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
cert: str = context.get_domain('curve')
context.certbot([
'certonly',
@@ -913,7 +897,7 @@ def test_dry_run_deactivate_authzs(context: IntegrationTestsContext) -> None:
def test_preferred_chain(context: IntegrationTestsContext) -> None:
"""Test that --preferred-chain results in the correct chain.pem being produced"""
try:
issuers = misc.get_acme_issuers(context)
issuers = misc.get_acme_issuers()
except NotImplementedError:
pytest.skip('This ACME server does not support alternative issuers.')

View File

@@ -8,7 +8,6 @@ for a directory a specific configuration using built-in pytest hooks.
See https://docs.pytest.org/en/latest/reference.html#hook-reference
"""
import contextlib
import subprocess
import sys
from certbot_integration_tests.utils import acme_server as acme_lib
@@ -20,10 +19,6 @@ def pytest_addoption(parser):
Standard pytest hook to add options to the pytest parser.
:param parser: current pytest parser that will be used on the CLI
"""
parser.addoption('--acme-server', default='pebble',
choices=['boulder-v2', 'pebble'],
help='select the ACME server to use (boulder-v2, pebble), '
'defaulting to pebble')
parser.addoption('--dns-server', default='challtestsrv',
choices=['bind', 'challtestsrv'],
help='select the DNS server to use (bind, challtestsrv), '
@@ -69,7 +64,7 @@ def _setup_primary_node(config):
Setup the environment for integration tests.
This function will:
- check runtime compatibility (Docker, docker-compose, Nginx)
- check runtime compatibility (Docker, docker compose, Nginx)
- create a temporary workspace and the persistent GIT repositories space
- configure and start a DNS server using Docker, if configured
- configure and start paralleled ACME CA servers using Docker
@@ -80,22 +75,6 @@ def _setup_primary_node(config):
:param config: Configuration of the pytest primary node. Is modified by this function.
"""
# Check for runtime compatibility: some tools are required to be available in PATH
if 'boulder' in config.option.acme_server:
try:
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError('Error: docker is required in PATH to launch the integration tests on'
'boulder, but is not installed or not available for current user.')
try:
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
except (subprocess.CalledProcessError, OSError):
raise ValueError(
'Error: docker-compose is required in PATH to launch the integration tests, '
'but is not installed or not available for current user.'
)
# Parameter numprocesses is added to option by pytest-xdist
workers = ['primary'] if not config.option.numprocesses\
else ['gw{0}'.format(i) for i in range(config.option.numprocesses)]
@@ -115,8 +94,7 @@ def _setup_primary_node(config):
# By calling setup_acme_server we ensure that all necessary acme server instances will be
# fully started. This runtime is reflected by the acme_xdist returned.
acme_server = acme_lib.ACMEServer(config.option.acme_server, workers,
dns_server=acme_dns_server)
acme_server = acme_lib.ACMEServer(workers, dns_server=acme_dns_server)
config.add_cleanup(acme_server.stop)
print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
acme_server.start()

View File

@@ -1,9 +1,15 @@
# -*- coding: utf-8 -*-
"""General purpose nginx test configuration generator."""
import atexit
import getpass
import sys
from contextlib import ExitStack
from typing import Optional
import pkg_resources
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int, https_port: int,
@@ -23,10 +29,20 @@ def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int,
:return: a string containing the full nginx configuration
:rtype: str
"""
key_path = key_path if key_path \
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/key.pem')
cert_path = cert_path if cert_path \
else pkg_resources.resource_filename('certbot_integration_tests', 'assets/cert.pem')
if not key_path:
file_manager = ExitStack()
atexit.register(file_manager.close)
ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('key.pem'))
key_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))
if not cert_path:
file_manager = ExitStack()
atexit.register(file_manager.close)
ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('cert.pem'))
cert_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))
return '''\
# This error log will be written regardless of server scope error_log
# definitions, so we have to set this here in the main scope.

View File

@@ -1,17 +1,21 @@
"""Module to handle the context of RFC2136 integration tests."""
from contextlib import contextmanager
import sys
import tempfile
from typing import Generator
from typing import Iterable
from typing import Tuple
from pkg_resources import resource_filename
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.utils import certbot_call
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
"""Integration test context for certbot-dns-rfc2136"""
@@ -44,15 +48,14 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
:yields: Path to credentials file
:rtype: str
"""
src_file = resource_filename('certbot_integration_tests',
'assets/bind-config/rfc2136-credentials-{}.ini.tpl'
.format(label))
with open(src_file, 'r') as f:
contents = f.read().format(
server_address=self._dns_xdist['address'],
server_port=self._dns_xdist['port']
)
src_ref_file = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('bind-config').joinpath(f'rfc2136-credentials-{label}.ini.tpl'))
with importlib_resources.as_file(src_ref_file) as src_file:
with open(src_file, 'r') as f:
contents = f.read().format(
server_address=self._dns_xdist['address'],
server_port=self._dns_xdist['port']
)
with tempfile.NamedTemporaryFile('w+', prefix='rfc2136-creds-{}'.format(label),
suffix='.ini', dir=self.workspace) as fp:

View File

@@ -5,7 +5,6 @@ import argparse
import errno
import json
import os
from os.path import join
import shutil
import subprocess
import sys
@@ -18,14 +17,12 @@ from typing import Dict
from typing import List
from typing import Mapping
from typing import Optional
from typing import Tuple
from typing import Type
import requests
# pylint: disable=wildcard-import,unused-wildcard-import
from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils import pebble_artifacts
from certbot_integration_tests.utils import pebble_ocsp_server
from certbot_integration_tests.utils import proxy
from certbot_integration_tests.utils.constants import *
@@ -42,34 +39,30 @@ class ACMEServer:
ACMEServer is also a context manager, and so can be used to ensure ACME server is
started/stopped upon context enter/exit.
"""
def __init__(self, acme_server: str, nodes: List[str], http_proxy: bool = True,
def __init__(self, nodes: List[str], http_proxy: bool = True,
stdout: bool = False, dns_server: Optional[str] = None,
http_01_port: Optional[int] = None) -> None:
"""
Create an ACMEServer instance.
:param str acme_server: the type of acme server used (boulder-v2 or pebble)
:param list nodes: list of node names that will be setup by pytest xdist
:param bool http_proxy: if False do not start the HTTP proxy
:param bool stdout: if True stream all subprocesses stdout to standard stdout
:param str dns_server: if set, Pebble/Boulder will use it to resolve domains
:param str dns_server: if set, Pebble will use it to resolve domains
:param int http_01_port: port to use for http-01 validation; currently
only supported for pebble without an HTTP proxy
"""
self._construct_acme_xdist(acme_server, nodes)
self._construct_acme_xdist(nodes)
self._acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
self._proxy = http_proxy
self._workspace = tempfile.mkdtemp()
self._processes: List[subprocess.Popen] = []
self._stdout = sys.stdout if stdout else open(os.devnull, 'w') # pylint: disable=consider-using-with
self._dns_server = dns_server
self._preterminate_cmds_args: List[Tuple[Tuple[Any, ...], Dict[str, Any]]] = []
self._http_01_port = BOULDER_HTTP_01_PORT if self._acme_type == 'boulder' \
else DEFAULT_HTTP_01_PORT
self._http_01_port = DEFAULT_HTTP_01_PORT
if http_01_port:
if (self._acme_type == 'pebble' and self._proxy) or self._acme_type == 'boulder':
if self._proxy:
raise ValueError('Setting http_01_port is not currently supported when '
'using Boulder or the HTTP proxy')
'using the HTTP proxy')
self._http_01_port = http_01_port
def start(self) -> None:
@@ -77,10 +70,7 @@ class ACMEServer:
try:
if self._proxy:
self._prepare_http_proxy()
if self._acme_type == 'pebble':
self._prepare_pebble_server()
if self._acme_type == 'boulder':
self._prepare_boulder_server()
self._prepare_pebble_server()
except BaseException as e:
self.stop()
raise e
@@ -89,7 +79,6 @@ class ACMEServer:
"""Stop the test stack, and clean its resources"""
print('=> Tear down the test infrastructure...')
try:
self._run_preterminate_cmds()
for process in self._processes:
try:
process.terminate()
@@ -115,19 +104,14 @@ class ACMEServer:
traceback: Optional[TracebackType]) -> None:
self.stop()
def _construct_acme_xdist(self, acme_server: str, nodes: List[str]) -> None:
def _construct_acme_xdist(self, nodes: List[str]) -> None:
"""Generate and return the acme_xdist dict"""
acme_xdist: Dict[str, Any] = {'acme_server': acme_server}
acme_xdist: Dict[str, Any] = {}
# Directory and ACME port are set implicitly in the docker-compose.yml
# files of Boulder/Pebble.
if acme_server == 'pebble':
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
else: # boulder
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL
acme_xdist['challtestsrv_url'] = BOULDER_V2_CHALLTESTSRV_URL
# files of Pebble.
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
acme_xdist['http_port'] = dict(zip(nodes, range(5200, 5200 + len(nodes))))
acme_xdist['https_port'] = dict(zip(nodes, range(5100, 5100 + len(nodes))))
acme_xdist['other_port'] = dict(zip(nodes, range(5300, 5300 + len(nodes))))
@@ -161,11 +145,6 @@ class ACMEServer:
[pebble_path, '-config', pebble_config_path, '-dnsserver', dns_server, '-strict'],
env=environ)
# pebble_ocsp_server is imported here and not at the top of module in order to avoid a
# useless ImportError, in the case where cryptography dependency is too old to support
# ocsp, but Boulder is used instead of Pebble, so pebble_ocsp_server is not used. This is
# the typical situation of integration-certbot-oldest tox testenv.
from certbot_integration_tests.utils import pebble_ocsp_server
self._launch_process([sys.executable, pebble_ocsp_server.__file__])
# Wait for the ACME CA server to be up.
@@ -174,68 +153,6 @@ class ACMEServer:
print('=> Finished pebble instance deployment.')
def _prepare_boulder_server(self) -> None:
"""Configure and launch the Boulder server"""
print('=> Starting boulder instance deployment...')
instance_path = join(self._workspace, 'boulder')
# Load Boulder from git, that includes a docker-compose.yml ready for production.
process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
'--single-branch', '--depth=1', instance_path])
process.wait(MAX_SUBPROCESS_WAIT)
# Allow Boulder to ignore usual limit rate policies, useful for tests.
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
join(instance_path, 'test/rate-limit-policies.yml'))
if self._dns_server:
# Change Boulder config to use the provided DNS server
for suffix in ["", "-remote-a", "-remote-b"]:
with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'r') as f:
config = json.loads(f.read())
config['va']['dnsResolvers'] = [self._dns_server]
with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'w') as f:
f.write(json.dumps(config, indent=2, separators=(',', ': ')))
# This command needs to be run before we try to terminate running processes because
# docker-compose up doesn't always respond to SIGTERM. See
# https://github.com/certbot/certbot/pull/9435.
self._register_preterminate_cmd(['docker-compose', 'down'], cwd=instance_path)
# Boulder's Docker build generates artifacts owned by root with 0o744 permissions.
# If the ACME server was started by a normal user with access to the Docker
# daemon, that user will not be able to delete these artifacts from the host.
# We need to do it through a Docker container instead.
self._register_preterminate_cmd(['docker', 'run', '--rm', '-v',
'{0}:/workspace'.format(self._workspace), 'alpine', 'rm',
'-rf', '/workspace/boulder'])
try:
# Launch the Boulder server
self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
# Wait for the ACME CA server to be up.
print('=> Waiting for boulder instance to respond...')
misc.check_until_timeout(
self.acme_xdist['directory_url'], attempts=300)
if not self._dns_server:
# Configure challtestsrv to answer any A record request with ip of the docker host.
response = requests.post(
f'{BOULDER_V2_CHALLTESTSRV_URL}/set-default-ipv4',
json={'ip': '10.77.77.1'},
timeout=10
)
response.raise_for_status()
except BaseException:
# If we failed to set up boulder, print its logs.
print('=> Boulder setup failed. Boulder logs are:')
process = self._launch_process([
'docker-compose', 'logs'], cwd=instance_path, force_stderr=True
)
process.wait(MAX_SUBPROCESS_WAIT)
raise
print('=> Finished boulder instance deployment.')
def _prepare_http_proxy(self) -> None:
"""Configure and launch an HTTP proxy"""
print(f'=> Configuring the HTTP proxy on port {self._http_01_port}...')
@@ -260,26 +177,11 @@ class ACMEServer:
self._processes.append(process)
return process
def _register_preterminate_cmd(self, *args: Any, **kwargs: Any) -> None:
self._preterminate_cmds_args.append((args, kwargs))
def _run_preterminate_cmds(self) -> None:
for args, kwargs in self._preterminate_cmds_args:
process = self._launch_process(*args, **kwargs)
process.wait(MAX_SUBPROCESS_WAIT)
# It's unlikely to matter, but let's clear the list of cleanup commands
# once they've been run.
self._preterminate_cmds_args.clear()
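These helpers implement a simple register-then-run cleanup pattern; a standalone sketch of the same idea (names and the sample command are illustrative, not the certbot-ci code) is:

    import subprocess
    import sys
    from typing import Any, Dict, List, Tuple

    _cleanup_cmds: List[Tuple[Tuple[Any, ...], Dict[str, Any]]] = []

    def register_cleanup(*args: Any, **kwargs: Any) -> None:
        # Remember the subprocess arguments so the command can be replayed at teardown.
        _cleanup_cmds.append((args, kwargs))

    def run_cleanup(timeout: int = 60) -> None:
        # Run every registered command once, then forget them so a second teardown is a no-op.
        for args, kwargs in _cleanup_cmds:
            subprocess.run(*args, timeout=timeout, check=False, **kwargs)
        _cleanup_cmds.clear()

    register_cleanup([sys.executable, '-c', 'print("pre-terminate cleanup ran")'])
    run_cleanup()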
def main() -> None:
# pylint: disable=missing-function-docstring
parser = argparse.ArgumentParser(
description='CLI tool to start a local instance of Pebble or Boulder CA server.')
parser.add_argument('--server-type', '-s',
choices=['pebble', 'boulder-v2'], default='pebble',
help='type of CA server to start: can be Pebble or Boulder. '
'Pebble is used if not set.')
description='CLI tool to start a local instance of Pebble CA server.')
parser.add_argument('--dns-server', '-d',
help='specify the DNS server as `IP:PORT` to use by '
'Pebble; if not specified, a local mock DNS server will be used to '
@@ -290,8 +192,8 @@ def main() -> None:
args = parser.parse_args()
acme_server = ACMEServer(
args.server_type, [], http_proxy=False, stdout=True,
dns_server=args.dns_server, http_01_port=args.http_01_port,
[], http_proxy=False, stdout=True, dns_server=args.dns_server,
http_01_port=args.http_01_port,
)
try:

View File

@@ -6,11 +6,8 @@ import subprocess
import sys
from typing import Dict
from typing import List
from typing import Mapping
from typing import Tuple
import pkg_resources
import certbot_integration_tests
# pylint: disable=wildcard-import,unused-wildcard-import
from certbot_integration_tests.utils.constants import *
@@ -84,29 +81,14 @@ def _prepare_environ(workspace: str) -> Dict[str, str]:
return new_environ
def _compute_additional_args(workspace: str, environ: Mapping[str, str],
force_renew: bool) -> List[str]:
additional_args = []
output = subprocess.check_output(['certbot', '--version'],
universal_newlines=True, stderr=subprocess.STDOUT,
cwd=workspace, env=environ)
# Typical response is: output = 'certbot 0.31.0.dev0'
version_str = output.split(' ')[1].strip()
if pkg_resources.parse_version(version_str) >= pkg_resources.parse_version('0.30.0'):
additional_args.append('--no-random-sleep-on-renew')
if force_renew:
additional_args.append('--renew-by-default')
return additional_args
def _prepare_args_env(certbot_args: List[str], directory_url: str, http_01_port: int,
tls_alpn_01_port: int, config_dir: str, workspace: str,
force_renew: bool) -> Tuple[List[str], Dict[str, str]]:
new_environ = _prepare_environ(workspace)
additional_args = _compute_additional_args(workspace, new_environ, force_renew)
additional_args = ['--no-random-sleep-on-renew']
if force_renew:
additional_args.append('--renew-by-default')
command = [
'certbot',
@@ -114,7 +96,6 @@ def _prepare_args_env(certbot_args: List[str], directory_url: str, http_01_port:
'--no-verify-ssl',
'--http-01-port', str(http_01_port),
'--https-port', str(tls_alpn_01_port),
'--manual-public-ip-logging-ok',
'--config-dir', config_dir,
'--work-dir', os.path.join(workspace, 'work'),
'--logs-dir', os.path.join(workspace, 'logs'),

View File

@@ -1,10 +1,7 @@
"""Some useful constants to use throughout certbot-ci integration tests"""
DEFAULT_HTTP_01_PORT = 5002
BOULDER_HTTP_01_PORT = 80
TLS_ALPN_01_PORT = 5001
CHALLTESTSRV_PORT = 8055
BOULDER_V2_CHALLTESTSRV_URL = f'http://10.77.77.77:{CHALLTESTSRV_PORT}'
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'
PEBBLE_CHALLTESTSRV_URL = f'http://localhost:{CHALLTESTSRV_PORT}'

View File

@@ -15,11 +15,14 @@ from typing import List
from typing import Optional
from typing import Type
from pkg_resources import resource_filename
from certbot_integration_tests.utils import constants
BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.16"
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.20"
BIND_BIND_ADDRESS = ("127.0.0.1", 45953)
# A TCP DNS message which is a query for '. CH A' transaction ID 0xcb37. This is used
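The comment above describes a hand-assembled DNS-over-TCP probe; a rough sketch of that wire format (the real constant's exact flag bits may differ) is:

    import struct

    txn_id = 0xCB37
    flags = 0x0100                                   # standard query, recursion desired (assumed)
    header = struct.pack('!HHHHHH', txn_id, flags, 1, 0, 0, 0)  # QDCOUNT=1, other counts 0
    question = b'\x00' + struct.pack('!HH', 1, 3)    # root name '.', QTYPE=A (1), QCLASS=CH (3)
    udp_message = header + question
    tcp_message = struct.pack('!H', len(udp_message)) + udp_message  # TCP DNS adds a 2-byte length prefix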
@@ -80,13 +83,12 @@ class DNSServer:
def _configure_bind(self) -> None:
"""Configure the BIND9 server based on the prebaked configuration"""
bind_conf_src = resource_filename(
"certbot_integration_tests", "assets/bind-config"
)
for directory in ("conf", "zones"):
shutil.copytree(
os.path.join(bind_conf_src, directory), os.path.join(self.bind_root, directory)
)
ref = importlib_resources.files("certbot_integration_tests") / "assets" / "bind-config"
with importlib_resources.as_file(ref) as path:
for directory in ("conf", "zones"):
shutil.copytree(
os.path.join(path, directory), os.path.join(self.bind_root, directory)
)
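The hunks in this section replace pkg_resources.resource_filename with the importlib.resources API; a self-contained sketch of that pattern, with an illustrative destination directory, looks like:

    import shutil
    import sys
    import tempfile

    if sys.version_info >= (3, 9):
        import importlib.resources as importlib_resources
    else:
        import importlib_resources  # the PyPI backport exposes the same files()/as_file() API

    # files() returns a Traversable for a packaged resource; as_file() materializes it as a
    # real filesystem path for APIs such as shutil.copytree, even if the package is zipped.
    ref = importlib_resources.files("certbot_integration_tests") / "assets" / "bind-config"
    dest = tempfile.mkdtemp()
    with importlib_resources.as_file(ref) as path:
        for directory in ("conf", "zones"):
            shutil.copytree(path / directory, f"{dest}/{directory}")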
def _start_bind(self) -> None:
"""Launch the BIND9 server as a Docker container"""

View File

@@ -2,6 +2,7 @@
Misc module contains stateless functions that could be used during pytest execution,
or outside during setup/teardown of the integration tests environment.
"""
import atexit
import contextlib
import errno
import functools
@@ -30,27 +31,23 @@ from cryptography.hazmat.primitives.serialization import PrivateFormat
from cryptography.x509 import Certificate
from cryptography.x509 import load_pem_x509_certificate
from OpenSSL import crypto
import pkg_resources
import requests
from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext
from certbot_integration_tests.utils.constants import PEBBLE_ALTERNATE_ROOTS
from certbot_integration_tests.utils.constants import PEBBLE_MANAGEMENT_URL
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
RSA_KEY_TYPE = 'rsa'
ECDSA_KEY_TYPE = 'ecdsa'
def _suppress_x509_verification_warnings() -> None:
try:
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
except ImportError:
# Handle old versions of request with vendorized urllib3
# pylint: disable=no-member
from requests.packages.urllib3.exceptions import InsecureRequestWarning
requests.packages.urllib3.disable_warnings( # type: ignore[attr-defined]
InsecureRequestWarning)
import urllib3
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def check_until_timeout(url: str, attempts: int = 30) -> None:
@@ -125,7 +122,11 @@ def generate_test_file_hooks(config_dir: str, hook_probe: str) -> None:
:param str config_dir: current certbot config directory
:param str hook_probe: path to the hook probe to test hook scripts execution
"""
hook_path = pkg_resources.resource_filename('certbot_integration_tests', 'assets/hook.py')
file_manager = contextlib.ExitStack()
atexit.register(file_manager.close)
hook_path_ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('hook.py'))
hook_path = str(file_manager.enter_context(importlib_resources.as_file(hook_path_ref)))
for hook_dir in list_renewal_hooks_dirs(config_dir):
# We want an equivalent of bash `mkdir -p $HOOK_DIR`, that does not fail if one folder of
@@ -232,7 +233,7 @@ def generate_csr(domains: Iterable[str], key_path: str, csr_path: str,
req.add_extensions([san_constraint])
req.set_pubkey(key)
req.set_version(2)
req.set_version(0)
req.sign(key, 'sha256')
with open(csr_path, 'wb') as file_h:
@@ -260,9 +261,11 @@ def load_sample_data_path(workspace: str) -> str:
:returns: the path to the loaded sample data directory
:rtype: str
"""
original = pkg_resources.resource_filename('certbot_integration_tests', 'assets/sample-config')
copied = os.path.join(workspace, 'sample-config')
shutil.copytree(original, copied, symlinks=True)
original_ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('sample-config'))
with importlib_resources.as_file(original_ref) as original:
copied = os.path.join(workspace, 'sample-config')
shutil.copytree(original, copied, symlinks=True)
if os.name == 'nt':
# Fix the symlinks on Windows if GIT is not configured to create them upon checkout
@@ -299,16 +302,12 @@ def echo(keyword: str, path: Optional[str] = None) -> str:
os.path.basename(sys.executable), keyword, ' >> "{0}"'.format(path) if path else '')
def get_acme_issuers(context: IntegrationTestsContext) -> List[Certificate]:
def get_acme_issuers() -> List[Certificate]:
"""Gets the list of one or more issuer certificates from the ACME server used by the
context.
:param context: the testing context.
:return: the `list of x509.Certificate` representing the list of issuers.
"""
# TODO: in fact, Boulder has alternate chains in config-next/, just not yet in config/.
if context.acme_server != "pebble":
raise NotImplementedError()
_suppress_x509_verification_warnings()
issuers = []

View File

@@ -1,54 +1,76 @@
# pylint: disable=missing-module-docstring
import atexit
import io
import json
import os
import stat
from typing import Tuple
import sys
import zipfile
from contextlib import ExitStack
from typing import Optional, Tuple
import pkg_resources
import requests
from certbot_integration_tests.utils.constants import DEFAULT_HTTP_01_PORT
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT
PEBBLE_VERSION = 'v2.3.1'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')
if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources
PEBBLE_VERSION = 'v2.5.1'
def fetch(workspace: str, http_01_port: int = DEFAULT_HTTP_01_PORT) -> Tuple[str, str, str]:
# pylint: disable=missing-function-docstring
suffix = 'linux-amd64' if os.name != 'nt' else 'windows-amd64.exe'
file_manager = ExitStack()
atexit.register(file_manager.close)
pebble_path_ref = importlib_resources.files('certbot_integration_tests') / 'assets'
assets_path = str(file_manager.enter_context(importlib_resources.as_file(pebble_path_ref)))
pebble_path = _fetch_asset('pebble', suffix)
challtestsrv_path = _fetch_asset('pebble-challtestsrv', suffix)
pebble_config_path = _build_pebble_config(workspace, http_01_port)
pebble_path = _fetch_asset('pebble', assets_path)
challtestsrv_path = _fetch_asset('pebble-challtestsrv', assets_path)
pebble_config_path = _build_pebble_config(workspace, http_01_port, assets_path)
return pebble_path, challtestsrv_path, pebble_config_path
def _fetch_asset(asset: str, suffix: str) -> str:
asset_path = os.path.join(ASSETS_PATH, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
def _fetch_asset(asset: str, assets_path: str) -> str:
platform = 'linux-amd64'
base_url = 'https://github.com/letsencrypt/pebble/releases/download'
asset_path = os.path.join(assets_path, f'{asset}_{PEBBLE_VERSION}_{platform}')
if not os.path.exists(asset_path):
asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
.format(PEBBLE_VERSION, asset, suffix))
asset_url = f'{base_url}/{PEBBLE_VERSION}/{asset}-{platform}.zip'
response = requests.get(asset_url, timeout=30)
response.raise_for_status()
asset_data = _unzip_asset(response.content, asset)
if asset_data is None:
raise ValueError(f"zipfile {asset_url} didn't contain file {asset}")
with open(asset_path, 'wb') as file_h:
file_h.write(response.content)
file_h.write(asset_data)
os.chmod(asset_path, os.stat(asset_path).st_mode | stat.S_IEXEC)
return asset_path
def _build_pebble_config(workspace: str, http_01_port: int) -> str:
def _unzip_asset(zipped_data: bytes, asset_name: str) -> Optional[bytes]:
with zipfile.ZipFile(io.BytesIO(zipped_data)) as zip_file:
for entry in zip_file.filelist:
if not entry.is_dir() and entry.filename.endswith(asset_name):
return zip_file.read(entry)
return None
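A quick, self-contained way to exercise the same zipfile logic as _unzip_asset, using an in-memory archive with illustrative names:

    import io
    import zipfile

    # Build an in-memory zip containing one file, roughly the shape of a release asset.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, 'w') as zf:
        zf.writestr('pebble-linux-amd64/pebble', b'#!/bin/sh\necho pebble\n')

    # Scan the archive for the first non-directory entry whose name ends with the asset name.
    with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
        data = next(zf.read(entry) for entry in zf.filelist
                    if not entry.is_dir() and entry.filename.endswith('pebble'))
    assert data.startswith(b'#!/bin/sh')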
def _build_pebble_config(workspace: str, http_01_port: int, assets_path: str) -> str:
config_path = os.path.join(workspace, 'pebble-config.json')
with open(config_path, 'w') as file_h:
file_h.write(json.dumps({
'pebble': {
'listenAddress': '0.0.0.0:14000',
'managementListenAddress': '0.0.0.0:15000',
'certificate': os.path.join(ASSETS_PATH, 'cert.pem'),
'privateKey': os.path.join(ASSETS_PATH, 'key.pem'),
'certificate': os.path.join(assets_path, 'cert.pem'),
'privateKey': os.path.join(assets_path, 'key.pem'),
'httpPort': http_01_port,
'tlsPort': 5001,
'ocspResponderURL': 'http://127.0.0.1:{0}'.format(MOCK_OCSP_SERVER_PORT),

View File

@@ -5,6 +5,7 @@ to serve a mock OCSP responder during integration tests against Pebble.
"""
import datetime
import http.server as BaseHTTPServer
import pytz
import re
from typing import cast
from typing import Union
@@ -37,7 +38,9 @@ class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
verify=False, timeout=10)
issuer_cert = x509.load_pem_x509_certificate(request.content, default_backend())
content_len = int(self.headers.get('Content-Length'))
raw_content_len = self.headers.get('Content-Length')
assert isinstance(raw_content_len, str)
content_len = int(raw_content_len)
ocsp_request = ocsp.load_der_ocsp_request(self.rfile.read(content_len))
response = requests.get('{0}/cert-status-by-serial/{1}'.format(
@@ -52,7 +55,7 @@ class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
else:
data = response.json()
now = datetime.datetime.utcnow()
now = datetime.datetime.now(pytz.UTC)
cert = x509.load_pem_x509_certificate(data['Certificate'].encode(), default_backend())
if data['Status'] != 'Revoked':
ocsp_status = ocsp.OCSPCertStatus.GOOD
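The switch from datetime.utcnow() to datetime.now(pytz.UTC) matters because the former returns a naive timestamp while the latter is timezone-aware; a minimal illustration:

    import datetime
    import pytz

    naive = datetime.datetime.utcnow()       # tzinfo is None; deprecated since Python 3.12
    aware = datetime.datetime.now(pytz.UTC)  # carries tzinfo=UTC, safe to compare and serialize
    assert naive.tzinfo is None and aware.tzinfo is not None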

View File

@@ -1,20 +1,12 @@
from pkg_resources import parse_version
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
version = '0.32.0.dev0'
# setuptools 36.2+ is needed for environment marker support
min_setuptools_version='36.2'
# This conditional isn't necessary, but it provides better error messages to
# people who try to install this package with older versions of setuptools.
if parse_version(setuptools_version) < parse_version(min_setuptools_version):
raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')
install_requires = [
'coverage',
'cryptography',
'importlib_resources>=1.3.1; python_version < "3.9"',
'pyopenssl',
'pytest',
'pytest-cov',
@@ -26,9 +18,12 @@ install_requires = [
# installation on Linux.
'pywin32>=300 ; sys_platform == "win32"',
'pyyaml',
'requests',
'pytz>=2019.3',
# requests unvendored its dependencies in version 2.16.0 and this code relies on that for
# calling `urllib3.disable_warnings`.
'requests>=2.16.0',
'setuptools',
'types-python-dateutil'
'types-python-dateutil',
]
setup(
@@ -39,18 +34,18 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],

View File

@@ -1,9 +1,10 @@
FROM debian:buster
MAINTAINER Brad Warren <bmw@eff.org>
FROM docker.io/python:3.8-buster
LABEL maintainer="Brad Warren <bmw@eff.org>"
# This does not include the dependencies needed to build cryptography. See
# https://cryptography.io/en/latest/installation/#building-cryptography-on-linux
RUN apt-get update && \
apt install python3-dev python3-venv gcc libaugeas0 libssl-dev \
libffi-dev ca-certificates openssl -y
apt install python3-venv libaugeas0 -y
WORKDIR /opt/certbot/src

View File

@@ -75,7 +75,7 @@ def _get_server_root(config: str) -> str:
if os.path.isdir(os.path.join(config, name))]
if len(subdirs) != 1:
errors.Error("Malformed configuration directory {0}".format(config))
raise errors.Error("Malformed configuration directory {0}".format(config))
return os.path.join(config, subdirs[0].rstrip())

View File

@@ -19,7 +19,6 @@
server {
listen 80 default_server;
listen [::]:80 default_server ipv6only=on;
root /usr/share/nginx/html;
index index.html index.htm;

View File

@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'certbot',
@@ -14,22 +14,22 @@ setup(
name='certbot-compatibility-test',
version=version,
description="Compatibility tests for Certbot",
url='https://github.com/letsencrypt/letsencrypt',
url='https://github.com/certbot/certbot',
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-cloudflare/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-cloudflare/readthedocs.org.requirements.txt

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,14 +4,18 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'cloudflare>=1.5.1',
# for now, do not upgrade to cloudflare>=2.20 to avoid deprecation warnings and the breaking
# changes in version 3.0. see https://github.com/certbot/certbot/issues/9938
'cloudflare>=1.5.1, <2.20',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -19,11 +23,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +41,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -51,11 +50,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-digitalocean/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-digitalocean/readthedocs.org.requirements.txt

View File

@@ -2,6 +2,7 @@
import logging
from typing import Any
from typing import Callable
from typing import cast
from typing import Optional
import digitalocean
@@ -56,7 +57,7 @@ class Authenticator(dns_common.DNSAuthenticator):
def _get_digitalocean_client(self) -> "_DigitalOceanClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _DigitalOceanClient(self.credentials.conf('token'))
return _DigitalOceanClient(cast(str, self.credentials.conf('token')))
class _DigitalOceanClient:

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,14 +4,16 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'python-digitalocean>=1.11', # 1.15.0 or newer is recommended for TTL support
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-dnsimple/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-dnsimple/readthedocs.org.requirements.txt

View File

@@ -2,33 +2,30 @@
import logging
from typing import Any
from typing import Callable
from typing import Optional
from lexicon.providers import dnsimple
from requests import HTTPError
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
ACCOUNT_URL = 'https://dnsimple.com/user'
class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
"""DNS Authenticator for DNSimple
This Authenticator uses the DNSimple v2 API to fulfill a dns-01 challenge.
"""
description = 'Obtain certificates using a DNS TXT record (if you are using DNSimple for DNS).'
ttl = 60
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
self._add_provider_option('token',
f'User access token for DNSimple v2 API. (See {ACCOUNT_URL}.)',
'auth_token')
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
@@ -40,42 +37,9 @@ class Authenticator(dns_common.DNSAuthenticator):
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the DNSimple API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'DNSimple credentials INI file',
{
'token': 'User access token for DNSimple v2 API. (See {0}.)'.format(ACCOUNT_URL)
}
)
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_dnsimple_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_dnsimple_client().del_txt_record(domain, validation_name, validation)
def _get_dnsimple_client(self) -> "_DNSimpleLexiconClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _DNSimpleLexiconClient(self.credentials.conf('token'), self.ttl)
class _DNSimpleLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the DNSimple via Lexicon.
"""
def __init__(self, token: str, ttl: int) -> None:
super().__init__()
config = dns_common_lexicon.build_lexicon_config('dnssimple', {
'ttl': ttl,
}, {
'auth_token': token,
})
self.provider = dnsimple.Provider(config)
@property
def _provider_name(self) -> str:
return 'dnsimple'
def _handle_http_error(self, e: HTTPError, domain_name: str) -> errors.PluginError:
hint = None

View File

@@ -1,10 +1,9 @@
"""Tests for certbot_dns_dnsimple._internal.dns_dnsimple."""
import sys
import unittest
from unittest import mock
import sys
import pytest
from requests import Response
from requests.exceptions import HTTPError
from certbot.compat import os
@@ -16,7 +15,9 @@ TOKEN = 'foo'
class AuthenticatorTest(test_util.TempDirTestCase,
dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: ...', response=Response())
def setUp(self):
super().setUp()
@@ -31,23 +32,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
self.auth = Authenticator(self.config, "dnsimple")
self.mock_client = mock.MagicMock()
# _get_dnsimple_client | pylint: disable=protected-access
self.auth._get_dnsimple_client = mock.MagicMock(return_value=self.mock_client)
class DNSimpleLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: ...')
def setUp(self):
from certbot_dns_dnsimple._internal.dns_dnsimple import _DNSimpleLexiconClient
self.client = _DNSimpleLexiconClient(TOKEN, 0)
self.provider_mock = mock.MagicMock()
self.client.provider = self.provider_mock
if __name__ == "__main__":
sys.exit(pytest.main(sys.argv[1:] + [__file__])) # pragma: no cover

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,16 +4,18 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
# This version of lexicon is required to address the problem described in
# https://github.com/AnalogJ/lexicon/issues/387.
'dns-lexicon>=3.2.1',
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -21,11 +23,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -44,7 +41,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -53,11 +50,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-dnsmadeeasy/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-dnsmadeeasy/readthedocs.org.requirements.txt

View File

@@ -4,20 +4,17 @@ from typing import Any
from typing import Callable
from typing import Optional
from lexicon.providers import dnsmadeeasy
from requests import HTTPError
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
ACCOUNT_URL = 'https://cp.dnsmadeeasy.com/account/info'
class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
"""DNS Authenticator for DNS Made Easy
This Authenticator uses the DNS Made Easy API to fulfill a dns-01 challenge.
@@ -25,11 +22,17 @@ class Authenticator(dns_common.DNSAuthenticator):
description = ('Obtain certificates using a DNS TXT record (if you are using DNS Made Easy for '
'DNS).')
ttl = 60
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
self._add_provider_option('api-key',
'API key for DNS Made Easy account, '
f'obtained from {ACCOUNT_URL}',
'auth_username')
self._add_provider_option('secret-key',
'Secret key for DNS Made Easy account, '
f'obtained from {ACCOUNT_URL}',
'auth_token')
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
@@ -41,48 +44,9 @@ class Authenticator(dns_common.DNSAuthenticator):
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the DNS Made Easy API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'DNS Made Easy credentials INI file',
{
'api-key': 'API key for DNS Made Easy account, obtained from {0}'
.format(ACCOUNT_URL),
'secret-key': 'Secret key for DNS Made Easy account, obtained from {0}'
.format(ACCOUNT_URL)
}
)
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_dnsmadeeasy_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_dnsmadeeasy_client().del_txt_record(domain, validation_name, validation)
def _get_dnsmadeeasy_client(self) -> "_DNSMadeEasyLexiconClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _DNSMadeEasyLexiconClient(self.credentials.conf('api-key'),
self.credentials.conf('secret-key'),
self.ttl)
class _DNSMadeEasyLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the DNS Made Easy via Lexicon.
"""
def __init__(self, api_key: str, secret_key: str, ttl: int) -> None:
super().__init__()
config = dns_common_lexicon.build_lexicon_config('dnsmadeeasy', {
'ttl': ttl,
}, {
'auth_username': api_key,
'auth_token': secret_key,
})
self.provider = dnsmadeeasy.Provider(config)
@property
def _provider_name(self) -> str:
return 'dnsmadeeasy'
def _handle_http_error(self, e: HTTPError, domain_name: str) -> Optional[errors.PluginError]:
if domain_name in str(e) and str(e).startswith('404 Client Error: Not Found for url:'):

View File

@@ -1,10 +1,10 @@
"""Tests for certbot_dns_dnsmadeeasy._internal.dns_dnsmadeeasy."""
import sys
import unittest
from unittest import mock
import pytest
from requests import Response
from requests.exceptions import HTTPError
from certbot.compat import os
@@ -18,7 +18,10 @@ SECRET_KEY = 'bar'
class AuthenticatorTest(test_util.TempDirTestCase,
dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
DOMAIN_NOT_FOUND = HTTPError(f'404 Client Error: Not Found for url: {DOMAIN}.', response=Response())
LOGIN_ERROR = HTTPError(f'403 Client Error: Forbidden for url: {DOMAIN}.', response=Response())
def setUp(self):
super().setUp()
@@ -35,24 +38,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
self.auth = Authenticator(self.config, "dnsmadeeasy")
self.mock_client = mock.MagicMock()
# _get_dnsmadeeasy_client | pylint: disable=protected-access
self.auth._get_dnsmadeeasy_client = mock.MagicMock(return_value=self.mock_client)
class DNSMadeEasyLexiconClientTest(unittest.TestCase,
dns_test_common_lexicon.BaseLexiconClientTest):
DOMAIN_NOT_FOUND = HTTPError('404 Client Error: Not Found for url: {0}.'.format(DOMAIN))
LOGIN_ERROR = HTTPError('403 Client Error: Forbidden for url: {0}.'.format(DOMAIN))
def setUp(self):
from certbot_dns_dnsmadeeasy._internal.dns_dnsmadeeasy import _DNSMadeEasyLexiconClient
self.client = _DNSMadeEasyLexiconClient(API_KEY, SECRET_KEY, 0)
self.provider_mock = mock.MagicMock()
self.client.provider = self.provider_mock
if __name__ == "__main__":
sys.exit(pytest.main(sys.argv[1:] + [__file__])) # pragma: no cover

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,14 +4,16 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'dns-lexicon>=3.2.1',
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-gehirn/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-gehirn/readthedocs.org.requirements.txt

View File

@@ -4,20 +4,17 @@ from typing import Any
from typing import Callable
from typing import Optional
from lexicon.providers import gehirn
from requests import HTTPError
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
DASHBOARD_URL = "https://gis.gehirn.jp/"
class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
"""DNS Authenticator for Gehirn Infrastructure Service DNS
This Authenticator uses the Gehirn Infrastructure Service API to fulfill
@@ -26,11 +23,17 @@ class Authenticator(dns_common.DNSAuthenticator):
description = 'Obtain certificates using a DNS TXT record ' + \
'(if you are using Gehirn Infrastructure Service for DNS).'
ttl = 60
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
self._add_provider_option('api-token',
'API token for Gehirn Infrastructure Service '
f'API obtained from {DASHBOARD_URL}',
'auth_token')
self._add_provider_option('api-secret',
'API secret for Gehirn Infrastructure Service '
f'API obtained from {DASHBOARD_URL}',
'auth_secret')
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
@@ -42,50 +45,9 @@ class Authenticator(dns_common.DNSAuthenticator):
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the Gehirn Infrastructure Service API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'Gehirn Infrastructure Service credentials file',
{
'api-token': 'API token for Gehirn Infrastructure Service ' + \
'API obtained from {0}'.format(DASHBOARD_URL),
'api-secret': 'API secret for Gehirn Infrastructure Service ' + \
'API obtained from {0}'.format(DASHBOARD_URL),
}
)
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_gehirn_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_gehirn_client().del_txt_record(domain, validation_name, validation)
def _get_gehirn_client(self) -> "_GehirnLexiconClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _GehirnLexiconClient(
self.credentials.conf('api-token'),
self.credentials.conf('api-secret'),
self.ttl
)
class _GehirnLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the Gehirn Infrastructure Service via Lexicon.
"""
def __init__(self, api_token: str, api_secret: str, ttl: int) -> None:
super().__init__()
config = dns_common_lexicon.build_lexicon_config('gehirn', {
'ttl': ttl,
}, {
'auth_token': api_token,
'auth_secret': api_secret,
})
self.provider = gehirn.Provider(config)
@property
def _provider_name(self) -> str:
return 'gehirn'
def _handle_http_error(self, e: HTTPError, domain_name: str) -> Optional[errors.PluginError]:
if domain_name in str(e) and (str(e).startswith('404 Client Error: Not Found for url:')):

View File

@@ -1,10 +1,10 @@
"""Tests for certbot_dns_gehirn._internal.dns_gehirn."""
import sys
import unittest
from unittest import mock
import pytest
from requests import Response
from requests.exceptions import HTTPError
from certbot.compat import os
@@ -16,8 +16,12 @@ from certbot.tests import util as test_util
API_TOKEN = '00000000-0000-0000-0000-000000000000'
API_SECRET = 'MDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAw'
class AuthenticatorTest(test_util.TempDirTestCase,
dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
DOMAIN_NOT_FOUND = HTTPError(f'404 Client Error: Not Found for url: {DOMAIN}.', response=Response())
LOGIN_ERROR = HTTPError(f'401 Client Error: Unauthorized for url: {DOMAIN}.', response=Response())
def setUp(self):
super().setUp()
@@ -35,23 +39,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
self.auth = Authenticator(self.config, "gehirn")
self.mock_client = mock.MagicMock()
# _get_gehirn_client | pylint: disable=protected-access
self.auth._get_gehirn_client = mock.MagicMock(return_value=self.mock_client)
class GehirnLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
DOMAIN_NOT_FOUND = HTTPError('404 Client Error: Not Found for url: {0}.'.format(DOMAIN))
LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: {0}.'.format(DOMAIN))
def setUp(self):
from certbot_dns_gehirn._internal.dns_gehirn import _GehirnLexiconClient
self.client = _GehirnLexiconClient(API_TOKEN, API_SECRET, 0)
self.provider_mock = mock.MagicMock()
self.client.provider = self.provider_mock
if __name__ == "__main__":
sys.exit(pytest.main(sys.argv[1:] + [__file__])) # pragma: no cover

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,14 +4,16 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'dns-lexicon>=3.2.1',
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-google/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-google/readthedocs.org.requirements.txt

View File

@@ -14,7 +14,14 @@ Named Arguments
======================================== =====================================
``--dns-google-credentials`` Google Cloud Platform credentials_
JSON file.
(Required - Optional on Google Compute Engine)
(Required if not using `Application Default
Credentials <https://cloud.google.com/docs/authentication/
application-default-credentials>`_.)
``--dns-google-project`` The ID of the Google Cloud project that the Google
Cloud DNS managed zone(s) reside in.
(Default: project that the Google credentials_ belong to)
``--dns-google-propagation-seconds`` The number of seconds to wait for DNS
to propagate before asking the ACME
server to verify the DNS record.
@@ -25,45 +32,37 @@ Named Arguments
Credentials
-----------
Use of this plugin requires Google Cloud Platform API credentials
for an account with the following permissions:
Use of this plugin requires Google Cloud Platform credentials with the ability to modify the Cloud
DNS managed zone(s) for which certificates are being issued.
* ``dns.changes.create``
* ``dns.changes.get``
* ``dns.changes.list``
* ``dns.managedZones.get``
* ``dns.managedZones.list``
* ``dns.resourceRecordSets.create``
* ``dns.resourceRecordSets.delete``
* ``dns.resourceRecordSets.list``
* ``dns.resourceRecordSets.update``
In most cases, configuring credentials for Certbot will require `creating a service account
<https://cloud.google.com/iam/docs/service-accounts-create>`_, and then either `granting permissions
with predefined roles`_ or `granting permissions with custom roles`_ using IAM.
(The closest role is `dns.admin <https://cloud.google.com/dns/docs/
access-control#dns.admin>`_).
If the above permissions are assigned at the `resource level <https://cloud
.google.com/dns/docs/zones/iam-per-resource-zones>`_, the same user must
have, at the PROJECT level, the following permissions:
Providing Credentials
^^^^^^^^^^^^^^^^^^^^^
* ``dns.managedZones.get``
* ``dns.managedZones.list``
The preferred method of providing credentials is to `set up Application Default Credentials
<https://cloud.google.com/docs/authentication/provide-credentials-adc>`_ (ADC) in the environment
that Certbot is running in.
(The closest role is `dns.reader <https://cloud.google.com/dns/docs/
access-control#dns.reader>`_).
If you are running Certbot on Google Cloud then a service account can be assigned directly to most
types of workload, including `Compute Engine VMs <https://cloud.google.com/compute/docs/access/
create-enable-service-accounts-for-instances>`_, `Kubernetes Engine Pods <https://cloud.google.com/
kubernetes-engine/docs/how-to/workload-identity>`_, `Cloud Run jobs <https://cloud.google.com/run
/docs/securing/service-identity>`_, `Cloud Functions <https://cloud.google.com/functions/docs/
securing/function-identity>`_, and `Cloud Builds <https://cloud.google.com/build/docs/securing-
builds/configure-user-specified-service-accounts>`_.
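As an illustrative sketch (the instance, zone, service account, and project names below are
placeholders), an existing service account can be attached to a Compute Engine VM with the Cloud DNS
scope using the gcloud CLI. Note that an instance's service account can only be changed while the
instance is stopped.

.. code-block:: bash
   :caption: Attaching a service account to an existing Compute Engine VM (illustrative sketch)

   # The service account of a Compute Engine VM can only be changed while the
   # instance is stopped.
   gcloud compute instances stop my-certbot-vm --zone=us-central1-a

   # Attach the service account and grant the Cloud DNS read/write scope.
   gcloud compute instances set-service-account my-certbot-vm \\
       --zone=us-central1-a \\
       --service-account=certbot-dns@my-project.iam.gserviceaccount.com \\
       --scopes=https://www.googleapis.com/auth/ndev.clouddns.readwrite

   gcloud compute instances start my-certbot-vm --zone=us-central1-a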
Google provides instructions for `creating a service account <https://developers
.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount>`_ and
`information about the required permissions <https://cloud.google.com/dns/access
-control#permissions_and_roles>`_. If you're running on Google Compute Engine,
you can `assign the service account to the instance <https://cloud.google.com/
compute/docs/access/create-enable-service-accounts-for-instances>`_ which
is running certbot. A credentials file is not required in this case, as they
are automatically obtained by certbot through the `metadata service
<https://cloud.google.com/compute/docs/storing-retrieving-metadata>`_ .
If you are not running Certbot on Google Cloud then a credentials file should be provided using the
``--dns-google-credentials`` command-line argument. Google provides documentation for `creating
service account keys <https://cloud.google.com/iam/docs/keys-create-delete#creating>`_, which is the
most common method of using a service account outside of Google Cloud.
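As an illustrative sketch (the service account name and file path below are placeholders), such a
key file can be created with the gcloud CLI and stored with restrictive permissions:

.. code-block:: bash
   :caption: Creating a service account key file (illustrative sketch)

   # Create a JSON key for an existing service account and restrict access to it
   # so that Certbot does not warn about overly permissive file permissions.
   gcloud iam service-accounts keys create ~/.secrets/certbot/google.json \\
       --iam-account=certbot-dns@my-project.iam.gserviceaccount.com
   chmod 600 ~/.secrets/certbot/google.json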
.. code-block:: json
:name: credentials.json
:caption: Example credentials file:
:name: credentials-sa.json
:caption: Example service account key file:
{
"type": "service_account",
@@ -78,12 +77,8 @@ are automatically obtained by certbot through the `metadata service
"client_x509_cert_url": "..."
}
The path to this file can be provided interactively or using the
``--dns-google-credentials`` command-line argument. Certbot records the path
to this file for use during renewal, but does not store the file's contents.
.. caution::
You should protect these API credentials as you would a password. Users who
You should protect these credentials as you would a password. Users who
can read this file can use these credentials to issue some types of API calls
on your behalf, limited by the permissions assigned to the account. Users who
can cause Certbot to run using these credentials can complete a ``dns-01``
@@ -97,35 +92,132 @@ file. This warning will be emitted each time Certbot uses the credentials file,
including for renewal, and cannot be silenced except by addressing the issue
(e.g., by using a command like ``chmod 600`` to restrict access to the file).
If you are running Certbot within another cloud platform, a CI platform, or any other platform that
supports issuing OpenID Connect Tokens, then you may also have the option of securely authenticating
with `workload identity federation <https://cloud.google.com/iam/docs/workload-identity-
federation>`_. Instructions are generally available for most platforms, including `AWS or Azure
<https://cloud.google.com/iam/docs/workload-identity-federation-with-other-clouds>`_, `GitHub
Actions <https://cloud.google.com/blog/products/identity-security/enabling-keyless-authentication
-from-github-actions>`_, and `GitLab CI <https://docs.gitlab.com/ee/ci/cloud_services/
google_cloud/>`_.
Granting Permissions with Predefined Roles
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The simplest method of granting the required permissions to the user or service account that Certbot
is authenticating with is to use either of these predefined role strategies:
* `dns.admin <https://cloud.google.com/dns/docs/access-control#dns.admin>`_ against the *DNS
zone(s)* that Certbot will be issuing certificates for.
* `dns.reader <https://cloud.google.com/dns/docs/access-control#dns.reader>`_ against the *project*
containing the relevant DNS zones.
*or*
* `dns.admin <https://cloud.google.com/dns/docs/access-control#dns.admin>`_ against the *project*
containing the relevant DNS zones.
For instructions on how to grant roles, please read the Google provided documentation for `granting
access roles against a project <https://cloud.google.com/iam/docs/granting-changing-revoking-access
#single-role>`_ and `granting access roles against zones <https://cloud.google.com/dns/docs/zones/
iam-per-resource-zones#set_access_control_policy_for_a_specific_resource>`_.
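As an illustrative sketch (the project ID and service account below are placeholders), the
project-level grants can be made with the gcloud CLI; zone-level grants follow the zone
documentation linked above.

.. code-block:: bash
   :caption: Granting predefined roles at the project level (illustrative sketch)

   # Strategy 1 (preferred): dns.reader on the project, combined with dns.admin
   # granted separately on each relevant zone.
   gcloud projects add-iam-policy-binding my-project \\
       --member="serviceAccount:certbot-dns@my-project.iam.gserviceaccount.com" \\
       --role="roles/dns.reader"

   # Strategy 2 (simpler but broader; see the caution below): dns.admin on the
   # whole project.
   gcloud projects add-iam-policy-binding my-project \\
       --member="serviceAccount:certbot-dns@my-project.iam.gserviceaccount.com" \\
       --role="roles/dns.admin"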
.. caution::
Granting the ``dns.admin`` role at the project level can present a significant security risk. It
provides full administrative access to all DNS zones within the project, granting the ability to
perform any action up to and including deleting all zones within a project.
Granting Permissions with Custom Roles
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Custom roles are an alternative to predefined roles that provide the ability to define fine-grained
permission sets for specific use cases. They should generally be used when it is desirable to adhere
to the principle of least privilege, such as within production or other security-sensitive
workloads.
The following is an example strategy for granting granular permissions to Certbot using custom
roles. If you are not already familiar with how to do so, Google provides documentation for
`creating a custom IAM role <https://cloud.google.com/iam/docs/creating-custom-roles#creating>`_.
Firstly, create a custom role containing the permissions required to make DNS record updates. We
suggest naming the custom role ``Certbot - Zone Editor`` with the ID ``certbot.zoneEditor``. The
following permissions are required:
* ``dns.changes.create``
* ``dns.changes.get``
* ``dns.changes.list``
* ``dns.resourceRecordSets.create``
* ``dns.resourceRecordSets.delete``
* ``dns.resourceRecordSets.list``
* ``dns.resourceRecordSets.update``
Next, create a custom role granting Certbot the ability to discover DNS zones. We suggest naming the
custom role ``Certbot - Zone Lister`` with the ID ``certbot.zoneLister``. The following permissions
are required:
* ``dns.managedZones.get``
* ``dns.managedZones.list``
Finally, grant the custom roles to the user or service account that Certbot is authenticating with:
* Grant your custom ``Certbot - Zone Editor`` role against the *DNS zone(s)* that Certbot will be
issuing certificates for.
* Grant your custom ``Certbot - Zone Lister`` role against the *project* containing the relevant DNS
zones.
For instructions on how to grant roles, please read the Google provided documentation for `granting
access roles against a project <https://cloud.google.com/iam/docs/granting-changing-revoking-access
#single-role>`_ and `granting access roles against zones <https://cloud.google.com/dns/docs/zones/
iam-per-resource-zones#set_access_control_policy_for_a_specific_resource>`_.
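As an illustrative sketch (the project ID and role IDs below are placeholders matching the suggested
names above), the two custom roles could be created with the gcloud CLI before being granted as
described in the linked documentation:

.. code-block:: bash
   :caption: Creating the custom roles (illustrative sketch)

   # Custom role containing only the record-editing permissions Certbot needs.
   gcloud iam roles create certbot.zoneEditor --project=my-project \\
       --title="Certbot - Zone Editor" \\
       --permissions=dns.changes.create,dns.changes.get,dns.changes.list,dns.resourceRecordSets.create,dns.resourceRecordSets.delete,dns.resourceRecordSets.list,dns.resourceRecordSets.update

   # Custom role that can only discover managed zones.
   gcloud iam roles create certbot.zoneLister --project=my-project \\
       --title="Certbot - Zone Lister" \\
       --permissions=dns.managedZones.get,dns.managedZones.list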
Examples
--------
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``
:caption: To acquire a certificate for ``example.com``, providing a credentials file
certbot certonly \\
--dns-google \\
--dns-google-credentials ~/.secrets/certbot/google.json \\
-d example.com
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``, where ADC is available and
a credentials file is not required
certbot certonly \\
--dns-google \\
-d example.com
.. code-block:: bash
:caption: To acquire a single certificate for both ``example.com`` and
``www.example.com``
certbot certonly \\
--dns-google \\
--dns-google-credentials ~/.secrets/certbot/google.json \\
-d example.com \\
-d www.example.com
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``, where the managed DNS
zone resides in another Google Cloud project
certbot certonly \\
--dns-google \\
--dns-google-credentials ~/.secrets/certbot/google-project-test-foo.json \\
--dns-google-project test-bar \\
-d example.com
.. code-block:: bash
:caption: To acquire a certificate for ``example.com``, waiting 120 seconds
for DNS propagation
certbot certonly \\
--dns-google \\
--dns-google-credentials ~/.secrets/certbot/google.json \\
--dns-google-propagation-seconds 120 \\
-d example.com

View File

@@ -1,21 +1,22 @@
"""DNS Authenticator for Google Cloud DNS."""
import json
import logging
from typing import Any
from typing import Callable
from typing import Dict
from typing import Optional
import google.auth
from google.auth import exceptions as googleauth_exceptions
from googleapiclient import discovery
from googleapiclient import errors as googleapiclient_errors
import httplib2
from oauth2client.service_account import ServiceAccountCredentials
from certbot import errors
from certbot.plugins import dns_common
logger = logging.getLogger(__name__)
ADC_URL = 'https://cloud.google.com/docs/authentication/application-default-credentials'
ACCT_URL = 'https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount'
PERMISSIONS_URL = 'https://cloud.google.com/dns/access-control#permissions_and_roles'
METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
@@ -31,15 +32,23 @@ class Authenticator(dns_common.DNSAuthenticator):
description = ('Obtain certificates using a DNS TXT record (if you are using Google Cloud DNS '
'for DNS).')
ttl = 60
google_client = None
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
default_propagation_seconds: int = 60) -> None:
super().add_parser_arguments(add, default_propagation_seconds=60)
add('credentials',
help=('Path to Google Cloud DNS service account JSON file. (See {0} for' +
'information about creating a service account and {1} for information about the' +
'required permissions.)').format(ACCT_URL, PERMISSIONS_URL),
help=('Path to Google Cloud DNS service account JSON file to use instead of relying on'
' Application Default Credentials (ADC). (See {0} for information about ADC, {1}'
' for information about creating a service account, and {2} for information about'
' the permissions required to modify Cloud DNS records.)')
.format(ADC_URL, ACCT_URL, PERMISSIONS_URL),
default=None)
add('project',
help=('The ID of the Google Cloud project that the Google Cloud DNS managed zone(s)' +
' reside in. This will be determined automatically if not specified.'),
default=None)
def more_info(self) -> str:
@@ -47,22 +56,19 @@ class Authenticator(dns_common.DNSAuthenticator):
'the Google Cloud DNS API.'
def _setup_credentials(self) -> None:
if self.conf('credentials') is None:
try:
# use project_id query to check for availability of google metadata server
# we won't use the result but know we're not on GCP when an exception is thrown
_GoogleClient.get_project_id()
except (ValueError, httplib2.ServerNotFoundError):
raise errors.PluginError('Unable to get Google Cloud Metadata and no credentials'
' specified. Automatic credential lookup is only '
'available on Google Cloud Platform. Please configure'
' credentials using --dns-google-credentials <file>')
else:
if self.conf('credentials') is not None:
self._configure_file('credentials',
'path to Google Cloud DNS service account JSON file')
dns_common.validate_file_permissions(self.conf('credentials'))
try:
self._get_google_client()
except googleauth_exceptions.DefaultCredentialsError as e:
raise errors.PluginError('Authentication using Google Application Default Credentials '
'failed ({}). Please configure credentials using'
' --dns-google-credentials <file>'.format(e))
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_google_client().add_txt_record(domain, validation_name, validation, self.ttl)
@@ -70,7 +76,10 @@ class Authenticator(dns_common.DNSAuthenticator):
self._get_google_client().del_txt_record(domain, validation_name, validation, self.ttl)
def _get_google_client(self) -> '_GoogleClient':
return _GoogleClient(self.conf('credentials'))
if self.google_client is None:
self.google_client = _GoogleClient(self.conf('credentials'), self.conf('project'))
return self.google_client
class _GoogleClient:
@@ -79,20 +88,31 @@ class _GoogleClient:
"""
def __init__(self, account_json: Optional[str] = None,
dns_project_id: Optional[str] = None,
dns_api: Optional[discovery.Resource] = None) -> None:
scopes = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']
credentials = None
project_id = None
if account_json is not None:
try:
credentials = ServiceAccountCredentials.from_json_keyfile_name(account_json, scopes)
with open(account_json) as account:
self.project_id = json.load(account)['project_id']
except Exception as e:
credentials, project_id = google.auth.load_credentials_from_file(
account_json, scopes=scopes)
except googleauth_exceptions.GoogleAuthError as e:
raise errors.PluginError(
"Error parsing credentials file '{}': {}".format(account_json, e))
"Error loading credentials file '{}': {}".format(account_json, e))
else:
credentials = None
self.project_id = self.get_project_id()
credentials, project_id = google.auth.default(scopes=scopes)
if dns_project_id is not None:
project_id = dns_project_id
if not project_id:
raise errors.PluginError('The Google Cloud project could not be automatically '
'determined. Please configure it using --dns-google-project'
' <project>.')
self.project_id = project_id
if not dns_api:
self.dns = discovery.build('dns', 'v1',
@@ -286,32 +306,9 @@ class _GoogleClient:
for zone in zones:
zone_id = zone['id']
if 'privateVisibilityConfig' not in zone:
if zone['visibility'] == "public":
logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
return zone_id
raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'
.format(domain, zone_dns_name_guesses))
@staticmethod
def get_project_id() -> str:
"""
Query the google metadata service for the current project ID
This only works on Google Cloud Platform
:raises ServerNotFoundError: Not running on Google Compute or DNS not available
:raises ValueError: Server is found, but response code is not 200
:returns: project id
"""
url = '{0}project/project-id'.format(METADATA_URL)
# Request an access token from the metadata server.
http = httplib2.Http()
r, content = http.request(url, headers=METADATA_HEADERS)
if r.status != 200:
raise ValueError("Invalid status code: {0}".format(r))
if isinstance(content, bytes):
return content.decode()
return content

View File

@@ -4,10 +4,10 @@ import sys
import unittest
from unittest import mock
from google.auth import exceptions as googleauth_exceptions
from googleapiclient import discovery
from googleapiclient.errors import Error
from googleapiclient.http import HttpMock
from httplib2 import ServerNotFoundError
import pytest
from certbot import errors
@@ -20,7 +20,7 @@ from certbot.tests import util as test_util
ACCOUNT_JSON_PATH = '/not/a/real/path.json'
API_ERROR = Error()
PROJECT_ID = "test-test-1"
SCOPES = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']
class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthenticatorTest):
@@ -34,22 +34,25 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic
super().setUp()
self.config = mock.MagicMock(google_credentials=path,
google_project=PROJECT_ID,
google_propagation_seconds=0) # don't wait during tests
self.auth = Authenticator(self.config, "google")
self.mock_client = mock.MagicMock()
# _get_google_client | pylint: disable=protected-access
self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)
@test_util.patch_display_util()
def test_perform(self, unused_mock_get_utility):
# _get_google_client | pylint: disable=protected-access
self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)
self.auth.perform([self.achall])
expected = [mock.call.add_txt_record(DOMAIN, '_acme-challenge.'+DOMAIN, mock.ANY, mock.ANY)]
assert expected == self.mock_client.mock_calls
def test_cleanup(self):
# _get_google_client | pylint: disable=protected-access
self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)
# _attempt_cleanup | pylint: disable=protected-access
self.auth._attempt_cleanup = True
self.auth.cleanup([self.achall])
@@ -57,13 +60,27 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic
expected = [mock.call.del_txt_record(DOMAIN, '_acme-challenge.'+DOMAIN, mock.ANY, mock.ANY)]
assert expected == self.mock_client.mock_calls
@mock.patch('httplib2.Http.request', side_effect=ServerNotFoundError)
@test_util.patch_display_util()
def test_without_auth(self, unused_mock_get_utility, unused_mock):
def test_without_auth(self, unused_mock_get_utility):
self.auth._get_google_client = mock.MagicMock(side_effect=googleauth_exceptions.DefaultCredentialsError)
self.config.google_credentials = None
with pytest.raises(PluginError):
self.auth.perform([self.achall])
@mock.patch('certbot_dns_google._internal.dns_google._GoogleClient')
def test_get_google_client(self, client_mock):
test_client = mock.MagicMock()
client_mock.return_value = test_client
self.auth._get_google_client()
assert client_mock.called
assert self.auth.google_client is test_client
def test_get_google_client_cached(self):
test_client = mock.MagicMock()
self.auth.google_client = test_client
assert self.auth._get_google_client() is test_client
class GoogleClientTest(unittest.TestCase):
record_name = "foo"
@@ -71,6 +88,7 @@ class GoogleClientTest(unittest.TestCase):
record_ttl = 42
zone = "ZONE_ID"
change = "an-id"
visibility = "public"
def _setUp_client_with_mock(self, zone_request_side_effect, rrs_list_side_effect=None):
from certbot_dns_google._internal.dns_google import _GoogleClient
@@ -81,7 +99,7 @@ class GoogleClientTest(unittest.TestCase):
http_mock = HttpMock(discovery_file, {'status': '200'})
dns_api = discovery.build('dns', 'v1', http=http_mock)
client = _GoogleClient(ACCOUNT_JSON_PATH, dns_api)
client = _GoogleClient(ACCOUNT_JSON_PATH, None, dns_api)
# Setup
mock_mz = mock.MagicMock()
@@ -107,32 +125,67 @@ class GoogleClientTest(unittest.TestCase):
return client, mock_changes
@mock.patch('googleapiclient.discovery.build')
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('certbot_dns_google._internal.dns_google._GoogleClient.get_project_id')
def test_client_without_credentials(self, get_project_id_mock, credential_mock,
unused_discovery_mock):
@mock.patch('google.auth.default')
def test_client_with_default_credentials(self, credential_mock, discovery_mock):
test_credentials = mock.MagicMock()
credential_mock.return_value = (test_credentials, PROJECT_ID)
from certbot_dns_google._internal.dns_google import _GoogleClient
_GoogleClient(None)
assert not credential_mock.called
assert get_project_id_mock.called
client = _GoogleClient(None)
credential_mock.assert_called_once_with(scopes=SCOPES)
assert client.project_id == PROJECT_ID
discovery_mock.assert_called_once_with('dns', 'v1',
credentials=test_credentials,
cache_discovery=False)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('googleapiclient.discovery.build')
@mock.patch('google.auth.load_credentials_from_file')
def test_client_with_json_credentials(self, credential_mock, discovery_mock):
test_credentials = mock.MagicMock()
credential_mock.return_value = (test_credentials, PROJECT_ID)
from certbot_dns_google._internal.dns_google import _GoogleClient
client = _GoogleClient(ACCOUNT_JSON_PATH)
credential_mock.assert_called_once_with(ACCOUNT_JSON_PATH, scopes=SCOPES)
assert credential_mock.called
assert client.project_id == PROJECT_ID
discovery_mock.assert_called_once_with('dns', 'v1',
credentials=test_credentials,
cache_discovery=False)
@mock.patch('google.auth.load_credentials_from_file')
def test_client_bad_credentials_file(self, credential_mock):
credential_mock.side_effect = ValueError('Some exception buried in oauth2client')
credential_mock.side_effect = googleauth_exceptions.DefaultCredentialsError('Some exception buried in google.auth')
with pytest.raises(errors.PluginError) as exc_info:
self._setUp_client_with_mock([])
assert str(exc_info.value) == \
"Error parsing credentials file '/not/a/real/path.json': " \
"Some exception buried in oauth2client"
"Error loading credentials file '/not/a/real/path.json': " \
"Some exception buried in google.auth"
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
def test_client_missing_project_id(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), "")
with pytest.raises(errors.PluginError) as exc_info:
self._setUp_client_with_mock([])
assert str(exc_info.value) == \
"The Google Cloud project could not be automatically determined. " \
"Please configure it using --dns-google-project <project>."
@mock.patch('googleapiclient.discovery.build')
@mock.patch('google.auth.default')
def test_client_with_project_id(self, credential_mock, unused_discovery_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
from certbot_dns_google._internal.dns_google import _GoogleClient
client = _GoogleClient(None, "test-project-2")
assert credential_mock.called
assert client.project_id == "test-project-2"
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
@mock.patch('certbot_dns_google._internal.dns_google._GoogleClient.get_project_id')
def test_add_txt_record(self, get_project_id_mock, credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
credential_mock.assert_called_once_with('/not/a/real/path.json', mock.ANY)
assert not get_project_id_mock.called
def test_add_txt_record(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
credential_mock.assert_called_once_with('/not/a/real/path.json', scopes=SCOPES)
client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
@@ -153,11 +206,13 @@ class GoogleClientTest(unittest.TestCase):
managedZone=self.zone,
project=PROJECT_ID)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_and_poll(self, unused_credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
def test_add_txt_record_and_poll(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
changes.create.return_value.execute.return_value = {'status': 'pending', 'id': self.change}
changes.get.return_value.execute.return_value = {'status': 'done'}
@@ -171,12 +226,34 @@ class GoogleClientTest(unittest.TestCase):
managedZone=self.zone,
project=PROJECT_ID)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_delete_old(self, unused_credential_mock):
def test_add_txt_record_and_poll_split_horizon(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': '{zone}-private'.format(zone=self.zone), 'dnsName': DOMAIN, 'visibility': 'private'},{'id': '{zone}-public'.format(zone=self.zone), 'dnsName': DOMAIN, 'visibility': self.visibility}]}])
changes.create.return_value.execute.return_value = {'status': 'pending', 'id': self.change}
changes.get.return_value.execute.return_value = {'status': 'done'}
client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
changes.create.assert_called_with(body=mock.ANY,
managedZone='{zone}-public'.format(zone=self.zone),
project=PROJECT_ID)
changes.get.assert_called_with(changeId=self.change,
managedZone='{zone}-public'.format(zone=self.zone),
project=PROJECT_ID)
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_delete_old(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}])
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
# pylint: disable=line-too-long
mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
with mock.patch(mock_get_rrs) as mock_rrs:
@@ -187,12 +264,14 @@ class GoogleClientTest(unittest.TestCase):
assert "sample-txt-contents" in deletions["rrdatas"]
assert self.record_ttl == deletions["ttl"]
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_delete_old_ttl_case(self, unused_credential_mock):
def test_add_txt_record_delete_old_ttl_case(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}])
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
# pylint: disable=line-too-long
mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
with mock.patch(mock_get_rrs) as mock_rrs:
@@ -204,50 +283,60 @@ class GoogleClientTest(unittest.TestCase):
assert "sample-txt-contents" in deletions["rrdatas"]
assert custom_ttl == deletions["ttl"] #otherwise HTTP 412
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_noop(self, unused_credential_mock):
def test_add_txt_record_noop(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}])
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
client.add_txt_record(DOMAIN, "_acme-challenge.example.org",
"example-txt-contents", self.record_ttl)
assert changes.create.called is False
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_error_during_zone_lookup(self, unused_credential_mock):
def test_add_txt_record_error_during_zone_lookup(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock(API_ERROR)
with pytest.raises(errors.PluginError):
client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_zone_not_found(self, unused_credential_mock):
def test_add_txt_record_zone_not_found(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock([{'managedZones': []},
{'managedZones': []}])
with pytest.raises(errors.PluginError):
client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_add_txt_record_error_during_add(self, unused_credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
def test_add_txt_record_error_during_add(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
changes.create.side_effect = API_ERROR
with pytest.raises(errors.PluginError):
client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_del_txt_record_multi_rrdatas(self, unused_credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
def test_del_txt_record_multi_rrdatas(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
# pylint: disable=line-too-long
mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
with mock.patch(mock_get_rrs) as mock_rrs:
@@ -282,11 +371,13 @@ class GoogleClientTest(unittest.TestCase):
managedZone=self.zone,
project=PROJECT_ID)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_del_txt_record_single_rrdatas(self, unused_credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
def test_del_txt_record_single_rrdatas(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
# pylint: disable=line-too-long
mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
with mock.patch(mock_get_rrs) as mock_rrs:
@@ -311,106 +402,84 @@ class GoogleClientTest(unittest.TestCase):
managedZone=self.zone,
project=PROJECT_ID)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_del_txt_record_error_during_zone_lookup(self, unused_credential_mock):
def test_del_txt_record_error_during_zone_lookup(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock(API_ERROR)
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
changes.create.assert_not_called()
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_del_txt_record_zone_not_found(self, unused_credential_mock):
def test_del_txt_record_zone_not_found(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': []},
{'managedZones': []}])
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
changes.create.assert_not_called()
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_del_txt_record_error_during_delete(self, unused_credential_mock):
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
def test_del_txt_record_error_during_delete(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
changes.create.side_effect = API_ERROR
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_get_existing_found(self, unused_credential_mock):
def test_get_existing_found(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}])
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
# Record name mocked in setUp
found = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
assert found["rrdatas"] == ["\"example-txt-contents\""]
assert found["ttl"] == 60
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_get_existing_not_found(self, unused_credential_mock):
def test_get_existing_not_found(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}])
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
not_found = client.get_existing_txt_rrset(self.zone, "nonexistent.tld")
assert not_found is None
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_get_existing_with_error(self, unused_credential_mock):
def test_get_existing_with_error(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}], API_ERROR)
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}], API_ERROR)
# Record name mocked in setUp
found = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
assert found is None
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
@mock.patch('google.auth.load_credentials_from_file')
@mock.patch('certbot_dns_google._internal.dns_google.open',
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
def test_get_existing_fallback(self, unused_credential_mock):
def test_get_existing_fallback(self, credential_mock):
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
client, unused_changes = self._setUp_client_with_mock(
[{'managedZones': [{'id': self.zone}]}], API_ERROR)
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}], API_ERROR)
rrset = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
assert not rrset
def test_get_project_id(self):
from certbot_dns_google._internal.dns_google import _GoogleClient
response = DummyResponse()
response.status = 200
with mock.patch('httplib2.Http.request', return_value=(response, 'test-test-1')):
project_id = _GoogleClient.get_project_id()
assert project_id == 'test-test-1'
with mock.patch('httplib2.Http.request', return_value=(response, b'test-test-1')):
project_id = _GoogleClient.get_project_id()
assert project_id == 'test-test-1'
failed_response = DummyResponse()
failed_response.status = 404
with mock.patch('httplib2.Http.request',
return_value=(failed_response, "some detailed http error response")):
with pytest.raises(ValueError):
_GoogleClient.get_project_id()
with mock.patch('httplib2.Http.request', side_effect=ServerNotFoundError):
with pytest.raises(ServerNotFoundError):
_GoogleClient.get_project_id()
class DummyResponse:
"""
Dummy object to create a fake HTTPResponse (the actual one requires a socket and we only
need the status attribute)
"""
def __init__(self):
self.status = 200
if __name__ == "__main__":
sys.exit(pytest.main(sys.argv[1:] + [__file__])) # pragma: no cover

View File

@@ -389,6 +389,11 @@
"items": {
"type": "string"
}
},
"visibility": {
"type": "string",
"description": "The zone's visibility: public zones are exposed to the Internet, while private zones are visible only to Virtual Private Cloud resources.",
"default": "public"
}
}
},

View File

@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,17 +4,17 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'google-api-python-client>=1.5.5',
'oauth2client>=4.0',
'google-api-python-client>=1.6.5',
'google-auth>=2.16.0',
'setuptools>=41.6.0',
# already a dependency of google-api-python-client, but added for consistency
'httplib2'
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -22,11 +22,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -45,7 +40,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -54,11 +49,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-linode/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-linode/readthedocs.org.requirements.txt

View File

@@ -3,16 +3,12 @@ import logging
import re
from typing import Any
from typing import Callable
from typing import cast
from typing import Optional
from typing import Union
from lexicon.providers import linode
from lexicon.providers import linode4
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
@@ -20,7 +16,7 @@ API_KEY_URL = 'https://manager.linode.com/profile/api'
API_KEY_URL_V4 = 'https://cloud.linode.com/profile/tokens'
class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
"""DNS Authenticator for Linode
This Authenticator uses the Linode API to fulfill a dns-01 challenge.
@@ -30,7 +26,10 @@ class Authenticator(dns_common.DNSAuthenticator):
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
self._add_provider_option('key',
'API key for Linode account, '
f'obtained from {API_KEY_URL} or {API_KEY_URL_V4}',
'auth_token')
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
@@ -42,29 +41,13 @@ class Authenticator(dns_common.DNSAuthenticator):
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the Linode API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'Linode credentials INI file',
{
'key': 'API key for Linode account, obtained from {0} or {1}'
.format(API_KEY_URL, API_KEY_URL_V4)
}
)
@property
def _provider_name(self) -> str:
if not hasattr(self, '_credentials'): # pragma: no cover
self._setup_credentials()
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_linode_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_linode_client().del_txt_record(domain, validation_name, validation)
def _get_linode_client(self) -> '_LinodeLexiconClient':
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
api_key = self.credentials.conf('key')
api_version: Optional[Union[str, int]] = self.credentials.conf('version')
if api_version == '':
api_version = None
api_key = cast(str, self._credentials.conf('key'))
api_version: Optional[Union[str, int]] = self._credentials.conf('version')
if not api_version:
api_version = 3
@@ -77,34 +60,19 @@ class Authenticator(dns_common.DNSAuthenticator):
else:
api_version = int(api_version)
return _LinodeLexiconClient(api_key, api_version)
class _LinodeLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the Linode API.
"""
def __init__(self, api_key: str, api_version: int) -> None:
super().__init__()
self.api_version = api_version
if api_version == 3:
config = dns_common_lexicon.build_lexicon_config('linode', {}, {
'auth_token': api_key,
})
self.provider = linode.Provider(config)
return 'linode'
elif api_version == 4:
config = dns_common_lexicon.build_lexicon_config('linode4', {}, {
'auth_token': api_key,
})
return 'linode4'
self.provider = linode4.Provider(config)
else:
raise errors.PluginError('Invalid api version specified: {0}. (Supported: 3, 4)'
.format(api_version))
raise errors.PluginError(f'Invalid api version specified: {api_version}. (Supported: 3, 4)')
def _setup_credentials(self) -> None:
self._credentials = self._configure_credentials(
key='credentials',
label='Credentials INI file for linode DNS authenticator',
required_variables={item[0]: item[1] for item in self._provider_options},
)
def _handle_general_error(self, e: Exception, domain_name: str) -> Optional[errors.PluginError]:
if not str(e).startswith('Domain not found'):

View File

@@ -19,7 +19,9 @@ TOKEN_V4 = '0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef'
class AuthenticatorTest(test_util.TempDirTestCase,
dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
DOMAIN_NOT_FOUND = Exception('Domain not found')
def setUp(self):
super().setUp()
@@ -32,10 +34,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
self.auth = Authenticator(self.config, "linode")
self.mock_client = mock.MagicMock()
# _get_linode_client | pylint: disable=protected-access
self.auth._get_linode_client = mock.MagicMock(return_value=self.mock_client)
# pylint: disable=protected-access
def test_api_version_3_detection(self):
path = os.path.join(self.tempdir, 'file_3_auto.ini')
@@ -44,9 +42,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 3 == client.api_version
assert auth._provider_name == "linode"
# pylint: disable=protected-access
def test_api_version_4_detection(self):
@@ -56,9 +53,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 4 == client.api_version
assert auth._provider_name == "linode4"
# pylint: disable=protected-access
def test_api_version_3_detection_empty_version(self):
@@ -68,9 +64,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 3 == client.api_version
assert auth._provider_name == "linode"
# pylint: disable=protected-access
def test_api_version_4_detection_empty_version(self):
@@ -80,9 +75,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 4 == client.api_version
assert auth._provider_name == "linode4"
# pylint: disable=protected-access
def test_api_version_3_manual(self):
@@ -92,9 +86,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 3 == client.api_version
assert auth._provider_name == "linode"
# pylint: disable=protected-access
def test_api_version_4_manual(self):
@@ -104,9 +97,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
client = auth._get_linode_client()
assert 4 == client.api_version
assert auth._provider_name == "linode4"
# pylint: disable=protected-access
def test_api_version_error(self):
@@ -116,35 +108,9 @@ class AuthenticatorTest(test_util.TempDirTestCase,
config = mock.MagicMock(linode_credentials=path,
linode_propagation_seconds=0)
auth = Authenticator(config, "linode")
auth._setup_credentials()
with pytest.raises(errors.PluginError):
auth._get_linode_client()
class LinodeLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
DOMAIN_NOT_FOUND = Exception('Domain not found')
def setUp(self):
from certbot_dns_linode._internal.dns_linode import _LinodeLexiconClient
self.client = _LinodeLexiconClient(TOKEN, 3)
self.provider_mock = mock.MagicMock()
self.client.provider = self.provider_mock
class Linode4LexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
DOMAIN_NOT_FOUND = Exception('Domain not found')
def setUp(self):
from certbot_dns_linode._internal.dns_linode import _LinodeLexiconClient
self.client = _LinodeLexiconClient(TOKEN, 4)
self.provider_mock = mock.MagicMock()
self.client.provider = self.provider_mock
assert auth._provider_name == "linode4"
if __name__ == "__main__":

View File

@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
'sphinx.ext.intersphinx',
'sphinx.ext.todo',
'sphinx.ext.coverage',
'sphinx.ext.viewcode']
'sphinx.ext.viewcode',
'sphinx_rtd_theme']
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally
import sphinx_rtd_theme
html_theme = 'sphinx_rtd_theme'
html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'
# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the

View File

@@ -4,14 +4,16 @@ import sys
from setuptools import find_packages
from setuptools import setup
version = '2.6.0.dev0'
version = '2.12.0.dev0'
install_requires = [
'dns-lexicon>=3.2.1',
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]
if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
else:
install_requires.extend([
# We specify the minimum acme and certbot version as the current plugin
# version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
f'acme>={version}',
f'certbot>={version}',
])
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Unset SNAP_BUILD when building wheels '
'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
install_requires.append('packaging')
docs_extras = [
'Sphinx>=1.0', # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.7',
python_requires='>=3.8',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',

View File

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
# Required
version: 2
# Set the OS, Python version and other tools you might need
build:
os: ubuntu-22.04
tools:
python: "3.11"
# You can also specify other tool versions:
# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: certbot-dns-luadns/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true
# Optionally build your docs in additional formats such as PDF and ePub
formats:
- pdf
- epub
# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: certbot-dns-luadns/readthedocs.org.requirements.txt

View File

@@ -2,33 +2,33 @@
import logging
from typing import Any
from typing import Callable
from typing import Optional
from lexicon.providers import luadns
from requests import HTTPError
from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration
logger = logging.getLogger(__name__)
ACCOUNT_URL = 'https://api.luadns.com/settings'
class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
"""DNS Authenticator for LuaDNS
This Authenticator uses the LuaDNS API to fulfill a dns-01 challenge.
"""
description = 'Obtain certificates using a DNS TXT record (if you are using LuaDNS for DNS).'
ttl = 60
def __init__(self, *args: Any, **kwargs: Any) -> None:
super().__init__(*args, **kwargs)
self.credentials: Optional[CredentialsConfiguration] = None
self._add_provider_option('email',
'email address associated with LuaDNS account',
'auth_username')
self._add_provider_option('token',
f'API token for LuaDNS account, obtained from {ACCOUNT_URL}',
'auth_token')
@classmethod
def add_parser_arguments(cls, add: Callable[..., None],
@@ -40,46 +40,9 @@ class Authenticator(dns_common.DNSAuthenticator):
return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
'the LuaDNS API.'
def _setup_credentials(self) -> None:
self.credentials = self._configure_credentials(
'credentials',
'LuaDNS credentials INI file',
{
'email': 'email address associated with LuaDNS account',
'token': 'API token for LuaDNS account, obtained from {0}'.format(ACCOUNT_URL)
}
)
def _perform(self, domain: str, validation_name: str, validation: str) -> None:
self._get_luadns_client().add_txt_record(domain, validation_name, validation)
def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
self._get_luadns_client().del_txt_record(domain, validation_name, validation)
def _get_luadns_client(self) -> "_LuaDNSLexiconClient":
if not self.credentials: # pragma: no cover
raise errors.Error("Plugin has not been prepared.")
return _LuaDNSLexiconClient(self.credentials.conf('email'),
self.credentials.conf('token'),
self.ttl)
class _LuaDNSLexiconClient(dns_common_lexicon.LexiconClient):
"""
Encapsulates all communication with the LuaDNS via Lexicon.
"""
def __init__(self, email: str, token: str, ttl: int) -> None:
super().__init__()
config = dns_common_lexicon.build_lexicon_config('luadns', {
'ttl': ttl,
}, {
'auth_username': email,
'auth_token': token,
})
self.provider = luadns.Provider(config)
@property
def _provider_name(self) -> str:
return 'luadns'
def _handle_http_error(self, e: HTTPError, domain_name: str) -> errors.PluginError:
hint = None

View File

@@ -1,10 +1,9 @@
 """Tests for certbot_dns_luadns._internal.dns_luadns."""

 import sys
-import unittest
 from unittest import mock

 import pytest
+from requests import Response
 from requests.exceptions import HTTPError

 from certbot.compat import os
@@ -17,7 +16,9 @@ TOKEN = 'foo'

 class AuthenticatorTest(test_util.TempDirTestCase,
-                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
+                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
+
+    LOGIN_ERROR = HTTPError("401 Client Error: Unauthorized for url: ...", response=Response())

     def setUp(self):
         super().setUp()
@@ -32,23 +33,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         self.auth = Authenticator(self.config, "luadns")

-        self.mock_client = mock.MagicMock()
-        # _get_luadns_client | pylint: disable=protected-access
-        self.auth._get_luadns_client = mock.MagicMock(return_value=self.mock_client)
-
-
-class LuaDNSLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
-
-    LOGIN_ERROR = HTTPError("401 Client Error: Unauthorized for url: ...")
-
-    def setUp(self):
-        from certbot_dns_luadns._internal.dns_luadns import _LuaDNSLexiconClient
-
-        self.client = _LuaDNSLexiconClient(EMAIL, TOKEN, 0)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
-

 if __name__ == "__main__":
     sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
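One detail worth noting in the migrated test: the old BaseLexiconClientTest built LOGIN_ERROR as a bare HTTPError, while the variant kept for BaseLexiconDNSAuthenticatorTest attaches a requests.Response object, presumably so the error can be inspected like a real HTTP failure. A minimal sketch of that construction follows; the 401 status code set below is an illustrative assumption, not something shown in the diff.

from requests import Response
from requests.exceptions import HTTPError

resp = Response()
resp.status_code = 401  # simulate an authentication failure from the DNS API
login_error = HTTPError("401 Client Error: Unauthorized for url: ...", response=resp)

# The exception now carries the response it came from, much like errors raised
# by requests' own raise_for_status().
assert login_error.response is resp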

@@ -16,7 +16,7 @@
 # add these directories to sys.path here. If the directory is relative to the
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #
-import os
+# import os
 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']

 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #
 # https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'

 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the
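The conf.py hunks above drop the READTHEDOCS environment check in favour of loading sphinx_rtd_theme as a normal Sphinx extension and setting it unconditionally. Pieced together from the hunks (with all surrounding settings omitted), the resulting theme configuration is roughly:

# docs/conf.py (excerpt, reconstructed from the hunks above)
extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.intersphinx',
              'sphinx.ext.todo',
              'sphinx.ext.coverage',
              'sphinx.ext.viewcode',
              'sphinx_rtd_theme']

# The theme is set the same way locally and on readthedocs.org; no
# html_theme_path or conditional import is needed any more.
html_theme = 'sphinx_rtd_theme'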

@@ -4,14 +4,16 @@ import sys
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
-    'dns-lexicon>=3.2.1',
+    'dns-lexicon>=3.14.1',
     'setuptools>=41.6.0',
 ]

-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
         # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')

 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
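Read together, the setup.py hunks flip the SNAP_BUILD branch around and drop both the bdist_wheel guard and the trailing SNAP_BUILD check. A sketch of how the dependency logic reads after this diff, with comments and unrelated metadata omitted:

import os

version = '2.12.0.dev0'

install_requires = [
    'dns-lexicon>=3.14.1',
    'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
    # When SNAP_BUILD is set, only "packaging" is added.
    install_requires.append('packaging')
else:
    # Otherwise acme and certbot are pinned to at least the plugin's own version.
    install_requires.extend([
        f'acme>={version}',
        f'certbot>={version}',
    ])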

@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
    # You can also specify other tool versions:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: certbot-dns-nsone/docs/conf.py
  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
  # builder: "dirhtml"
  # Fail on all warnings to avoid broken references
  fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
formats:
  - pdf
  - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: certbot-dns-nsone/readthedocs.org.requirements.txt

Some files were not shown because too many files have changed in this diff.