Compare commits

...

527 Commits

Author SHA1 Message Date
Mads Jensen
4e81ebcabc Use better asserts. Added notes to style guide. 2020-11-12 22:21:54 +01:00
Brad Warren
f15f4f9838 Add certbot renew --key-type test (#8447)
* Test certbot renew --key-type

* Fix typo
2020-11-12 00:06:50 +01:00
Adrien Ferrand
8f5787008d Handle unexpected key type migration. (#8435)
Fixes #8365

This PR adds a check when `certbot certonly` or `certbot run` is called for a certificate that already exists and would be replaced. As described in #8365, this check ensures that the user will not change the key type of their certificate (e.g. ECDSA to RSA) without explicit approval (setting both `--cert-name` and `--key-type` explicitly), since RSA is the default when no key type is specified.

* Handle unexpected key type migration.

* Update certbot-ci/certbot_integration_tests/certbot_tests/test_main.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-11-11 12:36:16 -08:00
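A minimal, hypothetical sketch of the guard described in #8435 above (the function and parameter names are illustrative, not Certbot's internal API):

```python
def check_key_type_change(existing_key_type, requested_key_type,
                          cert_name_set_by_user, key_type_set_by_user):
    """Refuse to silently change an existing certificate's key type."""
    if existing_key_type == requested_key_type:
        return  # nothing would change, no confirmation needed
    if not (cert_name_set_by_user and key_type_set_by_user):
        raise RuntimeError(
            "The existing certificate uses a {0} key but a {1} key was "
            "requested. Set both --cert-name and --key-type explicitly to "
            "confirm the change.".format(existing_key_type, requested_key_type))


# Example: an ECDSA certificate would be replaced by the RSA default.
try:
    check_key_type_change("ecdsa", "rsa",
                          cert_name_set_by_user=False, key_type_set_by_user=False)
except RuntimeError as err:
    print(err)
```
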
Adrien Ferrand
198f5a99bc Merge pull request #8431 from atombrella/ec_dsa_2163
Implements support for ECDSA keys. Fixes #2163.
2020-11-04 23:43:46 +01:00
Mads Jensen
47c1045f6d Implements support for ECDSA keys. Fixes #2163.
Thanks to @pahrohfit and @Tomoyuki-GH for previous efforts to implement
support for this.

Co-Authored-By: Robert Dailey <rob@wargam.es>
Co-Authored-By: Tomoyuki-GH <55397638+Tomoyuki-GH@users.noreply.github.com>
2020-11-04 15:16:48 +01:00
Brad Warren
e570e8ad32 Generate plugin snap configs as needed (#8411)
While reviewing https://github.com/certbot/certbot/pull/8404, it occurred to me that we're keeping both the generated files and the script used to generate them in `git`. Keeping both around seems unnecessary and is almost asking for the files to get out of sync at some point in the future. I fixed that by removing the files, adding them to `.gitignore`, and updating `build_remote.py` to generate them as needed.

* Remove generated files.

* Add generated files to gitignore.

* Reuse generate_dnsplugins_all.sh in build_remote
2020-10-30 14:12:57 -07:00
Brad Warren
df138d0027 Document that logs aren't always created. (#8410) 2020-10-30 13:15:47 -07:00
Brad Warren
9567352002 Update tools/snap/generate_* comments. (#8412) 2020-10-30 13:08:57 -07:00
Brad Warren
6c7b99f7e0 Remove fedora test farm tests (#8415)
While working on https://github.com/certbot/certbot/issues/8400, I noticed our Fedora AMIs are quite out of date. I considered updating them and what we could do to avoid the AMIs becoming so out-of-date in the future, but I think we don't actually need these tests.

I pulled a new count of Certbot users by OS and we have fewer than 7,000 Fedora users, meaning only ~0.26% of Certbot users run Fedora. (I think Fedora is a popular desktop OS, but not as popular a server OS, which is where Certbot normally runs.)

Also, Certbot is regularly updated on Fedora, including Fedora Rawhide, the rolling release version of Fedora which is similar to Debian sid/unstable. Rawhide changes far too frequently for it to make sense for us to run tests there in my opinion, but that also means that many problems, such as Certbot's unit tests failing to run because of Fedora changes, will be caught there by our Fedora maintainers before we'd even see them. This is how https://github.com/certbot/certbot/issues/7106 became an issue and how I learned [Certbot worked on Python 3.9 before we could run tests on it](https://github.com/certbot/certbot/issues/8134#issuecomment-655106169).

Because of all this, I think we should just simplify things and remove these tests. If a problem arises in the future, we can always add them back.
2020-10-28 15:52:20 -07:00
ohemorange
3673ca77a5 Fix LXD setup in snap README (#8416)
Fixes #8409.

Change the line in the README to allow `sudo /snap/bin/lxd.migrate -yes` to fail (for example, if there's nothing to migrate), but the whole command to succeed.

I tested this on a clean Focal install and confirmed it works.
2020-10-28 15:51:16 -07:00
Brad Warren
bb45c9aa41 Add Ubuntu 20.10 test farm tests (#8414)
Fixes https://github.com/certbot/certbot/issues/8400.

I had to switch the package installed in `apacheconftest` to `libapache2-mod-wsgi-py3` because Ubuntu 20.10 removed the Python 2 version of this module.

I didn't add this AMI to `tests/letstest/auto_targets.yaml` because like Ubuntu 20.04, `certbot-auto` has never worked on the OS.

* Add Ubuntu 20.10 test farm tests

* Try Python 3 WSGI
2020-10-28 15:08:16 -07:00
Brad Warren
4c347f5576 Switch to using python directly (#8413)
Windows installer tests failed last night because they suddenly switched to Python 3.9.

This is happening despite bf07ec20b0/.azure-pipelines/templates/jobs/packaging-jobs.yml (L92-L95) sitting just a few lines earlier than what I modified in this PR.

I think what's going on is `py -3` is finding and preferring the newer version of Python 3, Python 3.9, which was [just recently added to the image](https://github.com/actions/virtual-environments/issues/1740#issuecomment-717849233).

The [documentation for UsePythonVersion](https://docs.microsoft.com/en-us/azure/devops/pipelines/tasks/tool/use-python-version?view=azure-devops) says that:

> After running this task with "Add to PATH," the `python` command in subsequent scripts will be for the highest available version of the interpreter matching the version spec and architecture.

So let's just use `python` instead of `py`.
2020-10-28 14:12:32 -07:00
alexzorin
bf07ec20b0 run: don't report new certs when only re-installing (#8392) 2020-10-27 12:48:07 -07:00
ohemorange
fc864543a7 Simplify/document snap creation (#8404)
This PR adds the following documentation improvements to fix https://github.com/certbot/certbot/issues/7958:

- Simplify building external plugins
- Separate out certbot snap instructions from plugin instructions
- Mention that dnsimple is just an example for the plugin instructions
- Mention remote build for other architectures
- Mention snap doc exists elsewhere in developer guide (`contributing.rst`)

* Set up generate_dnsplugins_all.sh for all files and parametrize snapcraft and postrefreshhook files

* Create constraints file in the generate_dnsplugins_all script

* Separate out plugin and certbot snaps and update instructions

* Add remote build instructions

* Add pointers to the README to contributing.rst
2020-10-27 10:22:40 -07:00
Mark Dumay
4fa1df3075 Added links for gehirn and sakuracloud DNS plugins (#8406) 2020-10-26 17:22:00 -07:00
Adrien Ferrand
cfd0a6ff1f Remove usage of buildkit (#8408)
Fixes #8355 

During the troubleshooting of #8355, I came to the conclusion that using buildkit was creating the problem. Without it, all Docker images are built correctly. Initially buildkit was enabled to avoid a build problem in Azure Pipelines, but my recent tests also showed that this problem is no longer present.

You can find more details about the troubleshooting and reasoning in #8355.

As a consequence, this PR disables the usage of buildkit, which solves the issue.
2020-10-26 15:20:27 -07:00
Adrien Ferrand
00ed56afd6 Execute basic integration tests against Certbot dockers during CI (#8396)
Fixes #8202

This PR adds an Azure Pipelines job to execute `certbot plugins --prepare` for each Docker image created during the CI on amd64.

* Prepare basic integration tests for certbot dockers

* Add a displayName for the integration tests task
2020-10-23 11:02:35 -07:00
alexzorin
b6e3a3ad02 register: remove report_new_account, use logger (#8393) 2020-10-22 17:33:45 -07:00
Brad Warren
c250957ab0 Add .envrc. (#8382) 2020-10-22 14:01:30 -07:00
alexzorin
4eb0b560c5 manual: deprecate --manual-public-ip-logging-ok (#8381)
* manual: deprecate --manual-public-ip-logging-ok

* remove unused cli.report_config_interaction code

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-10-22 12:12:54 -07:00
Brad Warren
cb916a0682 Deprecate certbot-auto on Debian systems (#8354)
Fixes #8294.

* Deprecate certbot-auto on Debian systems.

* Add changelog entry.

* Remove le_auto_xenial test.

* Update certbot-auto test farm tests.

* Add comments explaining expected behavior.
2020-10-20 16:25:20 -07:00
Brad Warren
88386e8c82 Add external snap docs and clean up dev docs (#8356)
* Add external snap docs and clean up dev docs

* Correctly refer to content identifiers

* Expand plugin interface docs and add line breaks
2020-10-19 15:30:30 -07:00
alexzorin
a64e1f0129 changelog: move entry to the right section (#8378) 2020-10-15 13:23:36 -07:00
osirisinferi
fea176449c Add confirmation before certificate delete (#8349)
* Ask confirmation before deleting cert

* Changelog

* Fix lint and preserve non-interactive deletion

* Improve English

* Integrate message into yesno() without logger

* Reduce if/else into oneliner

* Expand "certificate(s)" in `get_certnames`

* Address comments

* Update certbot/certbot/_internal/cert_manager.py
2020-10-16 06:18:01 +11:00
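A rough sketch of the confirmation flow added in #8349 above, assuming a generic yes/no prompt helper (the names below are illustrative, not the actual `cert_manager` code):

```python
def confirm_delete(cert_names, yesno, interactive=True):
    """Ask before deleting certificates, but keep non-interactive deletion working."""
    if not interactive:
        return True  # preserve existing behavior for scripted/non-interactive runs
    prompt = "Are you sure you want to delete the certificate(s): {0}?".format(
        ", ".join(cert_names))
    return yesno(prompt)


# Example usage with a trivial prompt implementation.
if confirm_delete(["example.com"], yesno=lambda msg: input(msg + " [y/N] ").lower() == "y"):
    print("deleting...")
```
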
Nobuki Fujii
ff03e34c70 Enable dns-inwx link in third-party plugins (#8374) 2020-10-14 12:31:25 -07:00
Nobuki Fujii
6fc832677e Add dns-lightsail to third-party plugins (#8372) 2020-10-13 14:59:51 -07:00
osirisinferi
725870d558 Add query timeout for dns-rfc2136 plugin (#8268)
* Add timeout to DNS query function calls

* Modify tests to account for new timeout variable

* Add change to CHANGELOG

* Add `dns.exception.Timeout` to exception handler

* Move changelog to 1.10.0
2020-10-09 13:13:46 -07:00
Brad Warren
631c88b209 Remove PPA instructions from docs. (#8364)
We're doing what we can to keep the PPA working in the most basic sense, but it is essentially deprecated and new users should not use it.
2020-10-09 12:44:09 -07:00
Brad Warren
6a093bd35a Move status message (#8361) 2020-10-08 16:38:05 -07:00
Brad Warren
afb07cf50d Automate publishing snaps to the stable channel (#8351)
Fixes https://github.com/certbot/certbot/issues/8171.

See the comment at the top of the script to learn how to set things up and run this. Running the script between releases will have no effect on our snaps and it should fail when creating the GitHub release. The latter is described at https://github.com/certbot/certbot/pull/8189#discussion_r466707114.

* Rename create_github_release to finish_release

* Add initial version of snap release automation.

* Handle snapcraft login.

* Catch OSError raised when snapcraft doesn't exist.

* Update documentation.

* Only publish the Certbot snap for now.

* Fix typo.

* Document other exceptions.

* Document assertion

* Add status message before getting revisions.

* Publish all snaps.
2020-10-08 15:18:09 -07:00
alexzorin
aa61e6ad4e certbot.util: suppress Popen CLI output (#8341)
* certbot.util: suppress Popen CLI output

Fixes #8326

* can't use subprocess.DEVNULL in py2
2020-10-08 13:27:36 -07:00
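A small sketch of the general idiom behind this change (not the exact code in `certbot.util`): suppress a subprocess's CLI output, falling back to an explicit `os.devnull` handle because `subprocess.DEVNULL` does not exist on Python 2.

```python
import os
import subprocess

try:
    devnull = subprocess.DEVNULL   # Python 3
    opened_devnull = None
except AttributeError:
    opened_devnull = open(os.devnull, "wb")  # Python 2 fallback
    devnull = opened_devnull

# Run a noisy command without letting its output reach the terminal.
proc = subprocess.Popen(["echo", "noisy output"], stdout=devnull, stderr=devnull)
proc.wait()

if opened_devnull is not None:
    opened_devnull.close()
```
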
ohemorange
8a3aed0476 add status messages to create_github_release script (#8353)
It took long enough to do all the downloading and uploading that I found myself wishing I could be sure things were happening.
2020-10-07 08:31:37 -07:00
Brad Warren
afc5baad4a Merge pull request #8352 from certbot/candidate-1.9.0
Release 1.9.0
2020-10-06 15:41:21 -07:00
Erica Portnoy
eff761ab1e Bump version to 1.10.0 2020-10-06 12:15:29 -07:00
Erica Portnoy
5f040a8e32 Add contents to certbot/CHANGELOG.md for next version 2020-10-06 12:15:29 -07:00
Erica Portnoy
5173ab6b90 Release 1.9.0 2020-10-06 12:15:27 -07:00
Erica Portnoy
448fd9145a Update changelog for 1.9.0 release 2020-10-06 11:39:49 -07:00
Brad Warren
ac8798e818 Give DNS plugin snaps grade stable. (#8350)
With more and more of our wildcard instructions on https://certbot.eff.org telling people to use these plugins, I think we should get ready to move our DNS plugins to the stable channel. This PR removes `grade: devel` so the snap store doesn't prevent us from doing that when we want to. See #8128 where we did this to the Certbot snap for more info.

You can see the snap tests passing with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=2797&view=results.
2020-10-05 15:55:01 -07:00
Adrien Ferrand
34694251dd Reuse key renewal params (#8343)
* Ensure key params are stored in renewal config when --reuse-key is set.

* Fix mypy definition

* Add unit test

* Clean code.
2020-10-05 20:50:45 +02:00
Brad Warren
cc76906712 Set Certbot snap version from __init__.py (#8344)
Fixes https://github.com/certbot/certbot/issues/8166 following the feedback in https://github.com/certbot/certbot/pull/8337.

I took the command to get Certbot's version from: ef8c481634/snap/snapcraft.yaml (L90)

You can see the snap tests passing with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=2785&view=results.
2020-10-05 08:37:01 -07:00
Brad Warren
ef8c481634 Add snap log files to gitignore. (#8336) 2020-10-01 14:44:12 +02:00
Mads Jensen
c12404451d Converted dict comprehensions to use literals. (#8342) 2020-10-01 14:42:37 +02:00
Brad Warren
e378931eda Upgrade httplib2 (#8289)
* Upgrade httplib2.

* Add changelog entry.
2020-09-30 17:15:06 -07:00
Brad Warren
160b209394 Automatically retry test farm tests (#8325)
Fixes #8317.

* move retry to script

* Retry test farm tests.

* Fix retry path.
2020-09-30 17:05:52 -07:00
Brad Warren
cac9d8f75e Deprecate certbot-auto outside of Debian and RHEL (#8324)
Fixes https://github.com/certbot/certbot/issues/8292.

This uses the same approach that worked well for us in https://github.com/certbot/certbot/pull/7926. I'm sure we could delete more code or refactor things here, but I think we should make the most conservative changes we can to certbot-auto until we can just delete the entire thing.

I ran the full test suite on these changes at https://dev.azure.com/certbot/certbot/_build/results?buildId=2773&view=results and manually tested things on OpenSUSE and it worked as expected. certbot-auto refused to create new installations and refused to update old ones while continuing to allow the old version of Certbot to run.

* Deprecate cb-auto outside of Debian and RHEL.

* Don't deprecate Amazon Linux yet.
2020-09-30 17:03:59 -07:00
Adrien Ferrand
7f0fa18c57 Refactor certbot snap wrapper (#8313)
Partial fix for #8280

This PR refactors the bash script wrapper for the snap (`/certbot.wrapper`) into the certbot Python codebase. Here are the key points of this refactoring:
* the wrapping is applied when the `main` function from `certbot._internal.main` is called and the environment variable `CERTBOT_SNAPPED` is `True`, which is set during the snap build
* the initial bash script wrapper is removed, simplifying `snap/snapcraft.yaml` by removing the `certbot.wrapper` part
* the dependencies on the `curl` and `jq` binaries are removed
* a failure when querying the snapd socket is handled correctly and displays an informative message explaining how to correct the situation, as required by #8280

One side note about the modifications made to `app.certbot.command` in `snapcraft.yaml`: normally calling `bin/certbot` should be sufficient, and it is under normal circumstances (`core` snap up-to-date). However, in the same situation as when the problem occurs in #8280, using `bin/certbot` makes the snap raise an exception about the `certbot.main` module not being found.

It seems that when the `core` snap is not up-to-date (in Debian, for instance, with the default `snapd` installation), the shebang `/usr/bin/env python3` in the `bin/certbot` wrapper is wrongly resolved to the host Python instead of the snap Python. It works as expected if the `core` snap is up-to-date. One way to fix that is to keep a bash script wrapper, because in that case it is the `PATH` value that matters for resolving the Python interpreter, and `PATH` is correctly set up to resolve it from the snap first.

However, to keep the simplification provided by the wrapper removal, I preferred to use `bin/python3 $SNAP/bin/certbot` as the `command` to explicitly target the correct Python interpreter. Normally this is not needed, because everything works correctly when the `core` snap is up-to-date, but since the whole purpose of this change is to handle bad situations, it is better to have a snap that is actually able to start and display the informative message.

* Refactor the bash wrapper for snap execution as Python code into certbot

* Remove wrapper, finalize the python logic

* Organize code

* Improve error handling

* Update command

* Setup basic certbot logging before running the snap prepare logic

* Improve instructions

* Use logging facility

* Handle properly an exception in snap_config

* Use the python script call approach

* Update instructions to keep sync with https://github.com/certbot/website/pull/650
2020-09-30 13:24:56 -07:00
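A simplified, hypothetical sketch of the wrapping described in #8313 above: snap preparation runs inside the Python entry point when `CERTBOT_SNAPPED` is set, and a failure to reach the snapd socket produces an informative message instead of a traceback. The helper names and message text here are illustrative.

```python
import logging
import os

logger = logging.getLogger(__name__)


def prepare_snap_environment():
    """Placeholder for the snap-specific setup (e.g. querying snapd for plugins)."""


def run_certbot(cli_args):
    """Placeholder for the regular certbot entry point."""
    return 0


def main(cli_args=None):
    # CERTBOT_SNAPPED is set during the snap build, so this branch only runs
    # when certbot is executed from the snap.
    if os.environ.get("CERTBOT_SNAPPED") == "True":
        try:
            prepare_snap_environment()
        except OSError as err:
            logger.error("Could not talk to snapd while preparing the snap "
                         "environment: %s", err)
            return 1
    return run_certbot(cli_args)


if __name__ == "__main__":
    raise SystemExit(main())
```
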
ohemorange
fca7ec896a Improve error message for prepare-plug-plugin hook when certbot isn't installed (#8338)
Provides a partial fix for #8182 by improving the error message.
2020-09-30 12:43:24 -07:00
Brad Warren
e066766cc9 Revert "Disable build isolation during snap dns plugins build (#8319)" (#8323)
This reverts commit feca125437.

Since this change landed, ARM builds for many of the DNS plugins have failed every night. See https://dev.azure.com/certbot/certbot/_build?definitionId=5 or our public Mattermost channel.

I quickly tried to fix this myself and wasn't trivially able to do so. I tried setting `SNAPCRAFT_PYTHON_VENV_ARGS: --system-site-packages` and adding `python3-wheel` as a build dependency, but it didn't work for some reason. The `python3-wheel` package didn't seem to be installed.

I still suspect something like this is the approach we should take; however, I want to fix the failing tests now so things are no longer broken in `master` and those of us on the Certbot team at EFF stop getting spammed with 54 (!!) emails about failed builds from Launchpad every night.

Unfortunately, while I was working on this the queue for ARM machines on Launchpad jumped up to an estimated ~20 hour wait, but I confirmed that this fixes the problem by building on an ARM AMI using the instructions at https://github.com/certbot/certbot/blob/master/tools/snap/README.md#use-testing-and-development. If whoever reviews this would like an ARM machine to test on themselves, please let me know.
2020-09-28 14:27:29 -07:00
ohemorange
be6c890874 Retry Snap upload in pipeline (#8300)
* add set -e to all bash instances in deploy-stage.yml

* retry uploading snap if we fail

* Add the rest of the set -e calls for bash in azure while we're here

* use retry based on travis_retry

* add set -e to the script: sections that run on macOS/Linux

* actually don't fail on result

* reset result before running command because bash short-circuits `or` conditionals

* remove inapplicable comment
2020-09-25 15:31:13 -07:00
Adrien Ferrand
feca125437 Disable build isolation during snap dns plugins build (#8319)
Partial fix for #8256

This PR disables build isolation for the snap DNS plugins, similar to what is done for the certbot snap.
2020-09-25 11:24:29 -07:00
Brad Warren
1be005289a Print more output from snapcraft remote-build (#8321)
* Print more output from snapcraft remote-build.

* Include the build target in the output.
2020-09-25 18:58:04 +02:00
Adrien Ferrand
79297ef5cb Invoke pipstrap in tox and during the CI (#8316)
Partial fix for #8256

This PR makes tox call pipstrap before any commands are executed, and makes Azure Pipelines call pipstrap when appropriate (when an actual call to pip is made).

* Invoke pipstrap in tox and during the CI

* Set default value for PYTHON_VERSION and always set python interpreter

* Set Python for snaps_build also

* Fix the build for Windows installer

* Add a warning comment for pinned versions in pipstrap

* Rebuild letsencrypt-auto

* Same version as the installer build

* Let's update to latest pip for installer tests
2020-09-24 17:12:12 -07:00
alexzorin
5ec29ca60b suppress tracebacks in ErrorHandler recovery (#8310)
The ErrorHandler context manager could produce very verbose CLI output
when handling long exception chains (PEP 3134 enhanced reporting).

Rather than logging every exception with its traceback to the CLI, this
commit changes ErrorHandler so that only the final exception in the
chain, without traceback, is logged to the CLI.

This is consistent with a previous change made in the global except
hook (#8000).
2020-09-24 14:22:38 -07:00
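A toy illustration of the logging behavior described above, assuming a standard `logging` setup (this is not the `ErrorHandler` code itself): instead of dumping the whole chained traceback, only the message of the exception raised last reaches the CLI.

```python
import logging

logging.basicConfig(level=logging.ERROR)
logger = logging.getLogger(__name__)

try:
    try:
        raise ValueError("low-level failure")
    except ValueError as inner:
        raise RuntimeError("higher-level wrapper") from inner
except RuntimeError as err:
    # logger.exception(err) would emit the full ValueError -> RuntimeError
    # chain with tracebacks; logging just the message keeps CLI output short.
    logger.error("Encountered an error during recovery: %s", err)
```
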
Cameron Steel
9a72db5b9b Convert http links to https (#8287)
* Convert http links to https

* Fix remaining links
2020-09-23 19:36:55 +02:00
alexzorin
14cbf67d65 tests: remove Ubuntu 19.10 (#8312)
EOL since July 2020.
2020-09-23 09:42:37 -07:00
alexzorin
b20aaff661 remove unused ssllabs-related code (#8307) 2020-09-21 12:42:00 -07:00
Mads Jensen
a66f4e1150 Added an .editorconfig file. (#8297)
https://editorconfig.org/ is meant as a guideline for editors on how to format files.
2020-09-19 11:39:13 +02:00
Mads Jensen
501df0dc4e Use in dict rather than "in dict.keys()". Fix linting warnings about "not in". (#8298)
* Fixed a few linting warnings for if not x in y.

These should have been caught by pylint, but weren't.

* Replaced "x in y.keys()" with "x in y".

It's much faster, and more Pythonic.
2020-09-19 11:35:49 +02:00
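For reference, the two idioms being cleaned up look like this:

```python
vhosts = {"www.example.com": "/etc/nginx/sites-enabled/example"}

# Before: builds a keys view needlessly and reads poorly.
if "www.example.com" in vhosts.keys():
    pass
if not "api.example.com" in vhosts:
    pass

# After: plain membership tests, faster and more Pythonic.
if "www.example.com" in vhosts:
    pass
if "api.example.com" not in vhosts:
    pass
```
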
Mads Jensen
b551b6ee73 Removed unnecessary unittest.TestCase.setUp/tearDown calls. (#8264) 2020-09-19 10:38:40 +02:00
alexzorin
71d9dfa86e nginx: reduced CLI logging when reloading nginx (#8237)
* nginx: reduced CLI logging when reloading nginx

Hides the output of `nginx -s reload` from the CLI, moving it to
debug-level logging.

Additionally, fixes an issue where Certbot did not properly capture the
output of the nginx reload and restart commands.

Fixes #8231

* remove leftover debugging

* reorder CHANGELOG

* don't use bare asserts
2020-09-16 12:22:15 -07:00
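A rough sketch of the approach described in #8237 above (not the plugin's actual code): capture the reload command's output and forward it to debug-level logging so it still lands in the log file without cluttering the CLI.

```python
import logging
import subprocess

logger = logging.getLogger(__name__)


def reload_server(cmd):
    """Run a reload command, capturing its output instead of printing it."""
    proc = subprocess.run(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                          universal_newlines=True)
    if proc.stdout:
        logger.debug("%s stdout: %s", cmd[0], proc.stdout)
    if proc.stderr:
        logger.debug("%s stderr: %s", cmd[0], proc.stderr)
    return proc.returncode


# e.g. reload_server(["nginx", "-s", "reload"])
```
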
alexzorin
6628bc0e9b certbot-compat: remove dupe random25863 nginx name (#8286)
random25863.example.org appears in multiple port 80 virtualhosts in the
nginx testdata tarball and also is in the nginx-roundtrip-testdata.
Certbot doesn't handle these properly, which results in random test
failures.

This commit ensures that random25863.example.org only appears in a
single virtualhost and should ensure that the tests pass consistently.
2020-09-16 10:00:38 -07:00
alexzorin
f43fa12fc0 cli: add --preconfigured-renewal packaging flag (#8274)
* cli: add --preconfigured-renewal packaging flag

* fix rst formatting

* snap: make the flag postfixed
2020-09-15 15:45:36 -07:00
Brad Warren
2b425110dc Delete conflicting server_names for random28524. (#8278) 2020-09-11 12:16:55 -07:00
Adrien Ferrand
55d411f1eb Remove deprecated python setup.py test call and update packager guide (#8262)
Fixes #7585

This PR removes the configuration that made the test runner included in `setuptools` use pytest, removes the deprecated parameters related to setuptools testing in `setup.py`, and updates the packaging guide to use `python -m pytest` instead of `python setup.py test`.

The farm test `test_sdist.sh` is also updated to use pytest directly. This test is designed to reproduce the steps used by OS integrators when they package `certbot` and to ensure that we are not breaking something that would impact their work. We discussed this with integrators from RHEL/CentOS and Debian, and they are fine with us testing the sdist directly with pytest.

One execution of the `test_sdist.sh` farm test with the modifications made by this PR can be seen here: https://dev.azure.com/certbot/certbot/_build/results?buildId=2606&view=results

* Remove setuptools deprecated features about testing

* Updating packaging guide

* Add changelog entry
2020-09-10 15:57:59 -07:00
Mads Jensen
7ddd327f63 Removed unneeded chmod-call in a test. (#8244)
* Removed unneeded chmod-call in a test.

* Trigger CI.

Co-authored-by: Adrien Ferrand <ferrand.ad@gmail.com>
2020-09-11 00:11:51 +02:00
Brad Warren
3a615176c5 fix Certbot acme dep (#8279) 2020-09-10 09:37:10 +02:00
alexzorin
e79af1b1de changelog: move #8263 to the right section (#8271) 2020-09-09 16:16:53 -07:00
Brad Warren
c8828dab30 Move compatibility tests off of certbot-auto and Python 2 (#8248)
Fixes https://github.com/certbot/certbot/issues/8162.

I had to update the base of the Dockerfile to get a new enough version of Python 3. I also simplified things a lot and removed a lot of the comments that were essentially just describing how Dockerfiles work.

The most complicated changes here are in `testdata`. You can find a diff of the changes to `nginx.tar.gz` at https://gist.github.com/c7727db0cecf3f15f02439f085c73848.

The first problem was that there were some complaints from the new Apache/nginx/OpenSSL versions about the 1024 bit RSA key, so I updated `empty_cert.pem` both inside and outside of the tarball, as well as the corresponding private key in the tarball, to use a 2048 bit key.

The 2nd problem is trickier to understand. If you look at the output from nginx after loading the config from `lots/` you'll see it complaining about conflicting `server_name` directives for the directives I deleted. See https://dev.azure.com/certbot/certbot/_build/results?buildId=2578&view=logs&j=250aa146-b243-5f8f-bf86-17a529c9fb7e&t=9baa2014-9673-5e78-8f4f-7a463caf2bfa&l=1516.

After switching the tests to Python 3, tests on that domain started failing. What I believe to be happening is we were just lucky these tests were passing to begin with. In both the Apache and Nginx plugin, if there are conflicting virtual hosts like this, we just arbitrarily pick one. The relevant code here for nginx is 575092d603/certbot-nginx/certbot_nginx/_internal/configurator.py (L455)

I played around with a debugger and confirmed that before I removed the conflicting server names, there were two exact matches for the domain we were searching for here.

I think all that's going on is that with the switch to Python 3, the vhost we happen to choose changes and "breaks" the test. I suspect this is due to something like getting values out of a dict somewhere where the order of items while iterating differs between Python 2 and 3. I didn't track down where this difference happens, but I personally don't think it's a good use of time since I think the real problem here is that the nginx config being tested was invalid, with conflicting `server` blocks.

I removed all references to the `server_name` causing conflicts in that nginx configuration because both server blocks had other domains that are being tested, but I could add either back if you prefer. You can see the `nginx_compat` test passing with these changes at https://dev.azure.com/certbot/certbot/_build/results?buildId=2587&view=logs&j=250aa146-b243-5f8f-bf86-17a529c9fb7e.

* update Dockerfile

* Fix apache_compat on py3.

* Update empty_cert.pem.

The command used here was `openssl req -key
certbot/certbot/tests/testdata/rsa2048_key.pem -new -subj '/CN=example.com'
-x509 >
certbot-compatibility-test/certbot_compatibility_test/testdata/empty_cert.pem`.

* update nginx.tar.gz

* Remove conflicting server_names
2020-09-09 15:16:52 -07:00
Xebax
f85b738e2f Fix filename in example (#8275) 2020-09-09 18:01:04 +02:00
alexzorin
95a6b61cdc nginx: fix server_name case-sensitivity in parser (#8263)
This commit fixes an issue with the nginx parser where it would perform
case-sensitive matching against server_name.

This would cause the authenticator and installer to ignore existing
virtualhosts containing uppercase characters, resulting in duplicate
virtualhosts and broken configurations.

"Exact" and "wildcard" matching is now case-insensitive. Regex-based
matching will continue to respect the case mode of the pattern.

Fixes #6776.
2020-09-08 14:14:54 -07:00
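A small illustration of the matching rule described in #8263 above (a sketch, not the parser's actual implementation): exact and wildcard comparisons are done on lowercased names, while regex `server_name`s would keep the case behavior of the pattern itself.

```python
import fnmatch


def matches_server_name(requested, server_name):
    """Case-insensitive exact/wildcard matching of a non-regex server_name."""
    requested = requested.lower()
    server_name = server_name.lower()
    if "*" in server_name:
        return fnmatch.fnmatchcase(requested, server_name)
    return requested == server_name


assert matches_server_name("www.example.com", "WWW.Example.COM")
assert matches_server_name("foo.example.com", "*.EXAMPLE.com")
assert not matches_server_name("example.net", "example.com")
```
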
Brad Warren
21b320ef42 Add TODO to certbot.wrapper. (#8270)
I'm adding this comment as part of the resolution of #8251. I think rewriting the script in Python is something we really should only worry about if we're working on the script in the future. Because of this, I personally prefer a code comment rather than an issue here.
2020-09-08 12:54:00 -07:00
Brad Warren
8c81a1aaf8 Merge pull request #8269 from certbot/candidate-1.8.0
Release 1.8.0
2020-09-08 11:45:54 -07:00
Brad Warren
ec147740ee Bump version to 1.9.0 2020-09-08 09:59:33 -07:00
Brad Warren
b7b0ec321e Add contents to certbot/CHANGELOG.md for next version 2020-09-08 09:59:33 -07:00
Brad Warren
7fe7a965f5 Release 1.8.0 2020-09-08 09:59:31 -07:00
Brad Warren
9f243c768f Update changelog for 1.8.0 release 2020-09-08 09:41:49 -07:00
osirisinferi
b841f0f307 Change ACME spec link to RFC 8555 (#8266) 2020-09-06 14:14:33 +02:00
osirisinferi
8e736479f7 Lower heading level of "Changing a certs domain" (#8267) 2020-09-06 14:03:15 +02:00
alexzorin
2ceabadb81 snap: use snap REST API in certbot.wrapper (#8260)
In order to avoid potentially breaking changes in the snap CLI on the
host, this commit changes certbot.wrapper to use the snap REST API (via
curl and jq) to list connected Certbot plugins.
2020-09-04 23:55:21 +02:00
alexzorin
a2951b4db1 snap: Fix "stack smashing" error in wrapper (#8249)
* snap: Fix "stack smashing" error in wrapper

certbot.wrapper had implicit dependencies on sed, awk and coreutils,
which were being accidentally provided through the host system. Because
certbot.wrapper modifies LD_LIBRARY_PATH, this was causing some systems
to load an incompatible combination of shared libraries, resulting in sed
crashing.

This commit reduces the dependencies of this script to just gawk, and
explicitly stages it as part of the Certbot snap.

It additionally moves invocations of all host system programs to a
moment prior to the modification of LD_LIBRARY_PATH, and the invocation
of snapped programs to after the modification.

Fixes #8245

* snap: Don't modify LD_LIBRARY_PATH

* leftover tracing

* snap: revert curl/jq in wrapper, use gawk for now
2020-09-04 20:51:01 +02:00
alexzorin
98615564ed log: Don't print backtrace on ^c/KeyboardInterrupt (#8259) 2020-09-04 12:57:46 +02:00
Adrien Ferrand
3ce87d1fcb Test PIP_NO_BUILD_ISOLATION (#8255)
Fixes #8252

With @bmw we dug quite a lot into why the failure happens on the ARM snap, and here is what we understood:
* the failure occurs since the version 50 of setuptools is available
* normally, we should not be impacted because the setuptools version used in the snap build should be the one installed by the `core20` base snap, because the build occurs in a `venv` created with `--system-site-packages`
* BUT associated with the build isolation provided by recent versions of pip (to implement PEP 517), a bad interaction happens: following the definition of the build system provided by `cryptography`, pip installs the most recent version of setuptools on a separate path for the build (because `cryptography` just asks for a minimal version of `setuptools`), then features of this version conflict with the old version of `setuptools` initially present
* the exact interaction is described here: https://github.com/pypa/pip/issues/6264#issuecomment-685230919. Basically the new version of `setuptools` triggers some hacks, that are then applied at runtime on the old version of `setuptools` that is also still available in `sys.path` at this point, and breaks the build.

To fix that, one can disable build isolation for cryptography by passing `PIP_NO_BUILD_ISOLATION=no` to pip. That is the purpose of this PR.

This has the consequence of not being PEP 517 compliant: if needed, the `cryptography` library will be built using the `setuptools` available in the system. In general I think this makes sense for the snap build, since we control the build environment precisely, and it makes builds consistent so they will not be broken by a new version of a build system when library maintainers did not pin a strict version of it in their build requirements. However, we now need to take care to have a compatible build system for all libraries that may declare specific build-system requirements via PEP 517 in `pyproject.toml`.

I think as of now this is a safe move if we keep using the most recent version of `setuptools` available in Ubuntu 20.04, which is the case here for snap builds. It may, however, be problematic if some libraries require a build system other than `setuptools` and do not provide a fallback to a `setuptools` build. For the record, `dns-lexicon`, which I maintain, uses `poetry` and thus a PEP 517 compliant build-system definition, but it also provides this fallback (https://github.com/AnalogJ/lexicon/blob/master/setup.py).

Full test suite compiling the snaps for the 3 architectures using this PR is available here: https://dev.azure.com/certbot/certbot/_build/results?buildId=2596&view=results
2020-09-02 11:45:38 -07:00
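For illustration, disabling PEP 517 build isolation when driving pip directly looks like this (pip's `--no-build-isolation` flag is the CLI counterpart of the environment variable used in the snap build; the package name is just an example):

```python
import subprocess
import sys

# Build/install a dependency without PEP 517 build isolation, so the
# setuptools already present in the environment is used rather than a
# freshly downloaded copy.
subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "--no-build-isolation",
    "cryptography",
])
```
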
Brad Warren
d62d853ea4 Clean up --register-unsafely-without-email docs (#8223)
* Clean up --register-unsafely text.

* update unsafe_suggestion

* remove unused import

* Expand scary message.
2020-08-27 13:25:57 -07:00
Daniel Drexler
70731dd75b Move changes to the right section of the changelog (#8236)
Fixing a mistake in pull request #8212 where I recorded my changes in an already released version 😳.

- Moving new changes out of a previous changelog and into the next
  releases' changelog
2020-08-27 09:45:10 -07:00
Daniel Drexler
ae7b4a1755 Support Register Unsafely in Update (#8212)
* Allow user to remove email using update command

Fixes #3162. Slight change to control flow to replace current email
addresses with an empty list. Also add appropriate result message when
an email is removed.

* Update ACME to allow update to remove fields

- New field type "UnFalseyField" that treats all non-None fields as
  non-empty
- Contact changed to new field type to allow sending of empty contact
  field
- Certbot update adjusted to use tuple instead of None when empty
- Test updated to check more logic
- Unrelated type hint added to keep pycharm gods happy

* Moved some mocks into decorators

* Restore default to `contact` but do not serialize

- Add `to_partial_json` and `fields_to_partial_json` to Registration
- Store private variable noting if the value of the `contact` field was
  provided by the user.
- Change message when updating without email to reflect removal of
  all contact info.
- Add note in changelog that `update_account` with the
  `--register-unsafely-without-email` flag will remove contact
  from an account.

* Reverse logic for field handling on serialization

Now forcibly add contact when serializing, but go back to the base `jose` field type.

* Responding to Review

- change out of date name
- update several comments
- update `from_data` function of `Registration`
- Update test to remove superfluous mock

* Responding to review

- Change comments to make from_data more clear
- Remove code worried about None (omitempty has got my back)
- Update test to be more reliable
- Add typing import with comment to avoid pylint bug
2020-08-26 15:22:51 -07:00
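A toy sketch of the serialization rule described above (the real change lives in acme's `Registration` class; the names here are simplified): `contact` is only emitted when the user actually provided it, which allows sending an empty list to remove all contact information.

```python
class Registration:
    def __init__(self, contact=None, contact_provided=False):
        self.contact = tuple(contact) if contact is not None else ()
        self._contact_provided = contact_provided

    def to_partial_json(self):
        data = {}
        if self._contact_provided:
            # An explicitly provided empty contact list is still serialized,
            # which is what tells the ACME server to drop the contact info.
            data["contact"] = list(self.contact)
        return data


print(Registration().to_partial_json())                                   # {}
print(Registration(contact=(), contact_provided=True).to_partial_json())  # {'contact': []}
```
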
Brad Warren
f66a592e37 Try switching to the buster ARM image. (#8234) 2020-08-26 14:04:37 -07:00
Brad Warren
e8518bf206 Fix finding Augeas in the ARM snaps (#8230)
* Find Augeas on all architectures.

* Add changelog entry.

* add comment
2020-08-26 14:03:15 -07:00
Emily Bowman
2a047eb526 Update docs link in certbot unsupported error (#8168)
* Update docs link in certbot unsupported error

Co-authored-by: Adrien Ferrand <ferrand.ad@gmail.com>
2020-08-20 11:33:56 -07:00
Brad Warren
bc137103a3 Don't recommend using certbot-auto. (#8222)
Fixes https://github.com/certbot/certbot/issues/8165.

I moved `prerequisites` up to the "Running a local copy of the client" section so that `contributing.html#prerequisites` still links to information about installing Certbot's dependencies.

I left all certbot-auto documentation that wasn't explicitly encouraging its use. I think we can rip that out once the script is deprecated.
2020-08-20 11:13:35 -07:00
Brad Warren
085967ad29 Fix test farm tests on macOS and update macOS images (#8219)
* Run one of the test farm tests on macOS.

* it break with 38?

* Remove LOGDIR global

* add comment

* include macOS in name

* Update macOS image.
2020-08-19 18:26:28 -07:00
Brad Warren
4e9d3afcc4 Docker build improvements (#8218)
Fixes https://github.com/certbot/certbot/issues/8208.
Fixes https://github.com/certbot/certbot/issues/8198.

In addition to those two linked issues, this PR:

* Splits both the build and deploy steps based on architecture for performance. The Docker builds should no longer be the bottleneck in any stage of the pipeline.
* Skips building Docker images for ARM on `test-` branches like [we do for snaps](e8a232297d/.azure-pipelines/templates/jobs/packaging-jobs.yml (L67-L71)). I initially didn't want to do this, but the ARM builds take ~18 minutes which is significantly longer than any other job currently running on our `test-` branches.

You can see tests running on my fork at:

* [Release pipeline](https://dev.azure.com/bmw0523/bmw/_build/results?buildId=387&view=results)
* [Test pipeline](https://dev.azure.com/bmw0523/bmw/_build/results?buildId=388&view=results)
* [Nightly pipeline](https://dev.azure.com/bmw0523/bmw/_build/results?buildId=390&view=results)

* update script intro

* update readme

* ParseRequestedArch

* build all arch in Azure

* Build docker images during testing/packaging.

* require global variable?

* Error if TAG_BASE is empty.

* prepare build job

* change variable syntax

* Update deploy stage.

* remove old dockerTag param

* add displayName

* fix docker images command

* split docker_build by arch

* Allow deploying a subset of architectures.

* deploy in parallel

* Skip ARM builds on test- branches.

* fix spacing
2020-08-18 10:48:01 -07:00
ohemorange
acb6d34c5f Update test farm tests to stop using certbot-auto (#8207)
* Create bootstrap script

* Delete a whole bunch of the bootstrap script

* modify test_tests to use new script

* put python version checking in back in

* add x

* call the venv creation from inside the bootstrap

* add targets back

* modify test_apache2 to use new format

* shouldn't need virtualenv on rhel

* readd targets

* Update test_sdists to use new script

* move setting up venv back out of script so it's not run with sudo

* take venv3.py call out of bootstrap in all scripts

* add additional python3-devel pkg name

* fix test_sdists

* enable additional rhel7 repos

* clean up code and comments

* Update tests and instructions to use auto_targets.yaml with test_leauto_upgrades.sh and test_letsencrypt_auto_certonly_standalone.sh

* only install python3-devel.x86_64 for rhel7

* Upgrade python version for debian in test_apache2.sh

* don't run test_tests or test_sdists on debian 9 or ubuntu 16.04

* Add 20.04 and 20.04 arm images to targets.yaml

* use pyenv to upgrade to python3.5

* remove arm64 instance because it's having auth trouble

* correct pyenv usage on ubuntu

* add arm64 target to targets.yaml

* replace debian 9 arm64 with ubuntu 20

* don't try to upgrade a perfectly good python version

* let's just add ubuntu20 to apache2_targets while we're here

* uncomment test_apache2

* move adding python3-devel.x86_64 to bootstrap_os_packages to avoid potential race condition

* no need to specify the arch once extra rhel7 repos enabled

* explicitly specify python3

* don't fail if we can't enable rhel7 extras

* capture python36-devel as well
2020-08-18 10:07:27 -07:00
Brad Warren
63ec74276c Clean up our Docker scripts (#8214)
* rewrite build step

* rewrite deploy script

* fix exit status

* clean up comments

* fix typo

* correct comment
2020-08-18 10:51:30 +02:00
Brad Warren
e8a232297d Pin non-cb-auto dependencies in our plugin snaps (#8217)
This PR fixes our [Azure failures](https://dev.azure.com/certbot/certbot/_build/results?buildId=2492&view=results) by pinning our Python dependencies that are not included in certbot-auto.

This is done using the same approach as our [snap README](575092d603/tools/snap (build-the-snaps)) and [Docker images](575092d603/tools/docker/core/Dockerfile (L24-L25)) with some minor details changed to hopefully make the Python code more readable.

You can see tests passing with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=2495&view=results.
2020-08-17 11:54:29 -07:00
Brad Warren
575092d603 Drop Python 3.5 support (#8206)
* delete classifiers

* update python_requires

* Update py35 Azure jobs

* Revert "Add warnings about Python 3.5 deprecation in Certbot (#8154)"

This reverts commit 270b5535e2.

* Update other Python 3.5 references.

* update changelog

* bump MIN_PYTHON_3_VERSION
2020-08-16 13:19:08 -07:00
Brad Warren
2d62dec7ec Fix certbot.compat.os docs (#8209)
* Don't document stdlib os.

* Move isort out of docstring.

* fix os import

* fix comment length
2020-08-13 17:24:31 -07:00
Brad Warren
f93b90f87a You don't need to set --server (#8200) 2020-08-12 13:27:53 -07:00
Brad Warren
f40e5bdefe Automate Docker builds in Azure (#8193)
Fixes https://github.com/certbot/certbot/issues/8022, https://github.com/certbot-docker/certbot-docker/issues/25, and https://github.com/certbot-docker/certbot-docker/issues/20.

This PR builds on https://github.com/certbot/certbot/pull/8192 to set up similar builds in Azure to what we currently have at release time as well as nightly builds allowing us to catch problems in these images before a release. It also fully automates our Docker deployments removing a manual step from our release process. We'll need to update our release instructions once this PR lands.

If you're not familiar with our `certbot-docker` setup, you can read about how these scripts customized the build process on Docker Hub at https://docs.docker.com/docker-hub/builds/advanced/.

You can see the process working properly at:

* Nightly build on my fork: https://dev.azure.com/bmw0523/bmw/_build/results?buildId=345&view=logs&j=70ac378a-cb1f-50d1-b328-169807afbcfa
* Release build on my fork: https://dev.azure.com/bmw0523/bmw/_build/results?buildId=346&view=logs&j=70ac378a-cb1f-50d1-b328-169807afbcfa
* Nightly build on Certbot's Azure setup: https://dev.azure.com/certbot/certbot/_build/results?buildId=2426&view=logs&j=70ac378a-cb1f-50d1-b328-169807afbcfa

The builds on my fork pushed to https://hub.docker.com/u/certbottest. The credentials for this account are in our shared vault in 1password if you want to play around with this.

While the scripts will (almost?) always be run in CI, I tested the scripts successfully on macOS, Ubuntu 18.04, and Ubuntu 20.04; however, **the scripts do not seem to work when using the Docker snap, at least on Ubuntu 20.04.** They do work with the `docker.io` packages from `apt`. I was able to make things work by no longer setting `DOCKER_BUILDKIT`, but as I described in the code comments, this breaks things on Azure.

When writing this PR, I tried to make the minimal modifications to our current set up to get the behavior we want. I'm planning on working on splitting the Docker builds into different Azure jobs so it doesn't increase the overall build time, but this isn't trivial so I figured it would be best done in a separate PR.

* Remove license.

* update build scripts

* write deploy code

* Remove unused READMEs.

* rewrite readme

* Make testing on a fork easier.

* Set up Azure automation.

* fix typo

* Make output more verbose.

* clean up cleanup...everybody everywhere

* separate build and deploy

* Document docker-hub credentials

* Use Docker BuildKit when building.

* Remove unneeded .gitignore files.

* Fix tools/docker/README.md grammar

Co-authored-by: ohemorange <ebportnoy@gmail.com>

* Clarify <TAG> in README.

* no docker snap

* rename docker job

Co-authored-by: Erica Portnoy <ebportnoy@gmail.com>
2020-08-11 13:09:38 -07:00
Brad Warren
9bbcc0046c fix --archs default (#8195) 2020-08-11 12:52:24 -07:00
Brad Warren
b3dd2c09ba Remove release branch notification cruft. (#8196) 2020-08-11 12:50:54 -07:00
Brad Warren
8574313841 Remove final jessie references outside of cb-auto. (#8194) 2020-08-07 13:35:21 -07:00
kden
a677534462 Delete or update references to Debian 8 Jessie (#8065)
* Delete or update references to Debian 8 Jessie

* Don't delete oldest constraints from Jessie, but document in comments.

* Update tools/oldest_constraints.txt

Co-authored-by: ohemorange <ebportnoy@gmail.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-08-07 12:21:52 -07:00
ohemorange
22730dc0ac Merge pull request #8192 from certbot/docker-base
Add certbot-docker files to this repository preserving history
2020-08-06 16:46:17 -07:00
ohemorange
086e6c46b6 Improve github release creation process (#8189)
* Improve github release creation process

* Comment file

* Update tools/create_github_release.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* run chmod +x on tools/create_github_release.py

* Add description of create github release method

* remove references to unnecessary azure credential

* remove unnecessary import

* Add reminders to update other file to definitions in .azure-pipelines

* Raise an error if we fail to fetch the artifact from azure

* Create github release as a draft, upload artifacts, then un-draft, for hooks to be run at the right point

* get the version number from the release

* add new packages to dev3_extras so they're installed by tools/venv3.py

* remove unnecessary import

* fun fact: tempdirs behave differently when used as a context manager

* Move comment to construct.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-08-06 16:32:57 -07:00
osirisinferi
bc0ed3cb01 [Docs] Remove obsolete Gentoo installation instructions and add packages. (#8184)
It seems my old instruction isn't required any longer for Gentoo. To be honest, I don't have a clue since when, but my own Gentoo server isn't even using the workaround currently mentioned in the documentation. So it seems the Apache plugin works just fine without this workaround 🤦

Also, the Gentoo repository obviously also includes the nginx plugin and has for a long time. I guess my original text is ancient. It also includes *one* of the many DNS plugins, with a different maintainer than the other "main" packages. It currently only has version 0.39.0, so I don't know whether it's being maintained officially.

* Remove obsolete Gentoo instructions and add packages.

* Capitalize note

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-08-06 14:13:53 -07:00
Felix Yan
220cc07239 Correct a typo in acme/tests/client_test.py (#8186) 2020-08-05 11:44:23 -07:00
Brad Warren
271be07267 Merge /Users/bmw/Development/github.com/certbot-docker/certbot-docker into master 2020-08-04 15:14:06 -07:00
ohemorange
48a0cc0c42 Merge pull request #8188 from certbot/candidate-1.7.0
Release 1.7.0
2020-08-04 13:09:34 -07:00
Erica Portnoy
5415fc201c Release version v1.7.0 2020-08-04 12:33:20 -07:00
Erica Portnoy
b08fdc7dfb Bump version to 1.8.0 2020-08-04 11:33:04 -07:00
Erica Portnoy
6eb5954f0e Add contents to certbot/CHANGELOG.md for next version 2020-08-04 11:33:04 -07:00
Erica Portnoy
6ec83d52b5 Release 1.7.0 2020-08-04 11:33:03 -07:00
Erica Portnoy
403ded5c58 Update changelog for 1.7.0 release 2020-08-04 11:20:15 -07:00
Brad Warren
4d3f6c23be Document how to revoke snapcraft credentials. (#8187) 2020-08-03 11:42:42 -07:00
Brad Warren
6d73b21dcf Automatically publish snaps to the beta channel (#8183)
* Expand documentation of snapcraft.cfg.

* Push to the beta channel on releases.

* fix template expansion

* update comment
2020-07-31 13:17:07 -07:00
Brad Warren
072c070c0c Do not run tests on pushes to *.x for performance. (#8185) 2020-07-31 12:47:47 -07:00
Brad Warren
df1ca726f9 Remove text about snap beta status. (#8178)
Part of #8140.
2020-07-29 13:18:45 -07:00
ohemorange
086c8b1b3e Mention the availability of DNS plugin snaps in our docs under certbot/docs. (#8176)
Part of #8142.

* Mention that DNS plugins are available as snaps

* Mention snaps in guide to writing third-party plugins
2020-07-27 13:21:31 -07:00
alexzorin
09ab4aea01 nginx: add --nginx-sleep-seconds (#8163)
* nginx: add --nginx-sleep-seconds

As described in #7422, reloading nginx is an asynchronous process and
Certbot does not know when it is complete. In an environment where this
reload takes a long time, the nginx plugin suffers from an issue where
it responds to and fails the ACME challenge before the nginx server is
ready to serve it.

Following the discussion in a previous PR #7740, this commit introduces
a new flag, --nginx-sleep-seconds, which may be used to increase the
duration that Certbot will wait for nginx to reload, from its previously
hard-coded value of 1s.

Fixes #7422

* update CHANGELOG

* nginx: update docstring for nginx_restart
2020-07-27 12:52:12 -07:00
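A minimal sketch of the idea behind the flag, assuming a generic reload callable (not the plugin's actual `nginx_restart` code): the previously hard-coded one-second pause becomes configurable.

```python
import time


def reload_and_wait(reload_fn, sleep_seconds=1):
    """Reload the server, then wait for its workers to be ready to serve."""
    reload_fn()
    # --nginx-sleep-seconds raises this value in environments where the
    # asynchronous reload takes longer than the default one second.
    time.sleep(sleep_seconds)


# e.g. reload_and_wait(lambda: print("reloading nginx..."), sleep_seconds=5)
```
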
Adrien Ferrand
a6f2061ff7 Improve log dump in snaps remote builds when an unexpected behavior is detected. (#8173)
Fixes #8169

This PR improves the snap remote-build script by dumping the output of `snapcraft remote-build` when unexpected behavior is detected:
* when all builds for a project finish with a zero status code, and none of them are marked as failed, we expect to have all the associated snap files available locally.
* when some builds are marked as failed, we expect to have a build output for each of them available locally.

In these two situations, if the expectations are not met, the script will display the output of `snapcraft remote-build` itself. I also added error handling to deal gracefully with the absence of an expected build output on the local machine.

* Improve log dump in snaps remote builds when an unexpected behavior is detected

* Use the manager

* Update tools/snap/build_remote.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-27 12:01:51 -07:00
Adrien Ferrand
02c1339753 Document that snaps are available for 3 architectures (#8174)
Part of #8051
2020-07-27 09:56:55 -07:00
Brad Warren
a1cd909247 remove old snapcraft files (#8167) 2020-07-23 02:29:32 +02:00
ohemorange
9ee4831f78 Make externally snapped plugin updates more stable (#8145)
Fixes #7863.

Connect command is `sudo snap connect certbot-dns-dnsimple:certbot-metadata certbot:certbot-metadata`
Logs are `cat /var/snap/certbot-dns-dnsimple/current/debuglog`
Echoes in the hook are only printed to the terminal when it exits 0; otherwise, check the logs in the `debuglog` mentioned above.

Manual tests include all iterations of connected, unconnected, installed for the first, second time, etc, with passing and failing version checks.

* Make dnsimple not update if certbot is too old

* create an interface to read cb version

* add missing newline

* fix syntax

* trying to figure out the consumer syntax

* trying to figure out the consumer syntax, again

* only check post first install

* valid setting name

* test for first install differently

* snapctl doesn't error if it fails I guess

* time to do some print debugging

* continue playing with syntax

* once again, fooled by bash int vs string comparisons!

* debugging

* if we use post and pre together we can do this

* is this how content interface syntax works

* it's a directory?

* more debug

* what's that error message again?

* try other syntax

* if it's not documented just guess at syntax

* actually, I think this is the syntax

* oops didn't set for new hook

* test passing information along connection

* interface attributes can only be set during the execution of prepare hooks

* just do it with main connection

* undo last few test changes

* Add some printing to make sure we understand what's going on

* create empty directory to bind to

* put mkdir in the correct part

* let's inspect the environment

* it can't run bash directly.

* perhaps only directories can be shared via the content interface

* update name of folder

* echo to debug log to understand what's going on exactly. we have file access though!

* update grep for new file

* more printing

* echo to the debug log

* ok NOW all print statements are going to the log

* why does echo need two >s

* remove unnecessary extra check, just check if the init file is available

* check if certbot version will be available post-refresh after all

* pre-refresh hook is not necessary to get certbot version

* update mkdir so we don't have to clean each time

* try comparing version numbers in python

* it's python3

* we need different prints for if we succeed or if we fail.

* improve bash syntax

* remove some debugging code

* Remove debug script

* remove spaces for clarity

* consolidate parts and remove more test code

* s/certbot-version/certbot-metadata/g

* use sys.exit instead of exit

* find and save certbot version on the certbot side

* change presence test to new file

* switch to using packaging.version.parse instead of LooseVersion

* switch to requiring certbot version >= plugin version

* add plugin snap changes to generate script

* Add comment to generation file saying not to edit generated files manually

* Create post-refresh hook for all plugins with script

* generate files using new script

* update snapcraft.yaml files for plugins

* bin/sh comes first

* Add packaging to install_requires

* Check that refresh is allowed in integration test

* switch plug and slot names in integration test

* Update tools/generate_dnsplugins_postrefreshhook.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* small bash fixes

* Update snap readme with new instructions

* Run tools/generate_dnsplugins_postrefreshhook.sh

* Update tools/snap/generate_dnsplugins_postrefreshhook.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-22 17:07:07 -07:00
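A hypothetical sketch of the version gate described in #8145 above, using `packaging.version.parse` as the final commits mention (the function name and version strings are illustrative):

```python
import sys

from packaging import version


def refresh_allowed(certbot_version, plugin_version):
    """Allow a plugin snap refresh only if the installed Certbot is new enough."""
    return version.parse(certbot_version) >= version.parse(plugin_version)


if __name__ == "__main__":
    if not refresh_allowed("1.9.0", "1.10.0"):
        print("Certbot is too old for this plugin version; blocking the refresh.")
        sys.exit(1)
```
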
Adrien Ferrand
14dfbdbea5 Build snaps using the remote-build feature (#8153)
Snapcraft has a feature named `remote-build`. It allows compiling snaps for several architectures using Canonical's dedicated build infrastructure. Compared to the QEMU-enabled Docker approach currently used, remote build has several advantages:
* the builds are done on the native architecture, making them considerably faster than what can be achieved with QEMU
* it avoids depending on `adferrand/snapcraft` (which could otherwise be fixed by the merge of https://github.com/snapcore/snapcraft/pull/3144, but that will not happen in the short term)
* when everything goes well, all snap builds can run in parallel and be orchestrated by a single Azure Pipelines job, since the heavy tasks are done remotely

This PR makes the necessary adjustments to use the remote-build feature instead of the QEMU-enabled Docker approach.

One complex task was being able to compile the `certbot` snap on `arm64` and `armhf`. On these architectures the pre-compiled wheel for `cffi` is not available, so it needs to be compiled during the snap build. Sadly, the current version of the python plugin in snapcraft is limited by the fact that `wheel` is not installed in the virtual environment set up to build the Python packages, and there is no easy way to change that short of overriding the whole build process.

In the long term, I think I will open a PR on the `snapcraft` Git repository to provide a proper solution. But for the short term, I used the ability to pass arguments to the `venv` module to add the `--system-site-packages` flag. With it, the virtual environment can use the system site packages, where `wheel` is available.

The other significant additions are in the `tools/snap/build_remote.py` script. While invoking a remote build on a local machine is quite straightforward, it is another story in CI, because we need build auditability and resiliency during these non-interactive runs. In particular, we should avoid inconsistent results in the nightly and release pipelines as much as possible.

So this script wraps the `snapcraft` call in retry logic and improves its logging in the context of parallel builds.

As for the minor modifications, they mainly ensure that the plugins can be built (some of them also need `cffi`, for instance) and simplify the Azure Pipelines configuration, since all snaps are retrieved in one go.

Please note that the `test-` branches still build only the `amd64` architecture. I noticed that builds on `arm64` and `armhf` tend to be very slow to start (up to 40 minutes), while the `amd64` ones wait at most 10 minutes, and usually only 30 seconds, when the overall load on Canonical's side is low.

To work on the `certbot/certbot` repository, one secure file needs to be added, because `snapcraft` needs to authenticate against Launchpad with credentials allowing remote builds. To do so, from a local machine that has this capability, one can extract the existing file at `$HOME/.local/share/snapcraft/provider/launchpad/credentials` and register it as a secure file in Azure Pipelines with the name `snapcraftRemoteBuildCredentials`.

* Define scripts

* Setup pipeline to use remote builds

* Focus on packaging builds

* Set credentials

* Setup git

* Launch all builds in parallel

* Add dev dependencies to build cffi and cryptography

* Convert to a python logic

* Reorganize the pipeline

* Handle the fact that snap builds may be taken from cache

* Generate constraints

* Exit code

* Check existence

* Try to handle better non zero exit code

* Add --system-site-packages to get wheel in the venv

* Add executable permissions

* Troubleshoot

* Dynamic display, take the maximum timeout for snap build job

* Allow retries if the remote build does not start

* Trigger only amd64 builds for test branches

* Exit properly

* Update snapcraft.yaml

* Fix snap run

* Set secured file name

* Update .azure-pipelines/templates/jobs/packaging-jobs.yml

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update .azure-pipelines/templates/jobs/packaging-jobs.yml

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update .azure-pipelines/templates/jobs/packaging-jobs.yml

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Move order in deps

* Reactivate all builds

* Use Manager() as a context manager

* Use Pool as a context manager

* Some nice refactorings

* Check snapcraft execution interruption with exit codes

* Use f-string and format expressions

* Start log

* Consistent use of single/double quotes

* Better loop to extract lines

* Retry on build failures

* Few optimizations

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-22 16:05:20 -07:00
Adrien Ferrand
270b5535e2 Add warnings about Python 3.5 deprecation in Certbot (#8154)
Fixes #8149

This PR adds warnings about the upcoming deprecation of Python 3.5 support in Certbot.

* Add warnings about Python 3.5 deprecation in Certbot

* Update certbot/certbot/__init__.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-20 17:46:38 -07:00
Adrien Ferrand
74b0340a13 Use a specific tag of adferrand/snapcraft to build QEMU snaps and avoid failures (#8158)
The latest builds of snapcraft have somehow introduced several failures when snaps are built on QEMU for armhf. See https://dev.azure.com/certbot/certbot/_build/results?buildId=2326&view=logs&j=7c548e18-6053-5a42-b366-e6480da09a69&t=a7c7ca26-ae0c-54e6-0355-3bfcd7bab03c for instance.

This PR uses a specific tag of `adferrand/snapcraft`, extracted from the last known working `nightly` pipeline, to avoid these failures until a more permanent fix is made. Very likely that fix will be the move to snapcraft remote builds.

* Use a specific tag of adferrand/snapcraft to build snaps and avoid an error on QEMU for armhf.

* Update tools/snap/build.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update tools/snap/build_dns.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-20 17:17:10 -07:00
Adrien Ferrand
b13dfc6437 Do not create the symlink for test assets on Windows if the asset path is already a symlink (#8159) 2020-07-21 01:01:09 +02:00
schoen
c5bab9b07c Merge pull request #8157 from stefantalpalaru/linodedns
certbot_dns_linode: decrease the default propagation interval
2020-07-20 13:22:18 -07:00
Ștefan Talpalaru
b6964cae2e certbot_dns_linode: decrease the default propagation interval
«When you add or change DNS zones or records, your changes will now be
reflected at our authoritative nameservers in under 60 seconds. This is
down from the previous “every quarter hour” approach that we had for so
long.» - https://www.linode.com/blog/linode/linode-turns-17/
2020-07-19 16:44:37 +02:00
Brad Warren
ebf1349b15 Update to IPython with Python 3.8 support. (#8152) 2020-07-17 13:01:04 -07:00
Brad Warren
9d2e0ac013 Specify the Certbot snap grade. (#8147) 2020-07-17 12:47:11 -07:00
Thomas
05dbda4b51 added inwx plugin (#8115)
* added inwx plugin

* Update using.rst

fixed convention naming
2020-07-15 13:41:15 -07:00
Brad Warren
40a2a5b99f Release version v1.6.0 2020-07-14 17:17:36 -07:00
Adrien Ferrand
68b3b048b9 Use 3rd party plugins without prefix + set a deprecation path for the prefixed version (#8131)
Fixes #4351

This PR proposes a solution for using third-party plugins without the `pip_package_name:` prefix in the plugin name, plugin-specific flags, and keys in DNS plugin credential files.

A first solution was proposed in #6372, and a more advanced one in #7026. #7026 also added a deprecation warning when the old plugin name `pip_package_name:plugin_name` was used.

However, there were some limitations with #7026, in particular the fact that existing flags of the form `pip_package_name:dns_plugin_option` or keys like `pip_package_name:key` in DNS plugin credential files were no longer read. This would have led to silent failures during renewals if the configuration was not explicitly updated by the user.

I tried to fix that based on #7026, but the changes needed are complex and create new problems of their own, such as the unexpected erasure of values in the renewal configurations.

Instead, this PR tries a new approach: when `find_all()` is called, the `PluginsRegistry` in the `certbot._internal.plugins.disco` module registers two plugins for each entry point referring to a third-party plugin:
* one plugin with the name `plugin_name`
* one plugin with the name `pip_package_name:plugin_name` (like before)

This way, every existing configuration continues to work without any change (credentials, renewal configuration, CLI flags), and new configurations can refer to the plugin name without the prefix and use the appropriate CLI flags and credentials without it.
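As a rough sketch of that dual registration (illustrative only, not the actual `certbot._internal.plugins.disco` code; the real registry applies the prefixed alias only to third-party plugins and marks it as deprecated):

```python
import pkg_resources

def find_all(plugins=None):
    """Register each plugin entry point under both its short and prefixed names."""
    plugins = {} if plugins is None else plugins
    for entry_point in pkg_resources.iter_entry_points('certbot.plugins'):
        plugin_cls = entry_point.load()
        plugins[entry_point.name] = plugin_cls                        # new, unprefixed name
        prefixed = f'{entry_point.dist.project_name}:{entry_point.name}'
        plugins[prefixed] = plugin_cls                                # legacy prefixed name
    return plugins
```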

On top of that, I added the deprecation path given in #7026 (thanks @coldfix!):
* the plugin named `pip_package_name:plugin_name` is hidden from the `certbot plugins` output
* the help for this plugin is still displayed, with a deprecation warning shown in its description
* when invoked, the same deprecation warning is displayed in the terminal

* Support both prefixed and not prefix third party plugins

* Adapt tests

* Add deprecation path

* Named parameters

* Add deprecation warning in CLI

* Add a changelog
2020-07-10 09:16:21 -07:00
Adrien Ferrand
d434b92945 Build the DNS plugins snaps (#8129)
Fixes #8041

This PR makes Azure Pipelines build the DNS plugin snaps for the three architectures during CI.

It leverages the existing logic for building the Certbot snap to deploy a QEMU environment with Docker, and uses the local PyPI index to speed up the build when installing `cffi` and `cryptography`.

All DNS plugin snaps are built in a single Docker container, in order to save the time required to install the system dependencies on the first start of `snapcraft`, and so speed up the build significantly.

In the end, all `amd64` DNS plugin snaps are built within 6 minutes. For `arm64` and `armhf`, it is around 40 minutes: this is actually quite fast, considering that 14 DNS plugin snaps are built.

However, building for all three architectures is still an extremely heavy task, even for Azure Pipelines with its capacity for 10 parallel jobs. That is why I skip the `arm64` and `armhf` builds for the `full-test-suite` pipeline and run them only for `nightly` and `release`. This means, however, that these builds will not be done for the release branches. If this is a problem, I can add a more elaborate suspend condition to trigger the builds in that case.

All snaps are stored in the pipeline's artifact storage, making them available for publication during a `release` pipeline.

The PR is set as a draft for now, because I am temporarily using `pr_test-suite` to validate the packaging jobs when commits are pushed. Once the PR is ready, I will revert it to the normal configuration (running the standard tests).

* Configure a script to build DNS snaps

* Focus on packaging

* Trigger all architectures

* Add extra index

* Prepare conditional suspend

* Set final suspend logic

* Set final suspend value

* Loop for publication

* Use python3

* Clean before build

* Add a test

* Add test job in Azure

* Preserve env

* Apply normal config for pipelines

* Skip QEMU jobs only for test branches

* Makes snap run tests depends also on the Certbot snap build

* Update .azure-pipelines/templates/jobs/packaging-jobs.yml

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update .azure-pipelines/templates/stages/deploy-stage.yml

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* More accurate way to get the plugin snap name

* Integrate DNS snap tests into certbot-ci

* Fixes

* Update certbot-ci/snap_integration_tests/conftest.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-ci/snap_integration_tests/conftest.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Clean an _init_.py file

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-07-09 11:33:25 -07:00
bdeweygit
1697d66ba7 Be more informative about reasons for using Docker (#28)
People who are considering running Certbot with Docker are probably doing so because their webserver is to be run with Docker. These changes to the README should help them to understand that doing so will require knowledge of Docker volumes and that the architectural justification for running Certbot in a separate container is the "one service per container" best practice.
2020-07-09 09:47:38 -07:00
J0WI
a6a998d11b Upgrade to Alpine 3.12 (#27) 2020-07-08 18:32:32 +02:00
Brad Warren
f82e2cc714 s/snapcraft push/snapcraft upload/g (#8137) 2020-07-08 08:05:33 +02:00
Brad Warren
433c6f391c Merge pull request #8136 from certbot/candidate-1.6.0
Update files from 1.6.0 release
2020-07-07 11:45:39 -07:00
Brad Warren
d64bb81864 Fix typo (#26) 2020-07-07 20:18:06 +02:00
Brad Warren
88e183e69e Release version 1.6.0 2020-07-07 11:04:52 -07:00
Brad Warren
590eeca38a Bump version to 1.7.0 2020-07-07 10:33:16 -07:00
Brad Warren
b9a25c3987 Add contents to certbot/CHANGELOG.md for next version 2020-07-07 10:33:15 -07:00
Brad Warren
41b99eba79 Release 1.6.0 2020-07-07 10:33:13 -07:00
Brad Warren
de39a42e6a Update changelog for 1.6.0 release 2020-07-07 10:13:21 -07:00
Adrien Ferrand
183ccc64b1 Some improvements (#8132)
Short PR to improve some things during snap builds:
* clean up snapcraft assets before a build, in order to avoid some weird errors when two builds are executed consecutively without cleanup
* use python3 explicitly in `tools/simple_http_server.py`, because on several recent distributions the `python` binary is no longer exposed, only `python2` or `python3`.
2020-07-06 16:04:59 -07:00
Brad Warren
6bca930752 Remove unnecessary symlink (#8135)
This isn't needed anymore thanks to the line:
```
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
```
2020-07-06 15:31:24 -07:00
Brad Warren
cd993cdfb1 Remove grade devel from Certbot snap. (#8128)
If you go to a URL like https://snapcraft.io/certbot/releases and try to move the Certbot snap into the candidate or stable channels, you cannot do so. There is a tooltip which says that revisions with the grade devel cannot be promoted to candidate or stable channels.

The documentation for `grade` can be found at https://snapcraft.io/docs/snapcraft-yaml-reference where it says the value is optional and

> Defines the quality grade of the snap.
Type: enum
Can be either devel (i.e. a development version of the snap, so not to be published to the stable or candidate channels) or stable (i.e. a stable release or release candidate, which can be released to all channels)
Example: [stable or devel]

I'm working on a proposal for our next steps for snaps which involves moving the Certbot snap to the stable channel. I of course won't make those changes without giving others a chance to share their opinion, but I'd like to avoid the situation where we're technically unable to move the Certbot 1.6.0 snap to the stable channel despite wanting to do so.

I started to make the same changes to the DNS plugins, but I personally think it's too soon to propose stable versions of those yet and `grade` is a simple way to ensure we don't accidentally promote something there.

You can see the snap being built and run successfully with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=2246&view=results.
2020-07-06 12:31:55 -07:00
Brad Warren
9f994d7a50 Run at 4:30 UTC to have Azure reparse YAML file. (#8133) 2020-07-06 20:41:26 +02:00
Brad Warren
4f3dc8862d Switch build status to nightly pipeline. (#8127)
The advanced pipeline no longer exists.
2020-07-02 16:05:28 -07:00
Brad Warren
48139f382d Do not build pushes to master. (#8126) 2020-07-03 01:00:35 +02:00
Adrien Ferrand
8a3a8c7097 Migrate the CI pipeline from Travis to Azure Pipeline (#8098)
Fixes #8071 and fixes https://github.com/certbot/certbot/issues/8110.

This PR migrates every job from Travis to Azure Pipelines.

It essentially converts the Travis jobs into Azure Pipelines jobs with identical functionality (or I made a mistake). The jobs are added to the relevant existing pipelines (`main`, `nightly`, `advanced-test`, `release`). A global refactoring using the templating system greatly reduces the verbosity of the pipeline descriptions.

A specific feature (not present in Travis) is added: the `On_Failure` stage. Using the Mattermost API directly, it reports pipeline failures to a Mattermost channel with a link to the failed pipelines, without requiring authentication to Microsoft.
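As a rough illustration of that kind of notification step (a hypothetical sketch; the webhook URL, helper name, and message format are illustrative, not the pipeline's actual configuration):

```python
import json
import urllib.request

def notify_failure(webhook_url, pipeline_name, build_url):
    """Post a failure notice to a Mattermost incoming webhook."""
    payload = {'text': f'Pipeline `{pipeline_name}` failed: {build_url}'}
    request = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode('utf-8'),
        headers={'Content-Type': 'application/json'},
    )
    with urllib.request.urlopen(request) as response:
        return response.status
```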

See https://github.com/certbot/certbot/pull/8098#issuecomment-649873641 for the post-merge actions to take at the end of this work.
2020-07-02 15:01:21 -07:00
ohemorange
cb3ff9ef18 Set up CentOS 8 test farm tests (#8122)
Fixes #7420.

* Set up CentOS 8 test farm tests

* Don't add to apache2_targets until 7273 is resolved

* Start upgrade test from a version that works on centos 8

* remove when possible from targets
2020-07-01 17:07:41 -07:00
alexzorin
f743dbec3a certbot: add --preferred-chain (#8080)
* acme: add support for alternative cert. chains

* certbot: add --preferred-chain

* remove support for issuer SKI matching

* show --preferred-chain in "run" help

* warn if no chain matched and it's not a dry-run

* fix existing failing tests

* add unit, integration tests

* bump acme dependency to dev version

* simplify test to avoid py2.7 recursion bug

* add preferred_chain to STR_CONFIG_ITEMS

* reduce preferred_chain warning to info level

* acme: fix some docstrings in .messages

* certbot: fix docstring in crypto_util

* try to fix certbot-nginx acme dep problem
2020-06-30 17:45:39 -07:00
ohemorange
2af297d72f Make each DNS plugin respect EXCLUDE_CERTBOT_DEPS (#8117)
* Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set

* import os
2020-06-29 16:58:26 -07:00
Brad Warren
95ef53e5d5 Add missing spaces to manual plugin help. (#8116) 2020-06-29 13:34:24 -07:00
Brad Warren
24c5fab8b6 Add awscli to requirements.txt (#8113) 2020-06-25 16:52:56 -07:00
ohemorange
713b91495b Fix paths when calling out to programs outside of snap (#8108)
Fixes #8093.

This PR modifies and audits all uses of `subprocess` and `Popen` outside of tests, `certbot-ci/`, `certbot-compatibility-test/`, `letsencrypt-auto-source/`, `tools/`, and `windows-installer/`. Calls to outside programs have their `env` modified to remove the `SNAP` components of paths, if they exist. This includes any calls made from hooks, calls to `apachectl` and `nginx`, and to `openssl` from `ocsp.py`.
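Per the commits below, the helper ended up as `certbot.util.env_no_snap_for_external_calls()`; the following is only a simplified sketch of the idea, not the actual implementation:

```python
import os

def env_without_snap_paths():
    """Return a copy of os.environ with snap-internal entries removed from PATH."""
    env = os.environ.copy()
    snap_prefix = env.get('SNAP')
    if snap_prefix:
        env['PATH'] = os.pathsep.join(
            entry for entry in env.get('PATH', '').split(os.pathsep)
            if not entry.startswith(snap_prefix)
        )
    return env

# External programs are then invoked with the cleaned environment, e.g.:
# subprocess.check_output(['nginx', '-v'], env=env_without_snap_paths())
```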

For testing manually, rsync flags will look something like:

```
rsync -avzhe ssh root@focal.domain:/home/certbot/certbot/certbot_*_amd64.snap .
rsync -avzhe ssh certbot_*_amd64.snap root@centos7.domain:/root/certbot/
```

With these modifications, `certbot plugins --prepare` now passes on CentOS 7.

If I'm wrong and we package the `openssl` binary, the modifications should be removed from `ocsp.py`, and `env` should be passed into `run_script` rather than set internally in its calls from nginx and apache.

One caveat with this approach is the disconnect between why it's a problem (packaging) and where it's solved (internal to Certbot). I considered a wrapping approach, but we'd still have to audit specific calls. I think the best way to address this is robust testing; specifically, running the snap on other systems.

For hooks, all calls will remove the snap paths if they exist. This is probably fine, because even if the hook intends to call back into Certbot, it can do that; it will just start a new snap process.

I'm not sure if we need these modifications for the macOS/Darwin calls, but they can't hurt.

* Add method to plugins util to get env without snap paths

* Use modified environment in Nginx plugin

* Pass through env to certbot.util.run_script

* Use modified environment in Apache plugin

* move env_no_snap_for_external_calls to certbot.util

* Set env internally to run_script, since we use that only to call out

* Add env to mac subprocess calls in certbot.util

* Add env to openssl call in ocsp.py

* Add env for hooks calls in certbot.compat.misc.

* Pass env into execute_command to avoid circular dependency

* Update hook test to assert called with env

* Fix mypy type hint to account for new param

* Change signature to include Optional

* go back to using CERTBOT_PLUGIN_PATH

* no need to modify PYTHONPATH in env

* robustly detect when we're in a snap

* Improve env util fxn docstring

* Update changelog

* Add unit tests for env_no_snap_for_external_calls

* Import compat.os
2020-06-25 15:36:29 -07:00
dmmortimer
0f4c31c9c7 Generalize renewal rate limit UI warning message (#3456) (#8061)
- Old text hard-codes the rate limit
- Let's Encrypt CA might change its rate limit
- Other CAs might have different rate limits

Update CHANGELOG.md
2020-06-25 11:43:08 -07:00
dkp
b9a8248541 Remove SSL Labs From Certbot Output (#8109)
The Apache plugin expects clients to support SNI, but
SSL Labs tries without SNI and includes the results
in their score.

Closes certbot/certbot#7728
2020-06-25 11:42:07 -07:00
Brad Warren
8027430625 Correct plugin constraints. (#8104) 2020-06-23 14:16:41 -07:00
ohemorange
bce14ae65f Make DNS plugin snaps use core20 (#8106)
Fixes #8103.

* Update the DNS plugin generator script to core20 syntax

* Generate new snapcraft.yamls for the DNS plugins

* Update certbot.wrapper to search for python3.8 paths
2020-06-23 09:31:08 -07:00
Adrien Ferrand
25d1977d4f Add script and generated snapcraft.yaml files (#8096)
This PR adds a proper snapcraft.yaml file for each DNS plugin, and provides a shell script to generate them.
2020-06-22 17:07:08 -07:00
Brad Warren
46eb4ec7e3 Remove unneeded step to create constraints file. (#8102) 2020-06-22 16:48:50 -07:00
Brad Warren
3ae8fa640b Remove snap-plugin from README (#8101) 2020-06-22 15:49:18 -07:00
Adrien Ferrand
035b6514db Build the constraints file inside snapcraft (#8097)
Fixes #7957

This PR makes snapcraft itself generate the dependency constraints file during the snap build, instead of having an external script do it before calling snapcraft.
2020-06-22 13:54:18 -07:00
Brad Warren
f151099342 Do not unstage generated cffi augeas (#8094)
This line seems to refer to [augeas.py](https://github.com/hercules-team/python-augeas/blob/v0.5.0/augeas.py) from the version of Augeas we normally have pinned. This was necessary when we were installing each Certbot component in separate parts and only combining them later to ensure that the Augeas fork (which uses cffi) was used instead of the unmodified pinned version of Augeas.

Since everything is installed in one part and we're removing the Augeas pinning now though, this line is no longer necessary. You can see the snap being built and tested successfully with this change at https://travis-ci.com/github/certbot/certbot/builds/172134518.

* Do not unstage generated cffi augeas

* Add comment about deleted line.
2020-06-22 12:11:38 -07:00
Florian Klink
25e79e4aca tree-wide: use LooseVersion instead of StrictVersion (#8081)
According to `distutils/version.py`, StrictVersion is pretty strict about
which version numbers it accepts:

> A version number consists of two or three dot-separated numeric
> components, with an optional "pre-release" tag on the end.  The
> pre-release tag consists of the letter 'a' or 'b' followed by a number.

This assumption already fails for some pretty basic Python libraries, like
setuptools, which is also available as `46.1.3.post20200610`, a completely
valid version number according to
https://www.python.org/dev/peps/pep-0440/#post-releases.

There doesn't seem to be a particular reason why StrictVersion was used
here, so let's use LooseVersion, to be compatible with these versions.
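For reference, the difference is easy to reproduce (a standalone example, not code from this change):

```python
from distutils.version import LooseVersion, StrictVersion

print(LooseVersion('46.1.3.post20200610'))   # parses fine
try:
    StrictVersion('46.1.3.post20200610')
except ValueError as err:
    print(err)                               # invalid version number '46.1.3.post20200610'
```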

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-06-19 17:11:35 +02:00
Brad Warren
db064a4109 Use certbot.wrapper during renewal. (#8095) 2020-06-18 16:41:12 -07:00
Adrien Ferrand
860af81fef Deferred EFF subscription until the first certificate is successfully issued (#8076)
* Base logic

* Various controls when email is None

* Adapt eff tests

* Forward compatibility

* Also for csr

* Explicit regr or meta updates in account objects

* Adapt logic to ask for eff subscription during registering

* Adapt tests

* Move dry-run control

* Add some relevant controls on handle_subscription call checks
2020-06-18 15:58:19 -07:00
ohemorange
70c8481fd8 Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set in plugins (#8091)
This will allow DNS plugin snaps to build if they rely on unreleased acme/certbot, and removes the other copy of Certbot from externally snapped plugins. Fixes #8064 and fixes #7946. Implementation is based on the design [here](https://github.com/certbot/certbot/issues/8064#issuecomment-645513120).

To test, see reverted commit 8632064. Steps taken:

- added changes to setup.py and snapcraft.yaml
- successfully snapped, connected, ran `sudo certbot plugins --prepare`
- added temporary changes to have both certbot and certbot-dns-dnsimple use DNSAuthenticator2
- snapped and installed certbot, `certbot plugins` failed as expected.
- snapped and installed certbot-dns-dnsimple, `sudo certbot plugins --prepare` succeeded
- Inspected dns plugin's `bin` and `lib`; no `certbot` or `acme`, as expected.
```
$ ls /snap/certbot-dns-dnsimple/current/lib/python3.6/site-packages/
OpenSSL                                         future-0.18.2.dist-info        requests_file.py
PyYAML-5.3.1.dist-info                          idna                           setuptools
_cffi_backend.cpython-36m-x86_64-linux-gnu.so   idna-2.9.dist-info             setuptools-47.3.1.dist-info
certbot_dns_dnsimple                            lexicon                        six-1.15.0.dist-info
certbot_dns_dnsimple-1.6.0.dev0-py3.6.egg-info  libfuturize                    six.py
certifi                                         libpasteurize                  tldextract
certifi-2020.4.5.1.dist-info                    past                           tldextract-2.2.2.dist-info
cffi                                            pip                            urllib3
cffi-1.14.0.dist-info                           pip-20.1.1.dist-info           urllib3-1.25.9.dist-info
chardet                                         pkg_resources                  wheel
chardet-3.0.4.dist-info                         pyOpenSSL-19.1.0.dist-info     wheel-0.34.2.dist-info
cryptography                                    pycparser                      yaml
cryptography-2.8.dist-info                      pycparser-2.20.dist-info       zope
dns_lexicon-3.3.26.dist-info                    requests                       zope.interface-5.1.0-py3.6-nspkg.pth
easy_install.py                                 requests-2.23.0.dist-info      zope.interface-5.1.0.dist-info
future                                          requests_file-1.5.1.dist-info
$ ls /snap/certbot-dns-dnsimple/current/bin/
chardetect  futurize  lexicon  pasteurize  tldextract
```
- reset to HEAD^
- snapped and installed certbot to not have the DNSAuthenticator2 changes, `certbot plugins` failed as expected.

* Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set

* Set EXCLUDE_CERTBOT_DEPS in certbot-dns-dnsimple/snap/snapcraft.yaml
2020-06-18 15:24:10 -07:00
ohemorange
c5e5594ac3 Switch to using snap-constraints in dns plugin so the .gitignore catches it. (#8092) 2020-06-18 15:15:41 -07:00
ohemorange
ad8ffc1bf0 Merge pull request #8089 from certbot/squashed-snap-plugin
Merge the snap-plugin branch into master
2020-06-18 12:59:04 -07:00
Erica Portnoy
29a23d3148 Update README instructions for merged master
Use Focal when following README instructions

Discard changes to disco.py

bring README up to date with current knowledge

Move README to snap/local/
2020-06-18 12:49:58 -07:00
Brad Warren
44b1bd8e0e Update README
fix branch name

grammar

remove readme link

Remove links to Robie's repo.

say you cant use docker

Commands not command

Update publishing permissions section.

Have Certbot trust plugins.

Do not run snapcraft with sudo.
2020-06-18 12:20:57 -07:00
Robie Basak
0bebdedcbc Initial revision
Fix headings

Fix error in build instructions
2020-06-18 12:20:56 -07:00
Brad Warren
8ccc96bbdd update certbot-dns-dnsimple snapcraft.yml.
update dnsimple snapcraft.yml
2020-06-18 12:20:53 -07:00
Robie Basak
4a2618d415 Add CERTBOT_PLUGIN_PATH support
Initial revision

Fix interface export path

Switch to strict confinement

Normalise slot parameters
2020-06-18 12:20:18 -07:00
ohemorange
bcb3554836 Merge pull request #8086 from certbot/core20-squashed
Upgrade snap to be based on core20

This PR makes several changes to be built on top of the core20 base snap. Fixes #7854.

The main changes are to `snapcraft.yaml`. With the Snapcraft 4.0/core20 base, the python plugin is a thin wrapper that basically creates a `venv` and installs the packages from the source. The trouble with this is that it byte-compiles the packages, creating `__pycache__` caches that conflict between the different parts. To solve that, we put everything in a single part. Other changes include:

- We use classic confinement, so we need to specify a bunch of python packages in `stage-packages`, as mentioned [here](https://forum.snapcraft.io/t/trouble-bundling-python-with-classic-confinement-in-core20-4-0-4/18234/2).
- The certbot executable now lives in `bin`, so we run it from there.
- Since `python-augeas` is now being pulled into the single part, remove the pinning from constraints so we can use the latest version directly from GitHub.
- Precompile our `cryptography` and `cffi` wheels to be based on python3.8.

Separately, we had to upgrade the snapcraft Docker image to be based on focal, due to the thin-wrapper situation. This was accomplished [here](https://github.com/adferrand/snapcraft/pull/1).
2020-06-17 17:20:46 -07:00
Erica Portnoy
0f53e8ad4e Upgrade snap to be based on core20
* Get rid of a whole bunch of error message
* Remove some more overlaps
* don't use certbot from nginx and apache
* use python3 from bin
* certbot needs to be in bin
* try to exclude just the certbot folder
* try a couple things to use the python from the venv bin
* play around with which versions of things we want from each package
* ok, certbot-nginx does need to stage bin
* certbot needs to not stage bin. why does certbot not put certbot in bin?
* fail to inspect more versions of things in the container shell
* take cffi backend from python-augeas
* if we use certbot from bin things should work?
* why is bin not in path? no idea, but let's get it compiled then inspect things in the snap shell
* use snap.certbot instead of bin/certbot
* it does require bin/certbot. I don't know why.
* let's see if we can stick it all in one step
* try installing local subdirectories
* move python-augeas into the single part
* remove after
* put back python-augeas part for now; ERROR: Could not satisfy constraints for 'python-augeas': installation from path or url cannot be constrained to a version
* how was this previously working without git installed? install git.
* maybe it needs to already have python3-wheel installed
* maybe wheel will install first if I change it to -e
* no -e
* maybe try a different python3 package to stage
* this last change wasn't necessary
* remove the bin/ from renew
* nope, it does need bin/certbot
* back to wget
* stage a bare python3
* add all necessary python packages to stage-packages
* pretty sure we don't actually need wheel. let's try removing it!
* remove python-augeas, since we have it pinned to an older version in cb-auto that might work
* stage augeas
* still need libaugeas-dev
* ok let's try building
* combining into one part works! just make sure to unpin python-augeas when generating snap-constraints.txt
* change our scripts to unpin python-augeas
* Use ubuntu 20 in compile_native_wheels.sh
* .travis.yml should use python3-dev instead of python-dev
* jk! we don't need python3-dev in travis
* Update cffi and cryptography wheels for ubuntu20 version of python
* looks like we need python3-dev to build things
* Remove deprecated i386 wheels
2020-06-17 16:57:51 -07:00
Brad Warren
2a18ae6d57 Trivially upgrade to core20? 2020-06-17 16:52:17 -07:00
Brad Warren
3c4b922197 Remove the need for TRAVIS to be set. (#8084)
I initially added this when the script was doing things like migrating all LXD containers to the snap. The external side effects are now pretty minimal though, so I think we can remove the need for this environment variable, which makes the script easier to use outside of CI for manual testing.
2020-06-17 14:41:11 -07:00
Adrien Ferrand
0bb1f0b2ce Drop i386 architecture on snap build (#8083)
This PR removes the i386 architecture from the snap build process, because the base snap `core20` is not available for this architecture.
2020-06-16 15:57:05 -07:00
Brad Warren
ba192f321d Remove on_cancel which isn't recognized by Travis. (#8077) 2020-06-15 13:27:23 -07:00
Cameron Steel
961c573864 dns-cloudflare: Update docs and error messages to reflect new API permissions (#8015)
* Tweaks for improved Cloudflare API

* Update docs for dns-cloudflare

* Update tests and changelog

* Fix bad merge

* Fix error code for record add

* Improve error message

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-12 20:38:13 +02:00
Brad Warren
fb39de7d01 add changelog entry (#8067) 2020-06-11 10:19:51 +02:00
Brad Warren
97fcfd40d1 starts not stats (#8066) 2020-06-10 16:58:46 -07:00
Adrien Ferrand
9ac476e87b Certbot snap multiarch build (#8016)
This PR proposes a way to build the certbot snap for several architectures, using a QEMU-based emulation approach, with several optimizations to keep the build time of each snap below 20 minutes.

Most of the reasoning for the approach proposed here is described in the original PR: https://github.com/basak/certbot-snap-build/pull/27

On top of that, I added a docker pull of a pre-compiled snapcraft image, instead of compiling it during the Travis pipeline, to save another 5 to 7 minutes on each snap build. The images are compiled and stored here: https://hub.docker.com/repository/docker/adferrand/snapcraft. Depending on when the PR is reviewed, we can:
* continue to use `adferrand/snapcraft`
* move its logic to certbot scope and use something like `certbot/snapcraft`
* wait for https://github.com/snapcore/snapcraft/pull/3144 to be merged, and use `snapcore/snapcraft`.

* Backport https://github.com/basak/certbot-snap-build/pull/27 into Certbot project

* Fix build deps

* Integrate proactively #8012 to fix builds on non-amd64 archs

* Configure jobs on Travis

* Focus on snap builds. Disable temporarily some jobs. Disable deploy actions by security.

* Specify TARGET_ARCH during snap build

* Do not do anything if TOXENV is not set

* Various optimizations

* Use a recent version of ubuntu to get correct snap features out of the box

* Add up to date wheels

* Organizing scripts

* Set dest dir

* Get back original configuration for Travis

* Add comments

* Update common_libs.sh

* Use adferrand/snapcraft

* Test build

* Stable snapcraft

* Update build_and_install.sh

* Move back snap builds to the cron/release pipeline

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/build_and_install.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Enable i386 builds, various optimizations

* Update dependencies

* Configure a simple http server to serve the pre compiled wheels

* Fix wheels compilation

* Relax permissions

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-10 14:33:02 -07:00
ohemorange
8d776fb7ac Make Certbot find externally snapped plugins (#8054)
* Add snap plugin support

Switch to a PoC branch of certbot that supports the new
CERTBOT_PLUGIN_PATH and wrap the snap to set this variable correctly
based on the content interfaces connected.

(cherry picked from commit 7076a55fd82116d068e2aca7239209b7203917d2)

* Modify certbot.wrapper to append to PYTHONPATH instead of separate CERTBOT_PLUGIN_PATH variable

* Update certbot-wrapper to python3.6 version

* add source field

* Update changelog

* Use bash instead of sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Exit if something goes wrong

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* No leading : when modifying empty PYTHONPATH

* Improve bash handling of PYTHONPATH manipulation

Co-authored-by: Robie Basak <robie.basak@canonical.com>
Co-authored-by: Brad Warren <bmw@eff.org>
Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-10 13:52:56 -07:00
Adrien Ferrand
50fa04ba0c Implement umask for Windows (#7967)
This PR stems from an observation I made on the current version of Certbot (1.3.0): the `renewal-hooks` directory in the Certbot configuration directory is created on Windows with write permissions for everybody.

I thought it was a critical bug, since this directory contains hooks that are executed by Certbot, and you certainly do not want this folder to be open to any malicious hook that could be inserted by anyone and then executed with administrator privileges by Certbot.

It turns out that for this specific problem the bug is not critical for the hooks, because the scripts are expected to be in subdirectories of `renewal-hooks` (namely `pre`, `post` and `deploy`), and these subdirectories have proper permissions because we set them explicitly when Certbot starts.

Still, there is a divergence here between Linux and Windows: on Linux all Certbot directories without explicit permissions have at most `0o755` permissions by default, while on Windows it is a `0o777` equivalent. It is not an immediate security risk, but it is definitely error-prone and unexpected, and so a potential breach in the future if we forget about it.

The root cause is that umask does not exist on Windows. Under Linux the umask defines the default permissions when you create a file or a directory. Python takes that into account, with an API for `os.open` and `os.mkdir` that exposes a `mode` parameter with a default value of `0o777`. In practice the result is never `0o777` (whether you set `mode` explicitly or leave the default) because the effective mode is masked by the system's current umask value: on Linux it is `0o022`, so files/directories have a maximum mode of `0o755` if you did not set the umask explicitly, and this is what is observed for Certbot.

However, on Windows the `mode` value passed (or taken from the default) to `open` and `mkdir` in the `certbot.compat.filesystem` module is taken verbatim, since umask does not exist, and is then used to calculate the DACL of the newly created file/directory. So if the mode is not set explicitly, we end up with files and directories with `0o777` permissions.

This PR fixes the problem by implementing umask behavior in the `certbot.compat.filesystem` module, which is applied to any file or directory created by Certbot, since we forbid using the `os` module directly.

The implementation is quite straightforward. For Linux the behavior is unchanged. On Windows a `mask` parameter is added to the function that calculates the DACL, to be invoked appropriately when a file or directory is created. The actual value of the mask is taken from an internal class of the `filesystem` module: its default value is `0o755` to match default umasks on Linux, and it can be changed with the new `umask` method, which has the same behavior as the original `os.umask`. Of course, `os.umask` becomes a forbidden function and `filesystem.umask` must be used instead.
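As a simplified illustration of the masking behavior being emulated (assuming the conventional `0o022` umask; this is not the actual `certbot.compat.filesystem` code, which also translates the result into a Windows DACL):

```python
_current_umask = 0o022  # conventional default umask on Linux

def umask(mask):
    """Mimic os.umask(): install a new mask and return the previous one."""
    global _current_umask
    previous, _current_umask = _current_umask, mask
    return previous

def effective_mode(mode=0o777):
    """Permission bits actually applied when a file or directory is created."""
    return mode & ~_current_umask

print(oct(effective_mode()))       # 0o755: the default result on Linux
print(oct(effective_mode(0o700)))  # 0o700: an explicit mode is still masked
```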

Existing code that is impacted has been updated, and new unit tests have been created for this new function.

* Implement umask for Windows

* Set umask at the beginning of tests

* Fix lint, update local oldest requirements

* Update certbot-apache/setup.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Improve tests

* Adapt filesystem.makedirs for Windows

* Fix

* Update certbot-apache/setup.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Changelog entries

* Fix lint

* Update certbot/CHANGELOG.md

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-09 17:08:22 -07:00
Brad Warren
e31834a6cd Stop running snapcraft with sudo (#8063)
* Do not use sudo when building the snap.

* add user to lxd

* Run with lxd group.
2020-06-09 14:46:11 -07:00
Brad Warren
340a4280ea Merge pull request #8053 from certbot/upgrade-acmev1
Read acmev1 Let's Encrypt server URL from renewal config as acmev2 URL
2020-06-09 11:43:06 -07:00
Brian Heim
cc07722b3e Fix certbot.compat.filesystem documentation (#8058)
* Fix bad rst docstrings

* AUTHORS.md: add Brian Heim

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-06-08 14:00:16 -07:00
Rasesh Patel
67bcf0f6bd Remove documentation for configuring ciphersuites (#8027) (#8056)
Issue #1123 discusses a feature that allows users to set the cipher
security level. That feature wasn't built. It didn't provide enough
user value to justify the corresponding increase in complexity. The
feature request and the associated discussion threads were closed.
However, the proposed API spec and the TODO section remained in the
cipher docs. They're a vestige of that issue from olden days and this PR
removes those last living traces...

Fixes #8027.
2020-06-08 12:15:44 -07:00
Brad Warren
1b2328f18b Add comment about pyca's use of tools script (#8044) 2020-06-08 12:14:02 -07:00
Brian Heim
560b9e5012 AUTHORS.md: fix GH url for Brandon Kreisel (#8059) 2020-06-08 12:13:29 -07:00
Lloyd Parkes
2f6fbe9987 Add support for NetBSD (#8033)
* Add support for NetBSD by telling certbot-nginx where the nginx
configuration directory is.

* Update the CHANGELOG.

* Pass the right type of sequence to "in". Thanks lint.

* Adjust the CHANGELOG.md entry following feedback from ohemorange.

Co-authored-by: Lloyd Parkes <lloyd@must-have-coffee.gen.nz>
2020-06-08 12:06:38 -07:00
Erica Portnoy
bebcad0588 update changelog 2020-06-04 15:57:34 -07:00
Erica Portnoy
92f26367eb Merge remote-tracking branch 'alexzorin/7979_restore_v1_as_v2' into upgrade-acmev1 2020-06-04 14:29:56 -07:00
alexzorin
d135e6140b apache: handle statically linked mod_ssl (#8007)
In #7771, the Apache configurator gained the ability to identify what
version of OpenSSL Apache's ssl_module is linked against. However, the
detection was only functional if the module was built as a DSO (which is
almost always the case).

This commit covers the case where the ssl_module is statically linked
within the Apache binary. It requires the user to specify the path to
the binary (with --apache-bin) and emits a warning if static linking is
detected but no path has been provided.
2020-06-04 10:34:10 -07:00
Adrien Ferrand
010b38fa10 Upgrade Certbot dependencies (#8036)
This PR upgrades Certbot pinned dependencies through `letsencrypt-auto-source/rebuild_dependencies.py` while taking into account the problems detected in https://github.com/certbot/certbot/pull/8035:
* `cryptography` is pinned to `2.8` to continue to support OpenSSL 1.0.1 on non-x86 ancient Linux distributions (RHEL 6 + Debian 8)
* `parsedatetime` is pinned to `2.5` because of an incompatibility with Python 2.7 (see https://github.com/bear/parsedatetime/issues/246)
* `letsencrypt-auto-source/rebuild_dependencies.py` now takes into account the environment markers that are added to `AUTHORITATIVE_CONSTRAINTS`: this is used for the `enum34` dependency, so that it is not installed on Python 3.6+ and does not break the distribution by replacing the built-in `enum` module during the setup of the Certbot venv.

Fixes #8030

* Pin cryptography and parsedatetime

* Upgrade dependencies

* Remove authoritative constraint

* Upgrade dependencies

* Rebuild certbot-auto

* Update letsencrypt-auto-source/rebuild_dependencies.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Honor specific requirements in the AUTHORITATIVE_CONSTRAINTS

* Fix injection

* Update dependencies

* Update rebuild_dependencies.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-04 08:59:45 -07:00
ohemorange
8c8d3fab91 Merge pull request #8040 from certbot/candidate-1.5.0
Release 1.5.0
2020-06-02 12:19:39 -07:00
Brad Warren
8192e3eb85 Release version v1.5.0 2020-06-02 11:43:12 -07:00
Brad Warren
baf69d210b Bump version to 1.6.0 2020-06-02 10:32:41 -07:00
Brad Warren
beea2d2208 Add contents to certbot/CHANGELOG.md for next version 2020-06-02 10:32:40 -07:00
Brad Warren
4938273e0f Release 1.5.0 2020-06-02 10:32:38 -07:00
Brad Warren
466b4fbf71 Update changelog for 1.5.0 release 2020-06-02 10:12:33 -07:00
Brad Warren
95ae5f69f5 Make rebuild_dependencies.py executable (#8039) 2020-06-02 19:11:47 +02:00
ohemorange
2acc1dcc89 Fix TLS-ALPN tests with newer versions of OpenSSL (#8026)
Fixes #7988. As described there, the steps involved are:

1. Update our tests so they fail due to this problem.
2. Update the keys used in the tests so they pass with the new changes.

For 1, see a [failing travis run](https://travis-ci.com/github/certbot/certbot/jobs/340710511) with the included change. And for the full output to confirm that this is what is failing, see a [run on debian 10](https://github.com/certbot/certbot/files/4692350/debian_run_log.txt).

This PR adds `rsa4096_key.pem` and `rsa4096_cert.pem`, updates the `TLS-ALPN` test to use those keys in place of the 1024-bit versions, and fixes the README in that `testdata` folder with correct instructions to generate these files.

* export PIP_NO_BINARY in pip install subshell in test_sdists.sh

* set environment variable on the line that installs most packages

* Generate 4096-bit rsa key and cert, and fix README instructions to do so.

* Update TLS_ALPN test to use 4096-bit key instead of 1024-bit key.

* Update changelog

* Older versions of Python have an error when both VIRTUAL_NO_DOWNLOAD and PIP_NO_BINARY are set, so only apply the latter at the install phase.

* Add enum34 constraint manually, since rebuild_dependencies.py seems to be broken.

* only delete key if it exists

* Check OpenSSL version before trying to set PIP_NO_BINARY

* Add comment explaining why we only set PIP_NO_BINARY at the install step
2020-06-01 15:18:38 -07:00
Brad Warren
fa55b468c8 Revert "Upgrade pinned certbot dependencies (#8012)" (#8035)
This reverts commit 6b97ac3344.
2020-06-01 20:17:26 +02:00
Brad Warren
cd27dcc32c Add the content interface to Certbot (#8009)
* Add the content interface to Certbot

This commit contains a subset of the changes from 7076a55fd82116d068e2aca7239209b7203917d2.

* Normalise slot parameters

(cherry picked from commit 810941979bcf609c1e0be18e9263abf046b90e82)

Co-authored-by: Robie Basak <robie.basak@canonical.com>
2020-05-27 13:59:08 -07:00
Adrien Ferrand
6b97ac3344 Upgrade pinned certbot dependencies (#8012)
* Upgrade certbot dependencies

* Rebuild letsencrypt-auto
2020-05-26 15:19:10 -07:00
ohemorange
332def46da Require explicit confirmation of snap plugin permissions before connecting (#8013)
Fixes #7667.

Implements the plan described in #7667.

Here's a terminal log showing that it does so:

```
# sudo snap connect certbot:plugin certbot-dns-dnsimple
error: cannot perform the following tasks:
- Run hook prepare-plug-plugin of snap "certbot" (run hook "prepare-plug-plugin": 
-----
Only connect this interface if you trust the plugin author to have root on the system
Run `snap set certbot trust-plugin-with-root=ok` to acknowledge this and then run this command again to perform the connection
-----)
# snap set certbot trust-plugin-with-root=ok
# sudo snap connect certbot:plugin certbot-dns-dnsimple
# sudo snap disconnect certbot:plugin certbot-dns-dnsimple:certbot
# sudo snap connect certbot:plugin certbot-dns-dnsimple
error: cannot perform the following tasks:
- Run hook prepare-plug-plugin of snap "certbot" (run hook "prepare-plug-plugin": 
-----
Only connect this interface if you trust the plugin author to have root on the system
Run `snap set certbot trust-plugin-with-root=ok` to acknowledge this and then run this command again to perform the connection
-----)
```

* Add plugin connection hook to accept root trust

* snapctl requires a configure hook to set options

* Add sh notice

* Update changelog
2020-05-26 12:02:33 -07:00
Adrien Ferrand
b42e24178a Consistent directory mode apply when makedirs is called (#8010)
Fixes #7993 

This PR uses `os.umask()` during the `certbot.compat.filesystem.makedirs()` call to ensure that all created directories, and not only the leaf one, receive the provided `mode`. This ensures safe and consistent behavior independently of the Python version, since the behavior of `os.makedirs` changed in that regard in Python 3.7.
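A simplified sketch of the idea, assuming standard POSIX umask semantics (error handling and the Windows-specific parts of the real `certbot.compat.filesystem.makedirs()` are omitted):

```python
import os

def makedirs(path, mode=0o777):
    """Create `path` and any missing parents so that all of them receive `mode`."""
    previous_umask = os.umask(0o777 & ~mode)  # mask out every bit not present in `mode`
    try:
        os.makedirs(path, mode=mode)          # intermediate directories now also get `mode`
    finally:
        os.umask(previous_umask)              # always restore the process umask
```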

* Implement logic to apply the same permission on all dirs created by makedirs

* Add a test

* Add comment

* Update certbot/certbot/compat/filesystem.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-05-21 15:29:06 -07:00
Alex Zorin
8a15bd7927 renewal: disregard acme-v01 in renewal configs
Fixes #7979
2020-05-21 23:01:53 +10:00
ohemorange
3ea5170647 Error out earlier in apache installer when mod_ssl is not available (#7984)
* Error out in apache installer when mod_ssl is not available

* Update to MisconfigurationError and add/fix tests

* Remove error cases we no longer hit and associated test

* mock out function to have consistent error across machines

* improve changelog message

* only check key in modules list, not value
2020-05-19 15:34:21 -07:00
schoen
0b53c0d476 Merge pull request #7952 from ntkme/allow-empty-existing-dir
Allow existing but empty archive and live dir to be used when creating new lineage
2020-05-19 15:25:32 -07:00
Brad Warren
4eb9a71a4c remove quay cruft (#8003)
Our README still has links to our old quay.io builds, which we shut down a while ago. See #4343. This PR simply removes the old stray links.
2020-05-19 14:29:28 -07:00
Brad Warren
96e003d1a3 mention python3-venv in docs (#8006)
The error message from `python3 -m venv` when you don't have `python3-venv` installed is pretty good, but let's skip the failure and make sure it is installed the first time.
2020-05-19 14:28:41 -07:00
schoen
7a7c6737cc Merge pull request #8000 from certbot/make-exit-message-red
Print cause of exit in red text
2020-05-18 17:13:03 -07:00
Brad Warren
0e59c6ba1b handle more cases 2020-05-18 10:17:32 -07:00
Brad Warren
d230dcafeb Print cause of exit in red text. 2020-05-18 09:31:15 -07:00
alexzorin
bcf33c6659 ocsp: add support for public key hash ResponderIDs (#7989)
For both cases, where the response is signed by the issuer or by a delegated OCSP signer.

Resolves #7986
2020-05-13 15:55:35 -07:00
Brad Warren
71e3d82e47 Remove passthrough because it's no longer needed (#7956) 2020-05-06 14:28:15 -07:00
Adrien Ferrand
bb6a660b21 Platform-agnostic method to copy owner and permissions between files (#7968)
Related to #7649 since @joohoi needs a method to copy owner and permissions together from a source file to a destination file.

This PR creates the method `copy_ownership_and_mode()` in the `certbot.compat.filesystem` module to achieve this goal. Its behavior is consistent across Linux and Windows with respect to the security model that has been defined for Certbot on Windows.

The method behaves essentially the same as `copy_ownership_and_apply_mode`, but this time the permissions are extracted from the source file. For Windows this means that the DACL is copied from the source to the destination with the same content.
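A rough POSIX-only sketch of what such a copy looks like on Linux (a simplification; the actual `certbot.compat.filesystem.copy_ownership_and_mode()` also handles the Windows DACL copy described above):

```python
import os
import stat

def copy_ownership_and_mode(src, dst):
    """Copy owner, group, and permission bits from src to dst (POSIX only)."""
    source_stat = os.stat(src)
    os.chown(dst, source_stat.st_uid, source_stat.st_gid)  # copy owner and group
    os.chmod(dst, stat.S_IMODE(source_stat.st_mode))        # copy permission bits
```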

* Create copy_ownership_and_mode to copy both owner and mode from src to dst

* Update certbot/tests/compat/filesystem_test.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Fix docstring

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-05-06 12:33:50 -07:00
Brad Warren
d8e9f558c2 Release version v1.4.0-2 2020-05-05 17:11:24 -07:00
Adrien Ferrand
3a997a5631 Install pipstrap to pin setuptools/pip/wheels, since setuptools and pip continues to play with us. (#23)
So, setuptools broke the installation setup, by removing a deprecated API that is still used by some of our dependencies (see pypa/setuptools#2017)

This PR fixes the Docker build by using pipstrap to pin pip/setuptools/wheel, as is done in several other critical places (certbot-auto, ...).

An issue in certbot is opened to fix more generally the problem in most recent versions of setuptools: certbot/certbot#7976

I rebuilt all the Docker images locally (certbot + DNS plugins) for the three architectures, and all passed.
2020-05-05 17:10:51 -07:00
Brad Warren
c35054fd11 Merge pull request #7974 from certbot/candidate-1.4.0
Release 1.4.0
2020-05-05 17:06:31 -07:00
Erica Portnoy
361d1f732e Release version v1.4.0 2020-05-05 14:45:57 -07:00
Erica Portnoy
b224b49986 Bump version to 1.5.0 2020-05-05 13:44:23 -07:00
Erica Portnoy
1a72cdecf2 Add contents to certbot/CHANGELOG.md for next version 2020-05-05 13:44:23 -07:00
Erica Portnoy
5586ae071a Release 1.4.0 2020-05-05 13:44:21 -07:00
Erica Portnoy
ca60ad52b9 Update changelog for 1.4.0 release 2020-05-05 12:37:33 -07:00
Brad Warren
9154e7965f drop min certbot coverage (#7972)
`tox -e cover` fails for me on macOS. This is due to differences in the code that runs on Linux vs. other platforms in `certbot.util` and its tests. Diffing my local coverage against Travis, the only difference in lines missing test coverage is:
```
$ diff travis.txt local.txt
< certbot/certbot/util.py                                        238     24    90%   132-134, 210, 275-281, 318, 330-340, 347, 381-382
> certbot/certbot/util.py                                        239     34    86%   30, 132-134, 210, 275-281, 301, 305, 316-318, 330-340, 347, 367-373, 381-382
< certbot/tests/util_test.py                                     392      0   100%
> certbot/tests/util_test.py                                     375     26    93%   483-487, 492-502, 507-514, 548-550, 555-557
```
I think tests on `master` should not be failing locally for people.

While there would be other ways to fix this by adding `# pragma: no cover` lines or writing mocked out tests for other platforms, I personally just think dropping the minimum coverage one percentage point is fine at least for now.
2020-05-05 09:38:20 -07:00
Brad Warren
ac2d691ade Add warning about ignoring our own warnings (#7971)
Coming out of the conversation at #7863 in the linked Google Doc, we should always have at least 1 release between updating one of our plugins to stop using a deprecated acme/certbot API and removing it from acme/certbot. Doing this gives the plugin changes time to propagate rather than potentially having the plugin break because Certbot was updated before the plugin had made the necessary changes.

This comment here should help ensure this.

* Add pytest warnings warning.

* clarify comment
2020-05-04 16:54:09 -07:00
schoen
5536c91223 Merge pull request #7938 from taixx046/1618-awkward-language-during-email-problems
Fixed #1618 awkward language during email problems
2020-04-27 18:33:07 -07:00
Adrien Ferrand
9cbb13ef04 Run hooks with Powershell on Windows (#7800)
Fixes #7713.

As discussed in #7713, providing a PowerShell script as a hook for Certbot does not currently work. This is because hooks are run in a `cmd` environment, which recognizes only `.bat` files as valid scripts that can be run by their bare name on the command line.

On the other hand, PowerShell recognizes both `.bat` and `.ps1` scripts as valid scripts.

This PR makes hook commands be executed by PowerShell instead of `cmd`, which `Popen` uses by default when `shell=True` is set. It also modifies the tests to handle this new environment, in particular in terms of encoding (UTF-16-LE is the default in PowerShell).
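A minimal sketch of the kind of change described, assuming a simplified hook runner (the actual hook execution code in Certbot differs in how it captures and logs output):

```python
import subprocess

def execute_hook(shell_cmd):
    """Run a hook through PowerShell rather than cmd.exe (the shell=True default)."""
    proc = subprocess.Popen(
        ['powershell.exe', '-Command', shell_cmd],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,
    )
    out, err = proc.communicate()
    return proc.returncode, out, err
```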

* Run hooks in powershell on Windows

* Fix hook test

* Fallback to unittest.mock

* In fact, shell_cmd as a list of str could not work. Declare only str as acceptable input for shell_cmd.

* Added changelog
2020-04-27 09:38:30 -07:00
なつき
db6de76b11 Allow existing but empty archive and live dir 2020-04-25 01:47:30 -07:00
Brad Warren
01dc981a09 Merge pull request #7948 from certbot/snap-build-squashed
Despite this PR (only) being ~200 lines containing mostly code copied from another repo, there is a lot going on here. For the sake of making it both easier to review and to remember some of these things in the future by referring back to this PR, I've documented a lot of noteworthy things with section headers below. With that said, it's probably not necessary to read each section unless you're interested in that topic.

The most noteworthy thing for the reviewer is **this PR should be merged and not squashed** to preserve authorship. To merge this code, once we're happy with this PR, I'll probably open a new PR squashing any commits I make in response in review comments back into a single commit to try to keep history somewhat clean. To help prevent this PR from being accidentally squashed, I'm making this a draft PR for now.

### Git history of https://github.com/basak/certbot-snap-build

I think it is worth preserving the git history of https://github.com/basak/certbot-snap-build that this PR is based on in this repo to help us track why things were done a certain way. To do this while keeping our git history somewhat clean, I took the approach described at https://stackoverflow.com/questions/1425892/how-do-you-merge-two-git-repositories/21495718#21495718 to move all history of https://github.com/basak/certbot-snap-build into a `snap` directory. I then squashed all commits so that sequential commits from the same author are one commit. I probably could have reordered commits to try and squash things a little more, but I personally don't think it's worth the trouble. Finally, I merged this rewritten history into this branch of the Certbot repo.

The contents of the `snap` directory are identical to the current contents of https://github.com/basak/certbot-snap-build before my final commit in this PR which makes the changes to make things work in this repo.

### Travis stages

This is described in general at https://docs.travis-ci.com/user/build-stages/, but I don't think we should deploy the snap if any of our tests are failing. To accomplish this, I created a "Snap" stage that builds, tests, and deploys the snap which is only executed after a "Test" stage that contains all of our other tests. The "Snap" stage will not run until the "Test" stage completes successfully.

### snap/local

This directory is ignored by `snapcraft` which I think makes it a good place to store `snap` specific scripts like `build_and_install.sh`.

See https://bugs.launchpad.net/snapcraft/+bug/1792203 for more info.

### Why remove certbot-compatibility-test from apacheconftest toxenvs?

Because it's not used. In theory, it could go in its own PR, but it'll create merge conflicts with this one so I'd personally prefer to include this simple change in this PR as well.

### Checklist for landing this PR

- [x] Squash all of my commits into one commit
- [x] Update the release instructions to have to move the snap to the beta channel
- [x] Shut down Robie's nightly builds probably by updating his repo to say that the code has moved here and deleting everything
2020-04-24 14:13:09 -07:00
Brad Warren
335894ab3b Merge snap code into the Certbot repo
* merge .gitignore

* Move snapcraft.yml up one level.

* update source

* move test.sh to tox.ini

* use new tox.ini in .travis.yml

* move snap build code

* make script executable

* remove unused python3-dev

* don't use deprecated classic flag

* go back to stable channel

* add nginx in snap addons

* add deploy steps

* Add comments explaining external tox envs.

* error if not in CI

* don't use --depth

* remove old .travis.yml

* Add big comment about SNAP_TOKEN.

* Set all_branches: true.

* Add repo setting.

* run travis on tags

* Add more documenting comments to .travis.yml.
2020-04-24 13:47:36 -07:00
Brad Warren
4cce3458f3 Avoid deleting the workspace twice. (#7923) 2020-04-23 23:29:16 +02:00
Brad Warren
d71d3a1144 dont use venv.py (#7936) 2020-04-23 23:27:05 +02:00
Brad Warren
b06dacdfb5 Add --chain-path to --help install output. (#7937) 2020-04-23 23:26:34 +02:00
taixx046
e5cde2c598 updated changelog.md 2020-04-23 15:29:38 -05:00
taixx046
84b6c3cebb reorganized error message when a user entered an invalid email address 2020-04-23 15:20:32 -05:00
ohemorange
a06d5ac7a1 Deprecate certbot-auto on Gentoo, macOS, and FreeBSD (#7926)
* Deprecate certbot-auto on Gentoo

* Deprecate certbot-auto on macOS

* Deprecate certbot-auto on FreeBSD

* build le-auto

* Update Changelog

* Deprecate certbot-auto on Gentoo, FreeBSD, and macOS
2020-04-23 12:49:35 -07:00
Brad Warren
9d94c6c5ef Remove useless before_install lines (#7920) 2020-04-22 13:10:13 -07:00
Brad Warren
ed9648b4a3 Add snap instructions (#7889)
Part of #7671.

* Add snap instructions.

* Mention Linux+systemd
2020-04-22 11:30:33 -07:00
Adrien Ferrand
2bcabe6626 Fix certbot part build in snap
* Declare properly source-subdir to build the certbot part

* Use snapcraft 3.10+, remove the custom python plugin
2020-04-21 16:36:49 -07:00
Brad Warren
c12baf7d8c Fix path to apache-conf-test 2020-04-21 16:36:49 -07:00
Sergio Schvezov
193b44a0fa Snap plugin
* snap: move snapcraft.yaml to snap directory

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>

* snap: use a local plugin to get around the delivered plugin

Add a plugin to the project which behaves as expected until a version
of snapcraft satisfies the project needs.

Additional snapcraft.yaml changes were made to accommodate for the snap
to build.

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>

* snap: compile pycache in the last step for the last part

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>
2020-04-21 16:36:49 -07:00
Robie Basak
b69f5588f4 Continued improvements
* Remove legacy Store upload credentials

These have not been needed since 5d7969a.

* Work around dev part dependency failure case

Get pip to install certbot from its VCS repository directly during the
build of the nginx and apache plugin parts.

This works around issue #12 when it affects the interaction between the
apache or nginx plugin and certbot itself.

It does not work around the case where the same problem occurs in the
interaction between certbot and acme. This looks harder to work around
because pip's VCS URL handling doesn't appear to include a facility to
install from a subdirectory of a git repository and this is where the
acme source is located.

* Switch to using lxd for the snapcraft run

The Docker image is 16.04 only. Before we can switch to 18.04, we need
to remove Docker, which means using the snapcraft snap using Travis' new
snap support, and then using the lxd functionality in snapcraft so that
it is enabled to build in the appropriate environment.

* Switch build to use core 18

Fixes: #14

* Move constraints into a list

This seems to be a requirement of either newer snapcraft, snapcraft
in lxd, or the move to core18 (it isn't clear to me which). This fixes
the error:

Failed to load plugin: properties failed to load for certbot: The 'constraints' property does not match the required schema: '$SNAPCRAFT_PART_SRC/constraints.txt' is not of type 'array'

* version-script -> snapcraftctl set-version

The use of version-script seems to break with either newer snapcraft,
snapcraft with lxd or core18 (it's not clear to me which). The breakage
is related to "parts/certbot/src" not being found. This can be fixed
with $SNAPCRAFT_PART_SRC, but this doesn't seem to be defined at
"version-script time".

However https://snapcraft.io/docs/deprecation-notices/dn10 deprecates
the use of version-script, and if we convert to the recommended new way,
then we use override-pull instead and $SNAPCRAFT_PART_SRC is defined
there, so this conveniently fixes both problems at once.

* Do not explicitly install snapd

Since this is now handled by the Travis addon, we do not need to do it
explicitly.
2020-04-21 16:36:49 -07:00
Adrien Ferrand
2c622944dd Various optimizations part 2
* Revert logic of building against a tag
* Fix schema, add nginx
* Update snapcraft.yaml
* Update snapcraft.yaml
* Update snapcraft.yaml
* Update test.sh
* Update test.sh
* Update test.sh
* Update test.sh
* Config tests
* Add an apache test
* Relaunch CI
* Clean config
* Install venv
* Decompose steps
* Update test.sh
* Use virtual environment
* Update python-augeas
* Add fork python-augeas
* Update .travis.yml
* Exclusion rule
* Try with after
2020-04-21 16:36:49 -07:00
Robie Basak
e0b72d9a62 Travis improvements
* Add Travis notifications

* Adjust automatic snap deployment configuration

Travis now has a documented[1] "snap" provider and the previous
experimental mechanism seems to have stopped working, presumably because
it was deprecated in favour of this new mechanism.

[1] https://docs.travis-ci.com/user/deployment/snaps/
2020-04-21 16:36:49 -07:00
Adrien Ferrand
d63be466a8 Various optimizations part 1
* Configure for python3
* Update tests
* Use appropriate virtualenv
* Install nginx for the integration tests
* Try use LD_LIBRARY_PATH to find augeas shared library in snap when python-augeas is invoked
* Update travis to use build-in setup capabilities
* Update .travis.yml
* Add acme build
* Update tests
* Try more recent dist
* Update command
* Clean tests
* Add back augeas
* Add env
* Revert to last working snapcraft config
* Add a gitignore
* Reintegrate acme. Declare augeas in certbot parts
* Use release version of certbot
* Try new approach
* Fix config
* Directly install version of python-augeas from pypi
* Restart from basic
* Clone only once certbot repository. Use pinned versions of dependencies from certbot-auto.
* Try relatively to source
* Use snapcraft env variables
* Strip hashes
* Fix path
* Redefine path
* Continue to prepare the runtime
* Fix command line
* Update .travis.yml
* Add back certbot-apache
* Update snapcraft.yaml
* Build snap against the latest release of certbot
2020-04-21 16:36:49 -07:00
Robie Basak
0f6486ec7f Initial commit
* Add renewal timer

* Install libaugeas0 in python-augeas part build

This part needs libaugeas0 to build.

* Bump to 0.26.1

* Always act directly on upstream master

I want to keep this always working, so move to master. We can
reintroduce upstream stable releases when we are ready for general use.

Closes: #5

That particular issue seems to no longer happen. Presumably something
changed in upstream git or in PyPI. If it happens again, hopefully I'll
have CI against upstream master up by then and I'll be able to pin it
down.

* Add empty Travis build

* Add Travis automatic snap edge publication

* Add integration test

This uses upstream's test suite from their source tree to check the
built snap to make sure it behaves as expected, before attempting upload
to the store.

* Point Augeas to its lens library

Augeas defaults to looking in /usr/share/augeas/lenses, which in a snap
isn't found at this path, but inside $SNAP. So set AUGEAS_LENS_LIB to
where the lenses can be found within the snap.

This fixes the Apache plugin that uses Augeas.
2020-04-21 16:36:49 -07:00
Brad Warren
06e68cce44 flip condition (#7927)
In #7925 we accidentally changed the logic here. Before it was:

type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))

now it's

type = cron OR (type = push AND branch = master)

We want to be able to run our full test suite on things like test-* branches. The reason we had been excluding master is that it has the full test suite run on it nightly (through what Travis calls cron), and not running on every commit helps prevent us from waiting on CI since our nightly tests spin up so many jobs.

This PR changes things back to the intended behavior.

(We could talk about changing the condition to just type = cron OR type = push if we want, but I'd rather do that in a separate PR once things like test- branches are fixed.)
2020-04-21 16:34:50 -07:00
ohemorange
eed45827ad Remove references to the apache-parser-v2 branch (#7925)
Fixes #7786.
2020-04-21 13:06:30 -07:00
ohemorange
751d836746 Update using.rst to accurately describe acmev2 behavior (#7924)
Fixes #7268

I removed the reference to automatically selecting which ACME protocol we use, since at some point we'll want to rip out the non-spec-compliant ACMEv1 code.
2020-04-21 12:47:59 -07:00
Brad Warren
6693e87500 Update dev docs to reflect Python 2 EOL. (#7914)
Python 2 is going to get harder and harder to install locally so I don't think we should assume/require devs to have it installed.

This PR builds on #7905 so our developer guide only has people use Python 3.
2020-04-21 10:27:48 -07:00
schoen
08cea381c8 Merge pull request #7917 from certbot/fedora31
Test on fedora 31
2020-04-20 19:36:24 -07:00
schoen
84b5c571c0 Merge pull request #7916 from certbot/ubuntu-1910
Test on Ubuntu 19.10
2020-04-20 19:35:41 -07:00
schoen
0f35836deb Merge pull request #7919 from certbot/rate-limits-url
Update rate limits url
2020-04-20 19:29:18 -07:00
Brad Warren
5b749ff8f7 Use Python 3 in the release script. (#7918)
Fixes #7902.
2020-04-20 14:44:53 -07:00
Brad Warren
41306e1e37 update rate limits url 2020-04-20 09:20:31 -07:00
Brad Warren
864ea08341 test on fedora 31 2020-04-16 15:00:28 -07:00
Brad Warren
74eea40905 test on ubuntu 19.10 2020-04-16 14:57:36 -07:00
Brad Warren
859dc38cb9 Consolidate cover envs and default to py3-cover (#7905)
* Consolidate cover envs and default to py3-cover

* use py38 for code coverage in Travis

* Disable coverage on Python < 3.6 line.
2020-04-16 08:59:40 -07:00
April King
f66314926a Update URL for Mozilla SSL Configuration Generator (#7912) 2020-04-15 13:54:17 -07:00
ohemorange
9c345ac301 Do not require mock in Python 3 in certbot-dns modules (#7900)
Part of #7886.

This PR conditionally installs `mock` in `certbot-dns-*/setup.py` based on setuptools version and python version, when possible. It then updates the tests to use `unittest.mock` when `mock` isn't available.

* Do not require mock in Python 3 in certbot-dns modules

* update changelog

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:54:44 -07:00
ohemorange
af21d1d56e Do not require mock in Python 3 in certbot-compatibility-test module (#7899)
* Do not require mock in Python 3 in certbot-compatibility-test module

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:40:03 -07:00
ohemorange
49912732ac Do not require mock in Python 3 in nginx module (#7898)
* Do not require mock in Python 3 in nginx module

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:39:44 -07:00
ohemorange
8fb9a395ab Do not require mock in Python 3 in apache module (#7896)
Part of #7886.

This PR conditionally installs mock in `apache/setup.py` based on setuptools version and python version, when possible. It then updates `apache` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in apache

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools

* use unittest.mock when third-party mock isn't available in apache, with no cover and type ignore
2020-04-15 11:30:08 -07:00
ohemorange
35fb99b86f Do not require mock in Python 3 in certbot module (#7911)
This PR is exactly the same as #7895, but now we know a little bit more about what was going on with `mypy`.

Part of #7886.

This PR conditionally installs mock in `certbot/setup.py` based on setuptools version and python version, when possible. It then updates `certbot` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in certbot

* use unittest.mock when third-party mock isn't available in certbot

* Add type:ignores because of https://github.com/python/mypy/issues/1153

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools
2020-04-15 11:28:47 -07:00
ohemorange
127d2dc307 Do not require mock in Python 3 in acme module (#7910)
Part of #7886.

This PR conditionally installs mock in `acme/setup.py` based on setuptools version and python version, when possible. It then updates `acme` tests to use `unittest.mock` when `mock` isn't available.

Now with `type: ignore` as appropriate. Once the "future steps" of #7886 are finished, and mypy is on Python 3, the `pragma no cover`s and `type ignore`s will be gone.

* Conditionally install mock in acme

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools

* use unittest.mock when third-party mock isn't available in acme, with no cover and type ignore
2020-04-15 11:27:55 -07:00
Brad Warren
569df2d37a Remove Ubuntu 19.04 tests. (#7906)
This PR fixes the Travis failures that can be seen https://travis-ci.com/certbot/certbot/builds/160258644. Running the tests locally, it looks like Ubuntu has started shutting down the 19.04 repos which makes sense as this release has been EOL'd. See https://wiki.ubuntu.com/Releases.

I have the full suite including the test farm tests running at https://travis-ci.com/github/certbot/certbot/builds/160269969 with this change.

The issue of adding 19.10 to our test farm tests is tracked by #7851. I think that issue is important and it's in our current milestone, but I'd personally rather get our tests passing for now and try to expand them to run on other systems later.
2020-04-14 17:01:59 -07:00
ohemorange
ff732bf975 Revert the last two mock PRs (#7903)
* Revert "Do not require mock in Python 3 in certbot module (#7895)"

This reverts commit 77871ba71c.

* Revert "Do not require mock in Python 3 in acme module (#7894)"

This reverts commit cd0acf5dcc.
2020-04-13 17:09:24 -07:00
ohemorange
77871ba71c Do not require mock in Python 3 in certbot module (#7895)
Part of #7886.

This PR conditionally installs mock in `certbot/setup.py` based on setuptools version and python version, when possible. It then updates `certbot` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in certbot

* use unittest.mock when third-party mock isn't available in certbot

* Add type:ignores because of https://github.com/python/mypy/issues/1153

* error when trying to build wheels with old setuptools
2020-04-13 14:33:23 -07:00
ohemorange
cd0acf5dcc Do not require mock in Python 3 in acme module (#7894)
Part of #7886.

This PR conditionally installs mock in acme/setup.py based on setuptools version and python version, when possible. It then updates acme tests to use unittest.mock when mock isn't available.

* Conditionally install mock in acme

* use unittest.mock when third-party mock isn't available in acme

* error when trying to build wheels with old setuptools
2020-04-13 14:32:22 -07:00
Karan Suthar
8e4dc0a48c Minor bugfixes (#7891)
* Fix dangerous default argument

* Remove unused imports

* Remove unnecessary comprehension

* Use literal syntax to create data structure

* Use literal syntax instead of function calls to create data structure

Co-authored-by: deepsource-autofix[bot] <62050782+deepsource-autofix[bot]@users.noreply.github.com>
2020-04-13 10:41:39 -07:00
ohemorange
316e4640f8 Upgrade the test farm tests to use Python 3 (#7876)
Fixes #7857.

* stop using urllib2 in test farm tests

* use six for urllib instead

* remove fabric lcd usage

* correct lcd removal

* remove fabric cd

* convert some remote calls to v2

* move more cxns to v2

* get run working with prefix

* get sudo commands working

* remove final fabric v1 references including local

* update requirements and README

* add new venv to gitignore

* update version used in travis

* remove deploy_script unused kwargs

* fix killboulder implementation so I can test creating a new boulder server

* hardcode the gopath due to broken env management in fabric2

* Update letstest readme

* move the comment about hardcoding the gopath

* catch BaseException instead of Exception

* work around fabric #2007

* use connections as context managers to ensure they're closed

* remove reference to virtualenv
2020-04-09 14:35:47 -07:00
inejge
537bee0994 Add minimal proxy support for OCSP verification (#7892)
Translate a proxy specified by an environment variable ("http_proxy"
or "HTTP_PROXY") into options recognized by "openssl ocsp". Support
is limited to HTTP proxies which don't require authentication.

Fixes #6150
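
A minimal sketch of the kind of translation described above, assuming the common technique of pointing `openssl ocsp`'s `-host` at the proxy and passing the full responder URL via `-path`; the function name here is illustrative and the exact flags Certbot builds may differ:

```python
import os
from urllib.parse import urlparse


def proxy_args_for_openssl_ocsp(ocsp_url: str) -> list:
    """Translate http_proxy/HTTP_PROXY into extra arguments for `openssl ocsp`.

    Sketch only: connect to the proxy with -host and request the full OCSP
    responder URL via -path. Authenticated proxies are not supported.
    """
    proxy = os.environ.get("http_proxy") or os.environ.get("HTTP_PROXY")
    if not proxy:
        return []
    parsed = urlparse(proxy)
    return ["-host", "{0}:{1}".format(parsed.hostname, parsed.port or 80),
            "-path", ocsp_url]
```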
2020-04-09 11:25:39 -07:00
alexzorin
e9895d2ec6 Fix fullchain parsing for CRLF chains (#7877)
Fixes #7875 .

After [this comment](https://github.com/certbot/certbot/issues/7875#issuecomment-608145208) and evaluating the options, I opted to go with `stricttextualmsg`, as required by RFC 8555. Reasoning is that the ACME v1 code path (via OpenSSL) produces a `fullchain_pem` which satisfies `stricttextualmsg`, so we don't need to be more generous than that.

One downside of the `re` approach is that it doesn't seem capable of capturing repeating group matches. As a result, it matches each certificate individually, silently passing over any data in between the encapsulation boundaries, such as explanatory text, which is prohibited by RFC 8555.

It would be ideal to raise an error when encountering such a non-conformant chain, but we'd need to create a mini-parser to do it, I think.
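
A rough sketch of the two-pass idea described in the commit notes below (a generous first pass to find each certificate's boundaries, then normalization by parsing and re-encoding); this uses the `cryptography` library for illustration and is not Certbot's exact regex or code:

```python
import re
from typing import List

from cryptography import x509
from cryptography.hazmat.primitives.serialization import Encoding

# First pass: be generous about line endings and anything between the
# encapsulation boundaries; just find each certificate's span.
_CERT_BLOCK = re.compile(
    b"-----BEGIN CERTIFICATE-----.+?-----END CERTIFICATE-----",
    re.DOTALL,
)


def normalize_fullchain(fullchain_pem: bytes) -> List[bytes]:
    # Second pass: parse and re-encode each block so the output always uses
    # strict PEM formatting, regardless of CRLF or stray text in the input.
    return [
        x509.load_pem_x509_certificate(match.group(0)).public_bytes(Encoding.PEM)
        for match in _CERT_BLOCK.finditer(fullchain_pem)
    ]
```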

* Fix fullchain parsing for CRLF chains.

fullchain parsing now works in two passes:

1. A first pass which is generous with what it accepts - basically
   preeb(CERTIFICATE)+anything+posteb(CERTIFICATE). This determines
   the boundaries for each certificate.
2. A second pass which normalizes (by parsing and re-encoding) each
   certificate found in the first pass.

* typo in docstring

* remove redundant group in regex

* can't use assertRaisesRegex until py27 is gone
2020-04-07 16:19:13 -07:00
Brad Warren
5992d521e2 Print boulder logs when boulder setup fails (#7885)
This is part of https://github.com/certbot/certbot/issues/7303.

* Print boulder logs if boulder fails to start

* Print description and fix command.

* Change output to stderr.
2020-04-07 15:26:19 -07:00
alexzorin
4ca86d9482 acme: socket timeout for HTTP standalone servers (#7388)
* acme: socket timeout for HTTP standalone servers

Adds a default 30 second timeout to the StreamRequestHandler for clients
connecting to standalone HTTP-01 servers. This should prevent most cases
of an idle client connection from preventing the standalone server from
shutting down.

Fixes #7386
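
A minimal sketch of the mechanism (not the acme module's actual class names): socketserver-based handlers honor a class-level `timeout` attribute, which `setup()` applies to the client socket, so an idle connection eventually times out instead of blocking shutdown.

```python
import http.server


class TimeoutHTTP01Handler(http.server.BaseHTTPRequestHandler):
    # socketserver.StreamRequestHandler.setup() applies this class-level
    # timeout to the client socket, so an idle connection is dropped instead
    # of keeping the standalone server from shutting down.
    timeout = 30

    def do_GET(self):
        # Placeholder response; a real HTTP-01 server returns the key
        # authorization for the requested token.
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"placeholder key authorization")

# Usage sketch (port chosen for illustration only):
# http.server.ThreadingHTTPServer(("", 5002), TimeoutHTTP01Handler).serve_forever()
```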

* use idiomatic kwargs default value

* move HTTP01Server lower to fix mypy forward ref.

* fix test crash on macOS due to socket double-close

* maybe its not an OSError?

* disable coverage check on useless branch
2020-04-01 23:53:58 +02:00
Adrien Ferrand
bc3088121b Add a step to check powershell version in vs2017-win2016 (#7870)
Following discussion at https://github.com/certbot/certbot/pull/7539#issuecomment-572318805, this PR adds a check for Powershell version: we expect that the `vs2017-win2016` node that will test the installer has Powershell 5.x, and nothing else.

This ensures that at least one node of the pipeline is testing the installer with the lowest Powershell version supported by Certbot.

One full pipeline success can be seen here: https://dev.azure.com/adferrand/certbot/_build/results?buildId=713

I also created, on purpose, a failing pipeline that checks that Powershell 6.x is installed. Its result can be seen here: https://dev.azure.com/adferrand/certbot/_build/results?buildId=714
2020-03-31 13:12:42 -07:00
ohemorange
dbda499b08 Remove interactive redirect ask (#7861)
Fixes #7594.

Removes the code asking interactively if the user would like to add a redirect.

* Remove interactive redirect ask

* display.enhancements is no longer used, so remove it.

* update changelog

* remove references to removed display.enhancements

* add redirect_default flag to enhance_config to conditionally set default for redirect value

* Update default in help text.
2020-03-31 12:12:14 -07:00
Brad Warren
6df90d17ae Wait 5 minutes for boulder to start. (#7864) 2020-03-25 14:50:13 -07:00
alexzorin
e4a0edc7af Add a 10-second timeout to OCSP queries. (#7860)
* Add a 10-second timeout to OCSP queries.

Closes #7859

* Update CHANGELOG

* Fix test
2020-03-24 15:02:53 +01:00
m0namon
1285297b23 [Apache v2] Load apacheconfig tree and gate related tests (#7710)
* Load apacheconfig dependency, gate behind flag

* Bump apacheconfig dependency to latest version and install dev version of apache for coverage tests

* Move augeasnode_test tests to more generic parsernode_test

* Revert "Move augeasnode_test tests to more generic parsernode_test"

This reverts commit 6bb986ef786b9d68bb72776bde66e6572cf505a9.

* Mock AugeasNode into DualNode's place, and run augeasnode tests exclusively on AugeasNode

* Don't calculate coverage for skeleton functions

* clean up helper function in augeasnode_test
2020-03-23 17:05:22 -07:00
ohemorange
9e3c348dff Disable TLS session tickets in Apache (#7771)
Fixes #7350.

This PR changes the parsed modules from a `set` to a `dict`, with the filepath argument as the value. Accordingly, after calling `enable_mod` to enable `ssl_module`, modules now need to be re-parsed, so call `reset_modules`.
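
A small illustration of that data-structure change (the paths are hypothetical; only the set-to-dict shape comes from this PR):

```python
# Before: only the names of loaded Apache modules were tracked.
parsed_modules_before = {"ssl_module", "rewrite_module"}

# After: each module name maps to the file path it was loaded from,
# which may be None when no path could be determined.
parsed_modules_after = {
    "ssl_module": "/etc/apache2/mods-enabled/ssl.load",  # hypothetical path
    "rewrite_module": None,
}
```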

* Add mechanism for selecting apache config file, based on work done in #7191.

* Check OpenSSL version

* Remove os imports

* debian override still needs os

* Reformat remaining apache tests with modules dict syntax

* Clean up more apache tests

* Switch from property to method for openssl and add tests for coverage.

* Sometimes the dict location will be None in which case we should in fact return None

* warn thoroughly and consistently in openssl_version function

* update tests for new warnings

* read file as bytes, and factor out the open for testing

* normalize ssl_module_location path to account for being relative to server root

* Use byte literals in a python 2 and 3 compatible way

* string does need to be a literal

* patch builtins open

* add debug, remove space

* Add test to check if OpenSSL detection is working on different systems

* fix relative test location for cwd

* put </IfModule> on its own line in test case

* Revert test file to status in master.

* Call augeas load before reparsing modules to pick up the changes

* fix grep, tail, and mod_ssl location on centos

* strip the trailing whitespace from fedora

* just use LooseVersion in test

* call apache2ctl on debian systems

* Use sudo for apache2ctl command

* add check to make sure we're getting a version

* Add boolean so we don't warn on debian/ubuntu before trying to enable mod_ssl

* Reduce warnings while testing by setting mock _openssl_version.

* Make sure we're not throwing away any unwritten changes to the config

* test last warning case for coverage

* text changes for clarity
2020-03-23 16:49:52 -07:00
schoen
3bfaf41d3d Merge pull request #7849 from TechplexEngineer/patch-1
Fix plugin links
2020-03-16 11:51:45 -07:00
Brad Warren
06599a1e18 Cleanup more pylint issues (#7848)
This PR builds on #7657 and cleans up additional unnecessary pylint comments and some stray comments referring to pylint: disable comments that have been deleted that I didn't notice in my review of that PR.

* Remove stray pylint link.

* Cleanup more pylint comments

* Cleanup magic_typing imports

* Remove unneeded pylint: enable comments
2020-03-16 09:43:48 -07:00
schoen
30ec4cafe1 Merge pull request #7797 from g6123/nginx-utf8
Use UTF-8 encoding for nginx plugin
2020-03-13 17:07:40 -07:00
schoen
c6d35549d6 Merge branch 'master' into nginx-utf8 2020-03-13 16:28:24 -07:00
Blake Bourque
9a256ca4fe Fix plugin links 2020-03-13 15:26:15 -04:00
Adrien Ferrand
809cb516c9 Fix acme compliance to RFC 8555 (#7176)
This PR is an alternative to #7125.

Instead of disabling the strict mode on Pebble, this PR fixes the JWS payloads regarding RFC 8555 to be compliant, and allow certbot to work with Pebble v2.1.0+.

* Fix acme compliance to RFC 8555.

* Working mixin

* Activate back pebble strict mode

* Use mixin for type

* Update dependencies

* Fix also in fields_to_partial_json

* Update pebble

* Add changelog
2020-03-13 09:56:35 -07:00
Adrien Ferrand
07abe7a8d6 Reimplement tls-alpn-01 in acme (#6886)
This PR is the first part of work described in #6724.

It reintroduces the tls-alpn-01 challenge in the `acme` module, which was introduced by #5894 and reverted by #6100. It was removed in the past because some tests showed that with the `1.0.2` branch of OpenSSL, the self-signed certificate containing the authorization key is sent to the requester even if the ALPN protocol `acme-tls/1` was not declared as supported by the requester during the TLS handshake.

However, recent discussions led to the conclusion that this behavior is not a security issue: first, it is consistent with the behavior of servers that do not support ALPN at all, and second, it cannot cause a tls-alpn-01 challenge to be validated in this kind of corner case.

On top of the original modifications from #5894, I merged the code to be up-to-date with our `master`, and fixed tests to match the recent change of no longer displaying the `keyAuthorization` in the deserialized JSON form of an ACME challenge.

I also moved the logic that verifies whether ALPN is available on the current system, and therefore whether the tls-alpn-01 challenge can be used, into a dedicated static function `is_available` in `acme.challenge.TLSALPN01`. This function is used in the related tests to skip them when necessary, and will be used in the future by Certbot plugins to enable or disable the tls-alpn-01 logic, depending on the OpenSSL version available to Python.
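
As a rough illustration of that kind of capability check (the real `is_available` implementation inspects the OpenSSL bindings used by `acme` and may differ), the standard library exposes a simple flag for whether the linked OpenSSL supports ALPN:

```python
import ssl


def alpn_probably_available() -> bool:
    # ssl.HAS_ALPN is True when Python's OpenSSL build supports ALPN
    # (OpenSSL 1.0.2+); a sketch of the idea, not the acme module's code.
    return ssl.HAS_ALPN
```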

* Reimplement TLS-ALPN-01 challenge and standalone TLS-ALPN server from #5894.

* Setup a class method to check if tls-alpn-01 is supported.

* Add potential missing parameter in validation for tls-alpn

* Improve comments

* Make a class private

* Handle old versions of openssl that do not terminate the handshake when they should do.

* Add changelog

* Explicitly close the TLS connection by the book.

* Remove unused exception

* Fix lint
2020-03-12 13:53:19 -07:00
osirisinferi
2fd85a4f36 Add serial number to certificates output (#7842)
Fixes #7835

I had to mock out `get_serial_from_cert` to keep a test from failing, because `cert_path` was itself mocked in `test_report_human_readable`.

Also, I kept the same style for the serial number as the recent Let's Encrypt e-mail: lowercase hexadecimal without a `0x` prefix and without colons every 2 chars. Shouldn't be a problem to change the format if required.
2020-03-12 09:37:49 -07:00
Adrien Ferrand
44b97df4e9 Exposes environment variable to let hooks scripts know when the last challenge is handled (#7837)
Fixes #5484 

This PR makes Certbot expose two new environment variables in the auth and cleanup hooks of the `manual` plugin:
* `CERTBOT_REMAINING_CHALLENGES` contains the number of challenges that remain after the current one (so it equals 0 when the script is called for the last challenge)
* `CERTBOT_ALL_DOMAINS` contains a comma-separated list of all domains concerned by a challenge for the current certificate

With these variables, a hook script can know when it is run for the last time and then trigger appropriate finalizers for all challenges that have been executed. This is particularly useful for certificates with many domains validated via DNS-01 challenges: instead of waiting on each hook execution to check that the relevant DNS TXT record has been inserted, the last hook can verify all domains in one run.
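
As a loose illustration, a Python auth hook could use the new variables like this (the two helper functions are placeholders, not part of Certbot; `CERTBOT_DOMAIN` and `CERTBOT_VALIDATION` are the existing manual-plugin variables):

```python
#!/usr/bin/env python3
import os


def publish_txt_record(domain, validation):
    # Placeholder: create the _acme-challenge TXT record for this domain.
    print("TODO: publish TXT for _acme-challenge.%s = %s" % (domain, validation))


def wait_for_dns_propagation(domains):
    # Placeholder: poll DNS until the TXT records for all domains are visible.
    print("TODO: wait until records for %s have propagated" % ", ".join(domains))


publish_txt_record(os.environ["CERTBOT_DOMAIN"], os.environ["CERTBOT_VALIDATION"])

# Only wait once, after the record for the final challenge has been published.
if int(os.environ["CERTBOT_REMAINING_CHALLENGES"]) == 0:
    wait_for_dns_propagation(os.environ["CERTBOT_ALL_DOMAINS"].split(","))
```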

* Inject environment variables in manual scripts about remaining challenges

* Adapt tests

* Less variables and less lines

* Update manual.py

* Update manual_test.py

* Add documentation

* Add changelog
2020-03-12 09:29:03 -07:00
radek-sprta
78168a5248 Add CloudDNS to third-party plugins (#7840) 2020-03-11 13:27:19 -07:00
Brad Warren
69aec55ead Remove --no-site-packages outside of certbot-auto. (#7832) 2020-03-09 13:05:35 -07:00
Brad Warren
7f63141e41 Add changes to the correct changelog entry (#7833)
https://github.com/certbot/certbot/pull/7742 and https://github.com/certbot/certbot/pull/7738 landed after our 1.2.0 release, but the 1.2.0 changelog entry was modified instead of the one for master/1.3.0.

This PR moves the changelog entries to the 1.3.0 section.
2020-03-06 09:46:30 -08:00
Brad Warren
d72a1a71d2 Fix issues with Azure Pipelines (#7838)
This PR fixes two issues.

First, it fixes #7814 by removing our tests on Windows Server 2012. I also added the sentence "Certbot supports Windows Server 2016 and Windows Server 2019." to https://community.letsencrypt.org/t/beta-phase-of-certbot-for-windows/105822.

Second, it fixes the test failures which can be seen at https://dev.azure.com/certbot/certbot/_build/results?buildId=1309&view=results by no longer manually installing our own version of Python and instead using the one provided by Azure.

These small changes are in the same PR because I wanted to fix test failures ASAP and `UsePythonVersion` is not available on Windows 2012. See https://github.com/certbot/certbot/pull/7641#discussion_r358510854.

You can see tests passing with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=1311&view=results.

* stop testing on win2012

* switch to UsePythonVersion
2020-03-05 11:50:52 -08:00
ohemorange
68f4ae12be Merge pull request #7831 from certbot/candidate-1.3.0
Update files from 1.3.0 release
2020-03-03 17:34:31 -08:00
Brad Warren
9483b33ec1 Release version v1.3.0 2020-03-03 13:29:23 -08:00
Brad Warren
144d4f2b44 Bump version to 1.4.0 2020-03-03 12:43:04 -08:00
Brad Warren
e362948d45 Add contents to certbot/CHANGELOG.md for next version 2020-03-03 12:43:03 -08:00
Brad Warren
6edb4e1a39 Release 1.3.0 2020-03-03 12:43:02 -08:00
Brad Warren
b1fb3296e9 Update changelog for 1.3.0 release 2020-03-03 12:36:36 -08:00
Brad Warren
3147026211 Check OCSP as part of determining if the certificate is due for renewal (#7829)
Fixes #1028.

Doing this now because of https://community.letsencrypt.org/t/revoking-certain-certificates-on-march-4/.

The new `ocsp_revoked_by_paths` function  is taken from https://github.com/certbot/certbot/pull/7649 with the optional argument removed for now because it is unused.

This function was added in this PR because `storage.py` uses `self.latest_common_version()` to determine which certificate should be looked at for determining renewal status at 9f8e4507ad/certbot/certbot/_internal/storage.py (L939-L947)

I think this is unnecessary and you can just look at the currently linked certificate, but I don't think we should be changing the logic that code has always had now.
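
Loosely, the renewal decision described here becomes something like the following (names and structure are illustrative; the real logic lives in Certbot's storage/renewal code):

```python
def due_for_renewal(near_expiry, cert_path, chain_path, ocsp_revoked_by_paths):
    # Revocation is checked in addition to the existing expiry-based check,
    # so a revoked-but-not-yet-expired certificate is renewed early.
    return near_expiry or ocsp_revoked_by_paths(cert_path, chain_path)
```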

* Check OCSP status as part of determining to renew

* add integration tests

* add ocsp_revoked_by_paths
2020-03-03 11:07:15 -08:00
Michael Brown
9f8e4507ad Document safe and simple usage by services without root privileges (#7821)
Certificates are public information by design: they are provided by
web servers without any prior authentication required.  In a public
key cryptographic system, only the private key is secret information.

The private key file is already created as accessible only to the root
user with mode 0600, and these file permissions are set before any key
content is written to the file.  There is no window within which an
attacker with access to the containing directory would be able to read
the private key content.

Older versions of Certbot (prior to 0.29.0) would create private key
files with mode 0644 and rely solely on the containing directory
permissions to restrict access.  We therefore cannot (yet) set the
relevant default directory permissions to 0755, since it is possible
that a user could install Certbot, obtain a certificate, then
downgrade to a pre-0.29.0 version of Certbot, then obtain another
certificate.  This chain of events would leave the second
certificate's private key file exposed.

As a compromise solution, document the fact that it is safe for the
common case of non-downgrading users to change the permissions of
/etc/letsencrypt/{live,archive} to 0755, and explain how to use chgrp
and chmod to make the private key file readable by a non-root service
user.

This provides guidance on the simplest way to solve the common problem
of making keys and certificates usable by services that run without
root privileges, with no requirement to create a custom (and hence
error-prone) executable hook.

Remove the existing custom executable hook example, so that the
documentation contains only the simplest and safest way to solve this
very common problem.

Signed-off-by: Michael Brown <mbrown@fensystems.co.uk>
2020-02-27 16:44:23 -08:00
Brad Warren
50ea608553 Don't run advanced tests on PRs. (#7820)
When I wrote https://github.com/certbot/certbot/pull/7813, I didn't understand the default behavior for pull requests if you don't specify `pr` in the yaml file. According to https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=azure-devops&tabs=yaml#pr-triggers:

> If no pr triggers appear in your YAML file, pull request builds are automatically enabled for all branches...

This is not the behavior we want. This PR fixes the problem by disabling builds on PRs.

You should be able to see this working because the advanced tests should not run on this PR but they did run on https://github.com/certbot/certbot/pull/7811.
2020-02-27 15:07:33 -08:00
Brad Warren
fa67b7ba0f Remove codecov (#7811)
After getting a +1 from everyone on the team, this PR removes the use of `codecov` from the Certbot repo because we keep having problems with it.

Two noteworthy things about this PR are:

1. I left the text at 4ea98d830b/.azure-pipelines/INSTALL.md (add-a-secret-variable-to-a-pipeline-like-codecov_token) because I think it's useful to document how to set up a secret variable in general.
2. I'm not sure what the text "Option -e makes sure we fail fast and don't submit to codecov." in `tox.cover.py` refers to, but it seems incorrect since `-e` isn't accepted or used by the script, so I just deleted the line.

As part of this, I said I'd open an issue to track setting up coveralls (which seems to be the only real alternative to codecov) which is at https://github.com/certbot/certbot/issues/7810.

With my change, failure output looks something like:
```
$ tox -e py27-cover
...
Name                                                         Stmts   Miss  Cover   Missing
------------------------------------------------------------------------------------------
certbot/certbot/__init__.py                                      1      0   100%
certbot/certbot/_internal/__init__.py                            0      0   100%
certbot/certbot/_internal/account.py                           191      4    98%   62-63, 206, 337
...
certbot/tests/storage_test.py                                  530      0   100%
certbot/tests/util_test.py                                     374     29    92%   211-213, 480-484, 489-499, 504-511, 545-547, 552-554
------------------------------------------------------------------------------------------
TOTAL                                                        14451    647    96%
Command '['/path/to/certbot/dir/.tox/py27-cover/bin/python', '-m', 'coverage', 'report', '--fail-under', '100', '--include', 'certbot/*', '--show-missing']' returned non-zero exit status 2
Test coverage on certbot did not meet threshold of 100%.
ERROR: InvocationError for command /Users/bmw/Development/certbot/certbot/.tox/py27-cover/bin/python tox.cover.py (exited with code 1)
_________________________________________________________________________________________________________________________________________________________ summary _________________________________________________________________________________________________________________________________________________________
ERROR:   py27-cover: commands failed
```
I printed the exception just so we're not throwing away information.

I think it's also possible we fail for a reason other than the threshold not meeting the percentage, but I've personally never seen this; `coverage report` output is not being captured, so hopefully that would inform devs if something else is going on, and saying something like "Test coverage probably did not..." seems like overkill to me personally.

* remove codecov

* remove unused variable group

* remove codecov.yml

* Improve tox.cover.py failure output.
2020-02-27 14:44:39 -08:00
Brad Warren
6309ded92f Remove references to deprecated flags in Certbot. (#7509)
Related to https://github.com/certbot/certbot/pull/7482, this removes some references to deprecated options in Certbot.

The only references I didn't remove were:

* In `certbot/tests/testdata/sample-renewal*` which contains a lot of old values and I think there's even some value in keeping them so we know if we make a change that suddenly causes old renewal configuration files to error.
* In the Apache and Nginx plugins and I created https://github.com/certbot/certbot/issues/7508 to resolve that issue.
2020-02-27 14:43:28 -08:00
m0namon
5a4f158c55 Merge pull request #7541 from certbot/no-client-plugins
Fix docstring
2020-02-27 14:36:59 -08:00
Peter Dräxler
bc5b079b2a Add a paragraph about Docker & Certbot to README (#22)
This partly addresses issue certbot-docker#2
2020-02-27 11:23:18 -08:00
Brad Warren
a2be8e1956 Fix tests on macOS Catalina (#7794)
This PR fixes the failures that can be seen at https://dev.azure.com/certbot/certbot/_build/results?buildId=1184&view=results.

You can see this code running on macOS Catalina at https://dev.azure.com/certbot/certbot/_build/results?buildId=1192&view=results.
2020-02-27 10:50:20 -08:00
Brad Warren
2f737ee292 Change how _USE_DISTRO is set for mypy (#7804)
If you run `mypy --platform darwin certbot/certbot/util.py` you'll get:
```
certbot/certbot/util.py:303: error: Name 'distro' is not defined
certbot/certbot/util.py:319: error: Name 'distro' is not defined
certbot/certbot/util.py:369: error: Name 'distro' is not defined
```
This is because mypy's logic for handling platform specific code is pretty simple and can't figure out what we're doing with `_USE_DISTRO` here. See https://mypy.readthedocs.io/en/stable/common_issues.html#python-version-and-system-platform-checks for more info.

Setting `_USE_DISTRO` to the result of `sys.platform.startswith('linux')` solves the problem without changing the overall behavior of our code here though.

This fixes part of https://github.com/certbot/certbot/issues/7803, but there's more work to be done on Windows.
2020-02-27 10:49:50 -08:00
Brad Warren
8c75a9de9f Remove unused notify code. (#7805)
This code is unused and hasn't been modified since 2015 except for various times our files have been renamed. Let's remove it.
2020-02-27 10:47:56 -08:00
Brad Warren
24aa1e9127 update letstest reqs (#7809)
I don't fully understand why, but since I updated my macbook to macOS Catalina, the test script currently fails to run for me with the versions of our dependencies we have pinned. Updating the dependencies solves the problem though and you can see Travis also successfully running tests with these new dependencies at https://travis-ci.com/certbot/certbot/builds/150573696.
2020-02-27 10:47:43 -08:00
Brad Warren
f4c0a9fd63 Split advanced pipeline (#7813)
I want to do what I did in https://github.com/certbot/certbot/pull/7733 to our Azure Pipelines setup, but unfortunately this isn't currently possible. The only filters available for service hooks for the "build completed" trigger are the pipeline and build status. See 
![Screen Shot 2020-02-26 at 3 04 56 PM](https://user-images.githubusercontent.com/6504915/75396464-64ad0780-58a9-11ea-97a1-3454a9754675.png)

To accomplish this, I propose splitting the "advanced" pipeline into two cases. One is for builds on protected branches where we want to be notified if they fail, while the other is just used to manually run tests on certain branches.
2020-02-27 10:43:41 -08:00
m0namon
f169c37153 Merge pull request #7742 from osirisinferi/force-non-restrictive-umask
Force non restrictive umask when creating challenge directory in Apache plugin
2020-02-26 17:09:20 -08:00
cumul0529
a489079208 Update parser test to better assert logging output 2020-02-25 13:26:36 +09:00
cumul0529
ddf68aea80 Update comment in testdata file 2020-02-25 13:21:10 +09:00
cumul0529
2ae090529e Fixed typo & some trivial documentation change 2020-02-25 13:18:03 +09:00
Brad Warren
4ea98d830b remove _internal docs (#7801) 2020-02-24 21:31:16 +01:00
martin-c
4fd04366aa Fix issue #7165 in _create_challenge_dirs(), attempt to fix pylint errors (#7568)
* fix issue #7165 by checking if directory exists before trying to create it, fix possible pylint issues in webroot.py

* fix get_chall_pref definition

* Update CHANGELOG.md

* Update CHANGELOG.md

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-23 22:14:51 +01:00
alexzorin
2633c3ffb6 acme: ignore params in content-type check (#7342)
* acme: ignore params in content-type check

Fixes the warning in #7339
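
A small sketch of the idea, assuming a dict-like headers object (this is not the acme module's actual helper): compare only the media type and ignore any parameters such as `charset`.

```python
def content_type_matches(headers, expected="application/json"):
    # "application/json; charset=utf-8" should still match "application/json".
    media_type = headers.get("Content-Type", "").split(";")[0].strip().lower()
    return media_type == expected.lower()

# Usage sketch:
# content_type_matches({"Content-Type": "application/json; charset=utf-8"})  # -> True
```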

* Suppress coverage complaint in test

* Update CHANGELOG

* Repair symlink

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-23 21:49:42 +01:00
cumul0529
5b29e4616c Add simple comments 2020-02-24 05:30:54 +09:00
cumul0529
32904d8c9e Add TestCase.assertLogs() backport for Python 2.7 2020-02-24 05:13:34 +09:00
cumul0529
d68f37ae88 Add this change to CHANGELOG.md 2020-02-24 02:56:43 +09:00
cumul0529
b3071aab29 Add my name to AUTHORS.md
:)
2020-02-24 02:52:41 +09:00
cumul0529
2aac24c982 Trivial code clean-up 2020-02-24 02:49:08 +09:00
cumul0529
20df5507ae Add logging test for _parse_files() 2020-02-24 02:48:55 +09:00
cumul0529
36311a276b Add test case for _parse_ssl_options() 2020-02-24 02:46:27 +09:00
cumul0529
22685ef86f Remove unicode_support/ path in test case 2020-02-24 02:23:46 +09:00
cumul0529
c3cfd412c9 Replace deprecated logger.warn() with logger.warning() 2020-02-24 01:47:12 +09:00
Seth Schoen
0b21e716ca Fix lint problems with long lines 2020-02-24 01:45:23 +09:00
cumul
8b90b55518 Added test for valid/invalid unicode characters 2020-02-24 01:35:00 +09:00
cumul
247d9cd887 Use io module instead of codecs
See https://mail.python.org/pipermail/python-list/2015-March/687124.html
2020-02-24 01:29:37 +09:00
cumul
d6ef34a03e Use UTF-8 encoding for nginx plugin 2020-02-24 01:25:16 +09:00
osirisinferi
9819443440 Add test 2020-02-22 15:22:27 +01:00
Raklyon
84b57fac93 Refactor cli.py, splitting in it smaller submodules (#6803)
* Refactor cli.py into a package with submodules

* Added unit tests for helpful module in cli.

* Fixed linter errors

* Fixed pylint issues

* Updated changelog.md

* Fixed test failing and mypy error. Appeared a new pylint error (seems to be in conflict with mypy)

mypy require zope.interface to be imported but when imported it is not used and pylint throws an error.

* Fixed pylint errors

* Apply changes to cli since last merge from master (efc8d49806)

* Fix lint

* Remaining lint errors

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-21 21:30:58 +01:00
Brad Warren
7d79c91e9b Move our macOS tests to Azure Pipelines (#7793)
[Our macOS tests are failing](https://travis-ci.com/certbot/certbot/builds/149965318) again this time due to the problem described at https://travis-ci.community/t/macos-build-fails-because-of-homebrew-bundle-unknown-command/7296/14.

I tried adding `update: true` to the Homebrew config as described in that thread, but [it didn't work](https://travis-ci.com/certbot/certbot/builds/150070374). I also tried updating the macOS image we use which [didn't work](https://travis-ci.com/certbot/certbot/builds/150072389).

Since we continue to have problems with macOS on Travis, let try moving the tests to Azure Pipelines.

* test macos

* Remove Travis macOS setup

* add displayName
2020-02-21 11:18:53 -08:00
Brad Warren
c883efde0f add pgp key docs (#7765)
Fixes #7613.
2020-02-20 14:35:47 -08:00
Brad Warren
42dda355c5 Correct AutoHSTS docs (#7767)
domains is a list of strings, not a single string.

* Correct AutoHSTS docs.

* Fix Apache enable_autohsts docs.
2020-02-18 14:54:07 -08:00
Brad Warren
99b1538d0a Fix spurious pylint errors. (#7780)
This fixes (part of) the problem identified in https://github.com/certbot/certbot/pull/7657#issuecomment-586506340.

When I tested our pylint setup on Python 3.5.9, 3.6.9, or 3.6.10, tests failed with:
```
************* Module acme.challenges
acme/acme/challenges.py:57:15: E1101: Instance of 'UnrecognizedChallenge' has no 'jobj' member (no-member)
************* Module acme.jws
acme/acme/jws.py:28:16: E1101: Class 'Signature' has no '_orig_slots' member (no-member)
```
These errors did not occur for me on Python 3.6.7 or Python 3.7+.

You also cannot run our lint setup on Python 2.7 because our pinned version of pylint's dependency `astroid` does not support Python 2. Because of this, `pylint` is not installed in the virtual environment created by `tools/venv.py` and our [`lint` environment in tox specifies that Python 3 should be used](fd64c8c33b/tox.ini (L132)).

I tried updating pylint and its dependencies to fix the problem, but the errors still occur, so I think adding these disable comments back on these lines is the best fix for now.
2020-02-18 11:55:48 -08:00
Brad Warren
fd64c8c33b Remove letshelp-certbot (#7761)
* remove references to letshelp

* remove letshelp files

* Remove line continuation

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-02-14 17:19:19 -08:00
Brad Warren
3f52695ec2 more robustly stop patches (#7763) 2020-02-14 17:18:53 -08:00
Adrien Ferrand
fc7e5e8e60 Remove useless pylint error suppression directives (#7657)
As pylint evolves, its accuracy improves, and several pylint error suppressions (`# pylint: disable=ERROR`) added to the certbot codebase months or years ago are no longer needed to make it happy.

There is a (disabled by default) pylint check to detect useless suppressions (pylint-ception: `useless-suppression`). It does not work perfectly (it also has false positives), but it is a good start for cleaning up the codebase.

This PR removes several of these useless suppressions as detected by the current pylint version we use.

* Remove useless suppress

* Remove useless lines
2020-02-13 13:56:16 -08:00
m0namon
bcaee66b0a Merge pull request #7766 from certbot/min-pyparsing-version
Clarify the minimum pyparsing version
2020-02-12 13:26:10 -08:00
Brad Warren
df584a3b90 Remove _internal from docstring. 2020-02-12 13:12:03 -08:00
Brad Warren
7d540fc33a update pyparsing comment 2020-02-11 14:44:37 -08:00
Brad Warren
605ef40656 Remove duplicate pyparsing pin 2020-02-11 14:20:29 -08:00
Brad Warren
b8856ac810 Fix unpinned tests (#7760)
Our nightly tests failed last night due to a new release of `virtualenv` and `pip`'s lack of dependency resolution: https://travis-ci.com/certbot/certbot/jobs/285797857#L280. It looks like we were not the only ones affected by this problem: https://github.com/pypa/virtualenv/issues/1551

This fixes the problem by using `-I` to skip the logic where `pip` decides a dependency is already satisfied and has it reinstall/update the packages passed to `pip` and all of their dependencies.

You can see our nightly tests passing with this change at https://github.com/certbot/certbot/runs/439231061.
2020-02-11 12:05:29 -08:00
Brad Warren
02bf7d7dfc Print script output in case of a failure. (#7759)
These tests failed at https://travis-ci.com/certbot/certbot/jobs/285202481 but do not include any output from the script about what went wrong because the string created from `subprocess.CalledProcessError` does not include the value of `output`.

This PR fixes that by printing these values which `pytest` will include in the output if the test fails.
2020-02-10 11:01:17 -08:00
Joona Hoikkala
e6f050dbe9 Move ocsp.py to public api (#7744)
We should move ocsp.py to the public API, as upcoming OCSP prefetching functionality in the Apache plugin relies on it, and since the plugins are not released in lockstep with Certbot core, we need to be careful when changing those APIs.

* Move ocsp.py to public api

* Fix type annotations, move to pointing to an interface and fix linting

* Add certbot.ocsp to documentation table of contents

* Modify tests to reflect the changes in ocsp.py

* Add changelog entry

* Fix notAfter mock for tests
2020-02-10 09:52:42 -08:00
Brad Warren
5607025e9b Really remove old docs link from README (#7758) 2020-02-07 12:58:15 -08:00
Brad Warren
7cc6cf2604 Remove link to letsencrypt readthedocs (#7757)
After a brief discussion in Mattermost, I shut down letsencrypt.readthedocs.io. Turns out we were linking to it in our README here so let's remove the broken link.

I didn't update the link to point to one of the readthedocs projects we still have because our main Certbot docs are self-hosted rather than being on readthedocs.
2020-02-07 11:04:07 -08:00
Brad Warren
86a6cc53cf Remove text that certbot.tests.utils isn't public (#7754) 2020-02-07 09:08:41 +01:00
Brad Warren
1859fb059d Don't display todo comments in docs (#7753)
Currently if you go to https://certbot.eff.org/docs/api/certbot.crypto_util.html, there is a todo comment displayed at the top of the page. These todos were written for developers, not users, so I do not think they should be shown from our documentation.

This PR makes the quick and easy fix of configuring Sphinx not to show these todo items. I created #7752 to track removing all of these todos from our docstrings and disabling the Sphinx todo extension.

* Set todo_include_todos=False in sphinx-quickstart

* Remove todos from existing docs.
2020-02-06 15:39:47 -08:00
ohemorange
c5a2ba03da Merge pull request #7735 from certbot/apache-parser-v2
[Apache v2] Merge apache-parser-v2 feature branch back to master
2020-02-06 15:29:28 -08:00
schoen
995e70542a Merge pull request #7738 from osirisinferi/nginx-hostname
[nginx] Parse $hostname in `server_name`
2020-02-06 14:44:03 -08:00
OsirisInferi
4f80f8b910 Fixing existing tests 2020-02-06 21:24:25 +01:00
OsirisInferi
0e03f82733 Remove todo:: 2020-02-06 21:12:17 +01:00
OsirisInferi
5035a510a2 Add test for $hostname parsing 2020-02-06 21:10:41 +01:00
Adrien Ferrand
ef388a309f Merge pull request #7751 from Pilifer/master
Don't verify certificate in HTTP01Response.simple_verify (certbot#6614)
2020-02-06 16:58:39 +01:00
Filip Lajszczak
c98183c998 restore CHANGELOG in root directory 2020-02-06 15:27:20 +00:00
Filip Lajszczak
2b051dd197 Merge branch 'master' of https://github.com/certbot/certbot 2020-02-06 15:14:17 +00:00
ohemorange
bca73f9932 Grammar improvements (#18)
Update the README with improved grammar.
2020-02-05 16:08:57 -08:00
Brad Warren
7da5196206 Add triggers for only a single CI system (#7748)
* Configure travis-test to only run on Travis.

* Configure azure-test to only run on Azure.

* Add docs and comments to keep it up-to-date.
2020-02-05 23:49:01 +01:00
Brad Warren
cc764b65c1 Set recreate = true in tox.ini. (#7746)
Fixes #7745.
2020-02-05 14:37:39 -08:00
Adrien Ferrand
7b35abbcb4 Windows installer integration tests (#7724)
As discussed in #7539, we need proper tests of the Windows installer itself in order to verify that all the logic contained in a production-grade runtime of Certbot on Windows is correctly set up by each version of the installer, across a variety of Windows OSes.

This PR handles this requirement. The new `windows_installer_integration_tests` module in `certbot-ci` will:
* run the given Windows installer
* check that Certbot is properly installed and working
* check that the scheduled renew task is set up
* check that the scheduled task actually launches the Certbot renew logic

The Windows nightly tests are updated accordingly, in order to have the tests run on Windows Server 2012R2, 2016 and 2019.

These tests will evolve as we add more logic to the installer.

* Configure an integration test testing the windows installer

* Write the test module

* Configurable installer path, prepare azure pipelines

* Fix option

* Update test_main.py

* Add confirmation for this destructive test

* Use regex to validate certbot --version output

* Explicit dependency on a log output

* Use an exception to ask confirmation

* Use --allow-persistent-changes
2020-02-05 14:12:29 -08:00
Brad Warren
6601d03ce8 Merge pull request #7743 from certbot/candidate-1.2.0
Candidate 1.2.0
2020-02-05 13:48:51 -08:00
OsirisInferi
d3a4b8fd8c Missing import 2020-02-05 22:27:12 +01:00
OsirisInferi
f3ed133744 Wrap makedirs() within exception handlers 2020-02-05 22:17:29 +01:00
m0namon
1a2189f4df Merge pull request #7729 from certbot/fix-nginx-typo
Fix a typo in Nginx
2020-02-04 17:00:13 -08:00
Erica Portnoy
a180d5d5c9 Release version v1.2.0 2020-02-04 15:34:00 -08:00
Erica Portnoy
6a4b610269 Bump version to 1.3.0 2020-02-04 14:01:04 -08:00
Erica Portnoy
97ae63efa6 Add contents to certbot/CHANGELOG.md for next version 2020-02-04 14:01:03 -08:00
Erica Portnoy
3907b53b4b Release 1.2.0 2020-02-04 14:01:02 -08:00
Erica Portnoy
6c5959d892 Update changelog for 1.2.0 release 2020-02-04 13:46:57 -08:00
OsirisInferi
601a114d1b Update changelog 2020-02-04 19:47:27 +01:00
OsirisInferi
86926dff92 Use unrestrictive umask for challenge directory 2020-02-04 19:27:27 +01:00
OsirisInferi
9b35dbf2be Forgot to remove a breakpoint() statement 2020-02-02 22:31:05 +01:00
OsirisInferi
05e35ff2e0 Add CHANGELOG entry 2020-02-02 21:59:03 +01:00
OsirisInferi
7d0651c315 Parse $hostname in server_name 2020-02-02 21:56:09 +01:00
Brad Warren
174fa0e05c Turn off Travis notifications in test branches. (#7733)
When I want to manually run the full test suite to test something, I've been manually deleting our notification setup from `.travis.yml` to avoid spamming IRC with my personal test failures.

This PR sets this behavior up to happen automatically by turning off IRC notifications in test branches. You can see this working by noticing the IRC notification section in the bottom of the config for this PR at https://travis-ci.com/certbot/certbot/builds/146827907/config and the fact that it is absent from a `test-` branch based on this one at https://travis-ci.com/certbot/certbot/jobs/282059094/config.
2020-01-30 13:26:39 -08:00
Brad Warren
8d9943cb08 Update instructions about how to build docs (#7605) 2020-01-30 11:47:48 -08:00
Brad Warren
715899d5a8 Merge pull request #7731 from certbot/master_to_ap2
[Apache v2] Merge master to apache-parser-v2
2020-01-30 10:21:16 -08:00
Joona Hoikkala
882335c7ec Merge remote-tracking branch 'origin/master' into ap2_to_master 2020-01-30 17:08:16 +02:00
Brad Warren
35fa4c0457 Add space between words. 2020-01-29 15:30:51 -08:00
ohemorange
11e402893f Remove SSLCompression off line from all config options (#7726)
Based on discussion at https://github.com/certbot/certbot/pull/7712#discussion_r371451761.

* Remove SSLCompression off line from all config options

* Update changelog
2020-01-29 15:21:17 -08:00
Brad Warren
2338ab36fd Add backwards compatibility docs (#7611)
Fixes #7463.

* Add backwards compatibility docs.

* Exclude certbot-auto
2020-01-27 13:13:38 -08:00
Cameron Steel
e3c996de10 dns-cloudflare: Implement limited-scope API Tokens (#7583)
A while ago Cloudflare added support for limited-scope API Tokens in place of using a global API key, but support for them in cloudflare/python-cloudflare took a while to get through.

In summary, this PR:
- Implements token functionality through the INI file parameter `dns_cloudflare_api_token` (in addition to the traditional `dns_cloudflare_email` and `dns_cloudflare_api_key`). This needed a more advanced parameter validator than the built-in `required_variables` mechanism (a rough sketch follows after this list).
- Updates the docs to reflect the new option, needed token permissions, and version details of the `cloudflare` module
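
A rough sketch of what such a validator might look like (the function itself is hypothetical; only the three credential key names come from this PR):

```python
def validate_cloudflare_credentials(creds: dict) -> None:
    """Accept either a scoped API token, or the legacy email + global key pair."""
    has_token = bool(creds.get("dns_cloudflare_api_token"))
    has_legacy = bool(creds.get("dns_cloudflare_email")) and \
        bool(creds.get("dns_cloudflare_api_key"))
    if not (has_token or has_legacy):
        raise ValueError(
            "Set dns_cloudflare_api_token, or both dns_cloudflare_email "
            "and dns_cloudflare_api_key."
        )
```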

* Update python-cloudflare version

* Add Cloudflare API Token support to certbot-dns-cloudflare

* Add token-specific errors to certbot-dns-cloudflare

* Tidy up certbot-dns-cloudflare

* Implement Cloudflare API Tokens in testing for certbot-dns-cloudflare(needs work)

* Further tidying of certbot-dns-cloudflare

* Update CHANGELOG with Cloudflare API Tokens implementation

* Improve testing of certbot-dns-cloudflare

* Improve certbot-dns-cloudflare test formatting

* Further improve testing for certbot-dns-cloudflare

* Change needed permissions for token

* Add documentation regarding python-cloudflare version

* Fix changelog, references to python-cloudflare and docs

* Fix behaviour when domain does not match cloudflare root domain. Improve error handling.

* Improve testing

* Improve hints and error handling
2020-01-24 15:25:03 -08:00
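The PR above mentions needing a validator beyond the built-in `required_variables` mechanism, since the credentials file must now contain either a token or the legacy email/key pair. A minimal sketch of that kind of check, using hypothetical helper and parameter names (the plugin's actual internals may differ):
```
from typing import Optional

# Hypothetical validator sketch: accept either an API token or the traditional
# email + global API key pair, but reject an incomplete or mixed configuration.
def validate_cloudflare_credentials(token: Optional[str],
                                    email: Optional[str],
                                    api_key: Optional[str]) -> None:
    if token:
        if email or api_key:
            raise ValueError(
                "dns_cloudflare_api_token should not be combined with "
                "dns_cloudflare_email / dns_cloudflare_api_key")
        return
    if not (email and api_key):
        raise ValueError(
            "Either dns_cloudflare_api_token, or both dns_cloudflare_email "
            "and dns_cloudflare_api_key, must be set")

# Example: token-only credentials pass; a partial legacy pair would raise.
validate_cloudflare_credentials(token="0123456789abcdef", email=None, api_key=None)
```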
Brad Warren
b8a9dd75eb Update dns-lexicon version. (#7723) 2020-01-25 00:02:57 +01:00
Brad Warren
2072599bd7 Unpin Python 3.4 dependencies (#7709)
* Unpin dependencies pinned back for py3.4 support.

* update pinned packages

* run build.py

* Update boto3 and deps to work with requests
2020-01-24 23:02:54 +01:00
ohemorange
b1a8e7175b Disable old SSL versions and ciphersuites to follow Mozilla recommendations in Apache (#7712)
Part of #7204.

Makes the smaller changes described at https://github.com/certbot/certbot/issues/7204#issuecomment-571838185 to disable many old ciphersuites and TLS versions < 1.2. Does not add checks for OpenSSL version or modify session tickets.

Since Apache uses TLS protocol blacklisting instead of whitelisting (as NGINX does), we may not additionally need to determine whether the server supports TLS 1.3 and turn it on or off based on the Apache version.

* Update SSL versions and ciphersuites based on Mozilla intermediate recommendations for apache

* Update constants with hashes of new config files

* Update changelog
2020-01-24 13:37:42 -08:00
Brad Warren
1e2f70b17a Drop Python 3.4 support (#7721)
Fixes #7393.

* Remove Python 3.4 classifiers

* Remove unneeded typing dependency

* Exclude Python 3.4 in python_requires

* Remove Python 3.4 deprecation warning

* update changelog
2020-01-24 12:32:07 -08:00
ohemorange
896c1e0b66 Remove ECDHE-RSA-AES128-SHA from NGINX ciphers list (#7719)
As mentioned in https://github.com/certbot/certbot/pull/7712#discussion_r370419867, it's time to remove this ciphersuite now that Windows 2008 R2 and Windows 7 are EOLed.

* Remove ECDHE-RSA-AES128-SHA from NGINX ciphers list to celebrate Windows 2008 R2 deprecation

* Update changelog
2020-01-24 10:09:28 -08:00
Hugo van Kemenade
2f24726d4c Fix collections.abc imports for Python 3.9 (#7707)
* Fix collections.abc imports for Python 3.9

* Update AUTHORS.md

* No longer ignore collections.abc deprecation warning

* Update changelog

* Remove outdated comment

* Disabling no-name-in-module not needed as linting is on Python 3
2020-01-24 14:13:58 +01:00
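The gist of the fix above is importing the abstract base classes from `collections.abc` rather than relying on the deprecated aliases in `collections`, which newer Pythons remove. A small illustration:
```
# Deprecated form (works on older Pythons via aliases in `collections`,
# but the aliases are deprecated and removed in newer releases):
#     from collections import Mapping
# Forward-compatible form:
from collections.abc import Mapping

def is_mapping(value) -> bool:
    """Return True for dict-like objects."""
    return isinstance(value, Mapping)

assert is_mapping({"a": 1})
assert not is_mapping(["a", 1])
```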
Amjad Mashaal
5f315b46e9 Update documentation files to remove claiming support for Python 3.4 (#7395) 2020-01-23 16:35:39 -08:00
Josh McCullough
a342eb5546 fixes #1948 -- MD5 on FIPS systems (#7708)
* use MD5 in non-security mode to get around FIPS issue

* update CHANGELOG

* add myself to AUTHORS

* ignore hashlib params
2020-01-23 10:58:36 -08:00
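A hedged sketch of the general technique the PR describes, not necessarily Certbot's exact code: request MD5 with `usedforsecurity=False` so FIPS-enabled Python builds allow it for non-security uses, and fall back to the plain constructor on builds that do not accept the flag.
```
import hashlib

def md5_not_for_security(data: bytes):
    """Return an MD5 hash object for non-security purposes (e.g. cache keys).

    On FIPS-enabled builds, MD5 is only allowed when explicitly marked as not
    used for security; older builds do not accept the keyword at all.
    """
    try:
        # Python 3.9+ and some distro-patched builds accept usedforsecurity.
        return hashlib.md5(data, usedforsecurity=False)
    except TypeError:
        # Builds without the parameter: fall back to the plain constructor.
        return hashlib.md5(data)

print(md5_not_for_security(b"example").hexdigest())
```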
Brad Warren
90fd1afc38 unpin macos (#7705) 2020-01-22 08:20:52 +01:00
Brad Warren
4473fd25cb Don't run Python 3.5 tests twice. (#7704) 2020-01-22 08:18:21 +01:00
Brad Warren
a6772043d6 Minor release script improvements (#7697)
* Do not use git diff.

* Add a warning on exit.
2020-01-21 15:53:31 -08:00
Amjad Mashaal
7234d8922d Drop Travis tests for Python 3.4 (#7394) 2020-01-21 15:34:34 -08:00
Brad Warren
07dc2400eb Downgrade NSIS and upgrade Python (#7702)
* Add --allow-downgrade to chocolatey command.

* Upgrade tests to use Python 3.8.1.
2020-01-21 23:53:19 +01:00
Ville Skyttä
1702cb90fd Spelling and grammar fixes (#7695) 2020-01-17 18:55:51 +01:00
Ville Skyttä
fcdeaf48f2 Include added/deleted TXT record name in RFC 2136 debug log (#7696) 2020-01-17 16:42:10 +02:00
Brad Warren
702ad99090 Don't run some tests multiple times. (#7685) 2020-01-16 23:08:38 +01:00
Brad Warren
5f0703cbf1 Fix minimum certbot version in plugins (#7684)
Fixes the problem found at https://github.com/certbot/certbot/pull/7682#discussion_r367140415.
2020-01-16 13:54:25 -08:00
Brad Warren
9a3186a67e Cleanup disabled warnings list in pytest.ini. (#7690) 2020-01-16 22:47:23 +01:00
Brad Warren
91ce42ce9c Do not list the name twice. (#7689) 2020-01-16 22:44:08 +01:00
Brad Warren
097c76f512 Merge branch 'master' into no-client-plugins 2020-01-16 11:56:41 -08:00
osirisinferi
6e07e8b5c0 Add missing directory field (#7687)
Fixes #7683.

* Add missing directory field to error message

* Added change to CHANGELOG.md
2020-01-16 11:31:22 -08:00
Brad Warren
fd91643a7f Merge pull request #7682 from certbot/candidate-1.1.0
Update files from 1.1.0 release
2020-01-15 16:14:22 -08:00
Brad Warren
78624a2b8c Release version v1.1.0 2020-01-14 11:49:36 -08:00
Brad Warren
619b17753e Bump version to 1.2.0 2020-01-14 10:52:05 -08:00
Brad Warren
60cd920bcb Add contents to certbot/CHANGELOG.md for next version 2020-01-14 10:52:05 -08:00
Brad Warren
f512b5eaa2 Release 1.1.0 2020-01-14 10:52:03 -08:00
Brad Warren
9800e5d8fc Update changelog for 1.1.0 release 2020-01-14 10:41:32 -08:00
J0WI
695107bc98 Update Python to 3.8 (#16)
https://github.com/certbot/certbot/pull/7392
2020-01-13 10:39:36 -08:00
Adrien Ferrand
e84ed49c56 Fix certbot-auto regarding python 3.4 -> python 3.6 migration for CentOS 6 users (#7519)
* Revert "Add back Python 3.4 support (#7510)"

This reverts commit 9b848b1d65.

* Fix certbot-auto

* Use a more consistent way to enable rh-python36

* Avoid calling CompareVersions unnecessarily

* Control rh-python36 exit code

* Fix travis config

* Remove vscode config

* Ignore vscode

* Fix merge conflicts regarding #7587 (#70)

* Add changelog entry

* Finish sentence

* Update certbot/CHANGELOG.md

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update comments

* Improve warning message

* Update changelog

Co-authored-by: Joona Hoikkala <joohoi@users.noreply.github.com>
2020-01-13 09:24:41 +01:00
J0WI
fb323e083a Update Alpine to 3.11 (#14) 2020-01-10 17:36:58 -08:00
Brad Warren
ceea41c1e2 Do not document private members (#7675)
It looks like we're currently documenting functions that are marked private (prefixed with an underscore) such as https://certbot.eff.org/docs/api/certbot.crypto_util.html#certbot.crypto_util._load_cert_or_req. I do not think we should do this because the functionality is private, should not be used, and including it in our docs just adds visual noise.

This PR stops us from documenting private code and fixes up `tools/sphinx-quickstart.sh` so we don't document it in future modules.

* Do not document private code.

* Don't document private members in the future.
2020-01-10 16:48:01 -08:00
Vladimir Varlamov
456122e342 improve help about supply selecting in delete command (#7673)
for #6625
2020-01-09 11:34:04 -08:00
Adrien Ferrand
84c1b912d9 Implement a sunset mechanism in certbot-auto for systems not supported anymore (#7587)
* Sunset mechanism

* Simplify code

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update template

* Deprecate for all RHEL/CentOS 6 32-bit flavors

* Add a wrapper to uname to do tests on fake 32-bit versions

* Replace all occurrences

* Add some tests about sunset mechanism

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Various corrections

* Recreate script

* Update comment position

* Test also install only

* Fix docker

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* What is the error command doing here?

* Fix permissions

* Rebuild script

* Add changelog

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update changelog

* Trigger CI

* Handle old venv path

* Modify test

* Fix test error detection from subpaths

* Edit echo

* Use set -e

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Corrections

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-01-08 16:36:34 +01:00
sydneyli
0e78436b05 [Apache v2] Add apacheconfig as a dependency (#7643)
* Add apacheconfig as a dependency.

* Change apacheconfig to a dev dependency

* Bump apacheconfig dep to 0.3.1
2020-01-07 09:57:43 -08:00
ohemorange
9b5b27597c Merge pull request #7618 from certbot/ap2_merge_master
[Apache v2] Merge changes from master to apache-parser-v2
2020-01-06 17:54:29 -08:00
Joona Hoikkala
3b065238b3 Modifications needed for merging to master 2020-01-06 17:19:33 +02:00
Joona Hoikkala
0f5bda4ff9 Merge remote-tracking branch 'origin/master' into ap2_merge_master 2020-01-06 17:17:31 +02:00
Joona Hoikkala
70be256c66 Fix gating to ensure that no parsernode functionality is run unless explicitly requested (#7654) 2020-01-02 15:44:11 -08:00
Adrien Ferrand
fda655370a Update CHANGELOG.md (#7659) 2020-01-02 23:44:16 +01:00
Adrien Ferrand
887d72fd5d Remove POST-as-GET fallback to GET (#6994) 2020-01-02 12:48:55 -08:00
Brad Warren
5713decf23 Update other links to point to new GH org. (#13) 2020-01-02 21:41:10 +01:00
Guillaume Vincent
c194381f04 Fix broken link (#12) 2020-01-02 14:27:47 -05:00
Brad Warren
6d527bcc42 Include header files for compilation. (#7650) 2019-12-19 14:02:24 -08:00
Barbz
6ca80b7ce8 How to uninstall certbot-auto (#7648) 2019-12-19 13:30:13 -08:00
Joona Hoikkala
4401eacaac Implement get_virtual_hosts() for ParserNode interfaces (#7564) 2019-12-19 10:51:41 +02:00
Brad Warren
f520d482fd Remove other 3.8-dev references. (#7646) 2019-12-18 23:00:49 +01:00
Adrien Ferrand
b5a31bec03 Add docker-compose as a requirement of certbot-ci (#7120)
Fixes #7110 

This PR declares docker-compose as a requirement for certbot-ci. This way, a recent version of docker-compose is installed in the standard virtual environment set up by `tools/venv.py` and `tools/venv3.py`, and so is available to the pytest integration tests, whether run from `tox` or with the virtual environment activated.

* Add docker-compose as a dev dependency and declare it in certbot-ci requirements

* Update docker-compose 1.25.0
2019-12-18 13:21:54 -08:00
Brad Warren
6ac7aabaf7 Remove warning about dev preview (#7640) 2019-12-18 11:14:58 -08:00
Brad Warren
24fdea5fd8 discourage dns plugins (#7639) 2019-12-18 11:13:57 -08:00
Adrien Ferrand
4a906484ee Execute Windows installer integration tests on several Windows versions (#7641)
This PR extends the installer tests on Azure Pipelines to run the integration tests against a certbot instance installed with the Windows installer on several Windows versions, matching the scope of versions Certbot supports:
* Windows Server 2012 R2
* Windows Server 2016
* Windows Server 2019

One can see the result on: https://dev.azure.com/adferrand/certbot/_build/results?buildId=311

* Try specific installer-build step

* Install Python manually

* Add tests on windows 2019
2019-12-16 16:03:39 -08:00
Adrien Ferrand
9e5bca4bbf Lint certbot code on Python 3, and update Pylint to the latest version (#7551)
Part of #7550

This PR makes appropriate corrections to run pylint on Python 3.

Why not keep the dependencies unchanged and just run pylint on Python 3?
Because the old version of pylint breaks horribly on Python 3 due to an unsupported version of astroid.

Why update pylint + astroid to the latest version?
Because this version fixes some internal errors occurring while linting the Certbot code, and is also ready to run gracefully on Python 3.8.

Why upgrade mypy?
Because the old version does not support the new version of astroid required to run pylint correctly.

Why not upgrade mypy to its latest version?
Because that version includes a new typeshed release that adds a lot of new type definitions and brings dozens of new errors to the Certbot codebase. I would like to fix that in a future PR.

That said, the work has been to find the correct set of new dependency versions, configure pylint with sane settings for our situation, disable irrelevant linting errors, and then fix (or ignore for good reason) the remaining mypy errors.

I also made PyLint and MyPy checks run correctly on Windows.

* Start configuration

* Reconfigure travis

* Suspend a check specific to python 3. Start fixing code.

* Repair call_args

* Fix return + elif lints

* Reconfigure development to run mainly on python3

* Remove incompatible Python 3.4 jobs

* Suspend pylint in some assertions

* Remove pylint in dev

* Take first mypy that supports typed-ast>=1.4.0 to limit the migration path

* Various return + else lint errors

* Find a set of deps that is working with current mypy version

* Update local oldest requirements

* Remove all current pylint errors

* Rebuild letsencrypt-auto

* Update mypy to fix pylint with new astroid version, and fix mypy issues

* Explain type: ignore

* Reconfigure tox, fix none path

* Simplify pinning

* Remove useless directive

* Remove debugging code

* Remove continue

* Update requirements

* Disable unsubscriptable-object check

* Disable one check, enabling two more

* Plug certbot dev version for oldest requirements

* Remove useless disable directives

* Remove useless no-member disable

* Remove no-else-* checks. Use elif in symmetric branches.

* Add back assertion

* Add new line

* Remove unused pylint disable

* Remove other pylint disable
2019-12-10 14:12:50 -08:00
Joona Hoikkala
5c588a6f8d [Apache v2] Implement parsed_files (#7562)
* Implement parsed_files

* Add parsed_files stub to ApacheParserNodes and fix assertions

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Add more descriptive comments

* Update certbot-apache/certbot_apache/augeasparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot-apache/certbot_apache/dualparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>
2019-12-10 13:20:00 -05:00
Adrien Ferrand
e048da1e38 Reorganize imports (#7616)
* Isort execution

* Fix pylint, adapt coverage

* New isort

* Fix magic_typing lint

* Second round

* Fix pylint

* Third round. Store isort configuration

* Fix latest mistakes

* Other fixes

* Add newline

* Fix lint errors
2019-12-09 15:50:20 -05:00
Brad Warren
34b568f366 Don't list adding type annotations as a PR req. (#7627) 2019-12-04 20:22:10 +01:00
ohemorange
b99bfe8ab7 Merge pull request #7622 from certbot/candidate-1.0.0
Release 1.0
2019-12-04 14:15:49 -05:00
Brad Warren
5da61564d9 Don't list DNS plugins as alpha quality. (#7624)
They should be considered production quality like our other packaged code.
2019-12-03 19:56:16 -08:00
Brad Warren
b45f79d0ab fix bad links in docs (#7623)
This PR fixes the failures at https://travis-ci.com/certbot/website/builds/139193502#L1316.

Once this PR lands, I'll update certbot/website#508 to include this commit.
2019-12-03 11:05:23 -08:00
Brad Warren
b92eb6f620 Release version v1.0.0 2019-12-03 10:17:50 -08:00
Brad Warren
3cfa63483d Add full API documentation (#7614)
A lot of Certbot's files don't have API documentation, which this PR fixes. To do this, I ran the following from the top-level certbot directory:
```
sphinx-apidoc -Me -o docs/api certbot
```
I then merged the resulting `modules.rst` file with `docs/api.rst`.
2019-12-03 09:54:37 -08:00
Brad Warren
27d6f62a96 update external plugin (#7604)
The old plugin at https://github.com/marcan/certbot-external says it's obsolete and points people to https://github.com/EnigmaBridge/certbot-external-auth. The new plugin is also an installer.

I also removed the reference to #2782 about us adding similar functionality since that's been done for a long time. We could reference our manual plugin instead, but I think that would devalue their plugin a bit, which I don't think is necessary or correct since it has different features.
2019-12-03 09:52:05 -08:00
Brad Warren
e32033f1ec document main (#7610)
I deleted the exceptions because I think it's not feasible to document the possible exceptions raised by all of Certbot.
2019-12-03 09:51:43 -08:00
Brad Warren
d2bad803f3 Bump version to 1.1.0 2019-12-03 09:27:30 -08:00
Brad Warren
5debf7af7e Add contents to certbot/CHANGELOG.md for next version 2019-12-03 09:27:30 -08:00
Brad Warren
6102cc440b Release 1.0.0 2019-12-03 09:27:28 -08:00
Brad Warren
bc80195a58 Update changelog for 1.0.0 release 2019-12-03 09:20:30 -08:00
Felix Schwarz
2008e3cc77 acme/setup.py: comment refers to "PyOpenSSL" not "mock" (#7619) 2019-12-03 01:16:41 +01:00
Joona Hoikkala
6148e5c355 [Apache v2] Move the apachectl parsing to apache_util (#7569)
* Move the Apache CLI parsing to apache_util

* Fix test mocks

* Address review comments

* Fix the parsernode metadata dictionary
2019-12-02 18:00:53 -05:00
Adrien Ferrand
4c652b9c82 Upgrade to pywin32>=227 (#7615)
The current version of pywin32 used in certbot (225) does not have wheels available for Python 3.8. Installing certbot for development in this case requires building from source. On Windows, this implies having a Visual Studio C++ environment up and ready, which is absolutely not fun.

Let's upgrade to pywin32 227, which provides these wheels for all Python versions from 3.5 up to the current dev status of 3.9.
2019-12-02 13:39:31 -08:00
Adrien Ferrand
ea44834c41 Fix docker build regarding the new certbot layout (#11)
This PR adds appropriate corrections to certbot dockerfile to work with the new layout (moving certbot python project in its own subdirectory).

This PR has been tested with success using the `build.sh` script on a fake v1.0 version of certbot published on my fork (https://github.com/adferrand/certbot/releases/tag/v1.0) instead of archives from `certbot/certbot`.
2019-12-02 12:39:55 -08:00
Joona Hoikkala
06fdbf2a55 [Apache v2] Implement find_ancestors (#7561)
* Implement find_ancestors

* Create the node properly and add assertions

* Update certbot-apache/certbot_apache/augeasparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Remove comment
2019-12-02 16:25:39 +02:00
Brad Warren
84b770b56e Defines the RenewableCert API (#7603)
This is my proposed fix for #7540. I would ideally like this to be included in our 1.0 release.

I came up with this design by adding all attributes used either in our own plugins, 3rd party plugins listed at https://certbot.eff.org/docs/using.html#third-party-plugins, or our public API code.

Although I think zope is unneeded nowadays, I initially tried to use it to define this interface since we already have it and it gives us a way to define expected attributes, but it doesn't work because zope interface objects also have a method called `names`, which conflicts with the API.

I talked about this with Adrien out of band and did some of my own research and there are some minor benefits with this new approach of using properties:

1. It's more conventional.
2. If you also change the implementation to inherit from the class, Python will error if all properties aren't defined.
3. The PEP 526 style type annotations with mypy seem to (currently) only be used to validate code using the class, not the class implementation itself. You can add a type annotation saying the class needs to have this attribute, never define it, and mypy won't complain.

With this new approach, I had to fix `names` because pylint was complaining that the arguments differed; however, we never used the optional parameter to `names` outside of tests, so I just deleted the code altogether.

* fixes #7540

* move to properties
2019-11-27 11:32:00 -08:00
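An illustration of the property-based approach described above, using hypothetical names rather than Certbot's actual `RenewableCert` interface: with an ABC whose expected attributes are abstract properties, Python refuses to instantiate an implementation that forgot to define one, which is the second benefit listed.
```
import abc

class CertLike(abc.ABC):
    """Hypothetical interface sketch: expected attributes declared as properties."""

    @property
    @abc.abstractmethod
    def cert_path(self) -> str:
        """Path to the certificate file."""

    @property
    @abc.abstractmethod
    def key_path(self) -> str:
        """Path to the private key file."""

class OnDiskCert(CertLike):
    """Concrete implementation that defines every abstract property."""

    def __init__(self, cert_path: str, key_path: str) -> None:
        self._cert_path = cert_path
        self._key_path = key_path

    @property
    def cert_path(self) -> str:
        return self._cert_path

    @property
    def key_path(self) -> str:
        return self._key_path

# A subclass that omitted one of the properties would raise TypeError here.
cert = OnDiskCert("/etc/letsencrypt/live/example.org/cert.pem",
                  "/etc/letsencrypt/live/example.org/privkey.pem")
```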
ohemorange
6c1dfe43c7 Refactor tests out of packaged module for apache plugin (#7607)
Part of #7593.

* Refactor tests out of packaged module for apache plugin

* Exclude pycache and .py[cod]

* Change tests path in tox.ini
2019-11-27 09:57:35 -08:00
ohemorange
a8e711d281 Refactor tests out of packaged module for nginx plugin (#7606)
* Refactor tests out of packaged module for nginx plugin

* Exclude pycache and .py[cod]
2019-11-26 17:45:18 -08:00
ohemorange
f36b93267c Exclude pycache and .py[cod] from certbot package (#7608) 2019-11-26 17:45:07 -08:00
ohemorange
d2b65b47f2 Refactor tests out of packaged module for acme plugin (#7600)
* Move acme tests to tests/ directory outside of acme module

* Fix call to messages_test in client_test

* Move test_util.py and testdata/ into tests/

* Update manifest to package tests

* Exclude pycache and .py[cod]
2019-11-26 15:25:41 -08:00
ohemorange
b624172f68 Refactor tests out of packaged module for dns plugins (#7599)
* Refactor tests out of module for certbot-dns-cloudflare

* Refactor tests out of module for certbot-dns-cloudxns

* Refactor tests out of module for certbot-dns-digitalocean

* Refactor tests out of module for certbot-dns-dnsimple

* Refactor tests out of module for certbot-dns-dnsmadeeasy

* Refactor tests out of module for certbot-dns-gehirn

* Refactor tests out of module for certbot-dns-google

* Refactor tests out of module for certbot-dns-linode

* Refactor tests out of module for certbot-dns-luadns

* Refactor tests out of module for certbot-dns-nsone

* Refactor tests out of module for certbot-dns-ovh

* Refactor tests out of module for certbot-dns-rfc2136

* Refactor tests out of module for certbot-dns-sakuracloud

* Refactor tests out of module for certbot-dns-route53

* Move certbot-dns-google testdata/ under tests/

* Use pytest for dns plugins

* Exclude pycache and .py[cod]
2019-11-26 15:25:28 -08:00
ohemorange
6d1472bf8c Implement redirect by default (#7595)
* Change redirect default to yes so that it happens automatically in noninteractive mode

* Update changelog
2019-11-25 18:53:20 -08:00
ohemorange
5c8083851a Fix refactor (#7597)
Clean up some places missed by #7544.

Found this when running test farm tests. They were working as of 5d90544, and I will be truly shocked if subsequent changes (all to the Windows installer) made them stop working.

* Release script needs to target new CHANGELOG location

* Clean up various other CHANGELOG path references

* Update windows paths for new certbot location

* Add certbot to packages list for windows installer
2019-11-25 18:24:20 -08:00
ohemorange
345bdb46e0 Update pull_request_template.md (#7596)
* Update pull_request_template.md

* Remove line breaks

GitHub seems to keep the line breaks rather than ignoring them, which makes the template format weirdly, so remove them.
2019-11-25 15:42:01 -08:00
ohemorange
e023f889ff Make the contents of the nginx plugin private (#7589)
Part of #5775.

* Create _internal folder certbot-nginx

* Move configurator.py to _internal

* Move constants.py to _internal

* Move display_ops.py to _internal

* Move http_01.py to _internal

* Move nginxparser.py to _internal

* Move obj.py to _internal

* Move parser_obj.py to _internal

* Move parser.py to _internal

* Update location and references for tls_configs

* exclude parser_obj from coverage
2019-11-25 14:30:24 -08:00
ohemorange
4abd81e218 Refactor certbot/ and certbot/tests/ to use the same structure as the other packages (#7544)
Summary of changes in this PR:
- Refactor files involved in the `certbot` module to be of a similar structure to every other package; that is, inside a directory inside the main repo root (see below).
- Make repo root README symlink to `certbot` README.
- Pull tests outside of the distributed module.
- Make `certbot/tests` not be a module so that `certbot` isn't added to Python's path for module discovery.
- Remove `--pyargs` from test calls, and make sure to call tests from repo root since without `--pyargs`, `pytest` takes directory names rather than package names as arguments.
- Replace mentions of `.` with `certbot` when referring to packages to install, usually editably.
- Clean up some unused code around executing tests in a different directory.
- Create public shim around main and make that the entry point.

New directory structure summary:
```
repo root ("certbot", probably, but for clarity all files I mention are relative to here)
├── certbot
│   ├── setup.py
│   ├── certbot
│   │   ├── __init__.py
│   │   ├── achallenges.py
│   │   ├── _internal
│   │   │   ├── __init__.py
│   │   │   ├── account.py
│   │   │   ├── ...
│   │   ├── ...
│   ├── tests
│   │   ├── account_test.py
│   │   ├── display
│   │   │   ├── __init__.py
│   │   │   ├── ...
│   │   ├── ... # note no __init__.py at this level
│   ├── ...
├── acme
│   ├── ...
├── certbot-apache
│   ├── ...
├── ...
```

* refactor certbot/ and certbot/tests/ to use the same structure as the other packages

* git grep -lE "\-e(\s+)\." | xargs sed -i -E "s/\-e(\s+)\./-e certbot/g"

* git grep -lE "\.\[dev\]" | xargs sed -i -E "s/\.\[dev\]/certbot[dev]/g"

* git grep -lE "\.\[dev3\]" | xargs sed -i -E "s/\.\[dev3\]/certbot[dev3]/g"

* Remove replacement of certbot into . in install_and_test.py

* copy license back out to main folder

* remove linter_plugin.py and CONTRIBUTING.md from certbot/MANIFEST.in because these files are not under certbot/

* Move README back into main folder, and make the version inside certbot/ a symlink

* symlink certbot READMEs the other way around

* move testdata into the public api certbot zone

* update source_paths in tox.ini to certbot/certbot to find the right subfolder for tests

* certbot version has been bumped down a directory level

* make certbot tests directory not a package and import sibling as module

* Remove unused script cruft

* change . to certbot in test_sdists

* remove outdated comment referencing a command that doesn't work

* Install instructions should reference an existing file

* update file paths in Dockerfile

* some package named in tox.ini were manually specified, change those to certbot

* new directory format doesn't work easily with pyargs according to http://doc.pytest.org/en/latest/goodpractices.html#tests-as-part-of-application-code

* remove other instance of pyargs

* fix up some references in _release.sh by searching for ' . ' and manual check

* another stray . in tox.ini

* fix paths in tools/_release.sh

* Remove final --pyargs call, and now-unnecessary call to modules instead of local files, since that's fixed by certbot's code being one layer deeper

* Create public shim around main and make that the entry point

* without pyargs, tests cannot be run from an empty directory

* Remove cruft for running certbot directly from main

* Have main shim take real arg

* add docs/api file for main, and fix up main comment

* Update certbot/docs/install.rst

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Fix comments in readthedocs requirements files to refer to current package

* Update .[docs] reference in contributing.rst

* Move plugins tests to certbot tests directory

* add certbot tests to MANIFEST.in so packagers can run python setup.py test

* move examples directory inside certbot/

* Move CHANGELOG into certbot, and create a top-level symlink

* Remove unused sys and logging from main shim

* nginx http01 test no longer relies on certbot plugins common test
2019-11-25 14:28:05 -08:00
ohemorange
d56cd4ef01 Make the contents of the DNS plugins private (#7580)
Part of #5775.

```
modify_item () {
    mkdir certbot-dns-$1/certbot_dns_$1/_internal
    git grep -l "from certbot_dns_$1 import dns_$1" | xargs sed -i "s/from certbot_dns_$1 import dns_$1/from certbot_dns_$1._internal import dns_$1/g"
    git grep -l "certbot_dns_$1\.dns_$1" | xargs sed -i "s/certbot_dns_$1\.dns_$1/certbot_dns_$1._internal.dns_$1/g"
    git checkout -- certbot-dns-$1/certbot_dns_$1/__init__.py
    echo '"""Internal implementation of \`~certbot_dns_$1.dns_$1\` plugin."""' > certbot-dns-$1/certbot_dns_$1/_internal/__init__.py
    mv certbot-dns-$1/certbot_dns_$1/dns_$1.py certbot-dns-$1/certbot_dns_$1/_internal
    git checkout -- CHANGELOG.md
    git status
    git add -A
    git commit -m "Move certbot-dns-$1 to _internal structure"
}
```

Structure now looks like this:
```
certbot-dns-cloudflare/
├── certbot_dns_cloudflare
│   ├── dns_cloudflare_test.py
│   ├── __init__.py
│   └── _internal
│       ├── dns_cloudflare.py
│       └── __init__.py
```

* Move certbot-dns-cloudflare to _internal structure

* Move certbot-dns-cloudxns to _internal structure

* Move certbot-dns-digitalocean to _internal structure

* Move certbot-dns-dnsimple to _internal structure

* Move certbot-dns-dnsmadeeasy to _internal structure

* Move certbot-dns-gehirn to _internal structure

* Move certbot-dns-google to _internal structure

* Move certbot-dns-linode to _internal structure

* Move certbot-dns-luadns to _internal structure

* Move certbot-dns-nsone to _internal structure

* Move certbot-dns-ovh to _internal structure

* Move certbot-dns-rfc2136 to _internal structure

* Move certbot-dns-sakuracloud to _internal structure

* Init file comments need to be comments

* Move certbot-dns-route53 to _internal structure

* Fix comment in route53 init
2019-11-25 10:26:05 -08:00
ohemorange
8139689d4c Make the contents of the apache plugin private (#7579)
Part of #5775.

Tree:
```
certbot-apache/certbot_apache
├── __init__.py
├── _internal
│   ├── apache_util.py
│   ├── augeas_lens
│   │   ├── httpd.aug
│   │   └── README
│   ├── centos-options-ssl-apache.conf
│   ├── configurator.py
│   ├── constants.py
│   ├── display_ops.py
│   ├── entrypoint.py
│   ├── http_01.py
│   ├── __init__.py
│   ├── obj.py
│   ├── options-ssl-apache.conf
│   ├── override_arch.py
│   ├── override_centos.py
│   ├── override_darwin.py
│   ├── override_debian.py
│   ├── override_fedora.py
│   ├── override_gentoo.py
│   ├── override_suse.py
│   └── parser.py
└── tests
    ├── ...
```

* Create _internal folder for certbot_apache

* Move apache_util.py to _internal

* Move display_ops.py to _internal

* Move override_centos.py to _internal

* Move override_gentoo.py to _internal

* Move override_darwin.py to _internal

* Move override_suse.py to _internal

* Move override_debian.py to _internal

* Move override_fedora.py to _internal

* Move override_arch.py to _internal

* Move parser.py to _internal

* Move obj.py to _internal

* Move http_01.py to _internal

* Move entrypoint.py to _internal

* Move constants.py to _internal

* Move configurator.py to _internal

* Move augeas_lens to _internal

* Move options-ssl-apache.conf files to _internal

* move augeas_lens in MANIFEST

* Clean up some stray references to certbot_apache that could use _internal

* Correct imports and lint
2019-11-25 09:44:40 -08:00
ohemorange
a27b1137a5 Remove unused nginx docs (#7576)
Part of #5775. We don't use these docs anywhere, so delete them.

Removes:
- `certbot-nginx/readthedocs.org.requirements.txt`
- `certbot-nginx/docs/` folder
- docs include in `MANIFEST.in`
- docs dependencies in `setup.py`

* Remove unused nginx docs

* Add changelog entry about the removal
2019-11-25 09:18:12 -08:00
Brad Warren
5809aa6a2c remove unused route53 tools (#7586) 2019-11-22 22:24:51 +01:00
ohemorange
d8ca555eed Remove DNS plugin API docs. (#7578)
Replace DNS plugins' API documentation with a note that plugins adhere to certbot's plugin interface.
2019-11-22 12:58:06 -08:00
ohemorange
bd35e71b5c Remove unused certbot-compatibility-test docs (#7577)
Part of #5775. We don't use these docs anywhere, so delete them.

Removes:
- `certbot-compatibility-test/readthedocs.org.requirements.txt`
- `certbot-compatibility-test/docs/` folder
- docs include in `MANIFEST.in`
- docs dependencies in `setup.py`
2019-11-22 12:54:18 -08:00
ohemorange
70e4cb7853 Remove unused apache docs (#7575)
Part of #5775. We don't use these docs anywhere, so delete them.

Removes:
- `certbot-apache/readthedocs.org.requirements.txt`
- `certbot-apache/docs/` folder
- docs include in `MANIFEST.in`
- docs dependencies in `setup.py`
2019-11-22 12:50:01 -08:00
Joona Hoikkala
ac1a60ff0b Implement add_child_comment (#7518) 2019-11-18 21:21:51 +02:00
sydneyli
b70f9c4744 [Apache v2] Initial ApacheParser skeleton (#7559)
* Fix metadata & primary references in Augeas tests.

When performing actions only on one of the trees in DualNodeParser, the two
trees get out-of-sync. Similarly, we can't expect that the metadata between
the two trees will remain the same.

Did a pass over the tests to re-wire metadata and primary usage.

* Add ApacheParser skeleton.

Fix plumbing in configurator & dualparser to initialize ApacheParser
alongside AugeasParser.

* Silence coverage reports for now
2019-11-15 12:22:18 +02:00
Joona Hoikkala
517ff5cb19 [Apache v2] Implement delete_child() (#7521)
* Implement delete_child

* Fix linter
2019-11-12 14:19:35 -08:00
Joona Hoikkala
d14eec9ecf [Apache v2] Implement save() and unsaved_files() (#7520)
* Implement save() and unsaved_files()

* Linter fix
2019-11-12 14:19:21 -08:00
Joona Hoikkala
bdf24d2bed Implement add_child_directive (#7517) 2019-11-12 14:19:02 -08:00
Brad Warren
2bc64183a8 fix docstring 2019-11-11 17:11:47 -08:00
Joona Hoikkala
578ca1c6af [Apache v2] Adding nodes 1/3 : add_child_block() (#7497)
* Implement add_child_block()

* Add comments and example

* Check augeas path inconsistencies in initialization
2019-11-11 11:33:14 -08:00
Joona Hoikkala
19de05c72f [Apache v2] Implement set_parameters() (#7461)
* find_comments implementation and AugeasCommentNode creation

* set_parameters implementation

* Change parameters to a property

* Remove parameters property setter

* More pythonic iteration handling
2019-11-06 10:33:24 -08:00
Brad Warren
a730b00a36 Release version v0.40.1 2019-11-05 19:47:14 -08:00
Erica Portnoy
5e01467e2c Release version v0.40.0 2019-11-05 15:08:42 -08:00
Joona Hoikkala
d645574839 [Apache v2] AugeasBlockNode find_comments() implementation (#7457)
* find_comments implementation and AugeasCommentNode creation

* Use dummy value for ancestor

* Add NotImplementedError when calling find_comments with exact parameter

* Remove parameter 'exact' from find_comments interface

* Fix comment
2019-10-30 11:03:23 -07:00
Joona Hoikkala
9b2322a573 Use dummy values for ancestor (#7462) 2019-10-25 14:54:28 -07:00
Joona Hoikkala
79caaa8e6f Merge pull request #7466 from certbot/update-apache-parser-v2-v2
Update the apache-parser-v2 branch
2019-10-25 18:56:47 +03:00
Brad Warren
8620dcf06f Merge branch 'master' into apache-parser-v2 2019-10-24 09:59:53 -07:00
Joona Hoikkala
3f36298716 [Apache v2] find_blocks and find_directives implementation (#7443)
* Implement AugeasDirectiveNode and AugeasBlockNode find and create functions

* Add tests for _aug_get_block_name
2019-10-21 12:52:00 -07:00
Oriol Teixidó
e9a9a180bb Multiarch (#3)
* Add one Dockerfile for each supported architecture

* Update multi arch hooks

* Create multi arch scripts

* Update README.md

* WIP. Use build args instead of multiple Dockerfiles in build script

* WIP. Fix typo mistake

* Use build args instead of multiple Dockerfiles in build script

* WIP. Build all the architectures in one DockerHub build

* Add arm64v8 architecture

* WIP. Testing build all the architectures in one DockerHub build

* Revert "WIP. Testing build all the architectures in one DockerHub build"

This reverts commit 94a89398a4120b183d2851ac7cb9c93db0e3d187.

* Refactor tag docker images in hooks/post_push files

* Use variables instead of positional arguments

* Export externally used variables

* Use ${variable//search/replace} instead of echo $variable | sed.

* Update README.md

* Add Cleanup in build.sh script

* Fix tagging error in post_push hook

* Add "-ex" flags to bash script

* Test tagging images in build hook

* Tagging in hook/post_build instead of hook/post_push

* Push built architecture dependent image

* Use Dockerfile argument instead of fixed value

* Fix typo

* Use parameter instead of global variable

* Use custom "hook/push" to prevent duplicated push

* Make a short doctype for each function declared in common
2019-10-11 17:07:45 +02:00
Adrien Ferrand
63c7dd109c Merge pull request #7435 from certbot/add-azure-pipelines
Add azure pipelines
2019-10-10 19:38:32 +02:00
Brad Warren
8f6fc67378 Merge branch 'master' into add-azure-pipelines 2019-10-08 17:05:12 -07:00
Erica Portnoy
67fddae90d Release version v0.39.0 2019-10-01 13:30:11 -07:00
Joona Hoikkala
a156d37ee1 Merge pull request #7400 from certbot/update-apache-parser-v2
Update apache parser v2
2019-09-26 10:49:36 +03:00
Brad Warren
1756ef8620 Merge branch 'master' into update-apache-parser-v2 2019-09-25 11:55:33 -07:00
Joona Hoikkala
feacbe9671 [Apache v2] DualParserNode implementation 3/3 (#7376)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Add DualBlockNode

* DualBlockNode find_ methods

* Address review comments

* Address review comments

* Simplify isPass

* Add explanation to _create_matching_list pydoc

* Remove unnecessary conditional block

* Address review comments
2019-09-23 16:27:48 -04:00
Joona Hoikkala
c224340330 [Apache v2] DualParserNode implementation 2/3 (#7375)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Add DualBlockNode

* Address review comments

* Address review comments

* Call the right assertion after name change

* Simplify isPass

* Add explanation to _create_matching_list pydoc

* Break when match was found
2019-09-19 17:44:50 -04:00
Joona Hoikkala
23fb6d2877 [Apache v2] DualParserNode implementation 1/3 (#7374)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Address review comments

* Address review comments

* Simplify isPass
2019-09-18 16:31:44 -04:00
Joona Hoikkala
9620cc75d4 [Apache v2] Allow initialization of ParserNode instances using metadata dictionary instead of required arguments (#7366)
Add a metadata keyword argument to the ParserNode interface, allowing the object to be initialized from the contents of the metadata, if the implementation allows it. As an example, the Augeas implementation needs nothing more than the Augeas DOM path of a configuration directive to populate the ParserNode instance with all data relevant to the DirectiveNode.

The checks also allow skipping the otherwise required keyword arguments if metadata is provided.

* Allow creating ParserNode instances using information from metadata dictionary

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Address review comments

* Fix filepath comment

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-09-09 12:09:09 -07:00
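A rough sketch of the metadata-based initialization described above, with made-up attribute and key names; the real Augeas nodes derive their data from the Augeas DOM path carried in the metadata, so the details will differ.
```
class DirectiveNodeSketch:
    """Illustrative node: built either from explicit kwargs or from metadata."""

    def __init__(self, name=None, ancestor=None, filepath=None, metadata=None):
        self.metadata = metadata or {}
        augeas_path = self.metadata.get("augeas_path")  # hypothetical key
        if augeas_path is not None:
            # With a DOM path in the metadata, the other attributes can be
            # looked up or derived instead of being passed in explicitly.
            self.name = name if name is not None else augeas_path.rsplit("/", 1)[-1]
            self.filepath = filepath  # a real implementation would resolve this from the path
            self.ancestor = ancestor
        else:
            # Without metadata, the usual keyword arguments remain required.
            if name is None or filepath is None:
                raise TypeError("name and filepath are required when no metadata is given")
            self.name = name
            self.filepath = filepath
            self.ancestor = ancestor


node = DirectiveNodeSketch(
    metadata={"augeas_path": "/files/etc/apache2/apache2.conf/directive[1]"})
print(node.name)  # "directive[1]"
```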
Brad Warren
7337f64180 Release version v0.38.0 2019-09-03 14:15:44 -07:00
Joona Hoikkala
af1c66b28f [Apache v2] Modifications to ParserNode interfaces (#7330)
This PR contains the changes requested in initial pre-review comments of #7308

Move properties to class pydocs in interfaces.py
Prefer class ABC register() functionality instead of class inheritance for interface classes
Add apache implementation specific functions to interfaces

* Move class argument definitions to class pydoc

* Add apache specific functionality to the interface

* Bring inheritance back

* Define initialization for different ParserNode classes

* Add parsernode utils to check keyword arguments and document the defaults in pydoc

* Fix pydocs and make BlockNode a child of DirectiveNode

* Refine docs, and remove unused __init__ from BlockNode

* Split parsernode util tests to their own respective file

* Skip cover for dummy calls to super

* Add types to method documentation

* Add documentation for children
2019-08-30 13:42:18 -07:00
J0WI
d296ef2dcd Update to Python 3.7 (#5)
certbot/certbot#6759
closes #4
2019-08-28 12:43:58 -07:00
Erica Portnoy
f64386c73c Release version v0.37.2 2019-08-21 16:13:06 -07:00
Joona Hoikkala
270754deff [Apache v2] New ParserNode interface abstraction (#7246)
* New ParserNode interface abstraction

* Add the test assertions, and fix interface

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Add dummy tests and change arguments to properties

* Add more context to comment property docstring

* Add documentation to the main docstring

* Streamline the parameter naming

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Add explicit instructions on how whitespace is treated in set_parameters

* Add the information about lookups being case insensitive.

* Add context about whitespacing to add_ - methods

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-08-13 09:22:51 -07:00
Joona Hoikkala
a83f9eb4e4 Merge pull request #7321 from certbot/fix-apache-parser-v2-cover
Fix apache-parser-v2 tests
2019-08-13 15:28:24 +03:00
Brad Warren
fed2264dac Merge branch 'master' into fix-apache-parser-v2-cover 2019-08-09 11:21:15 -07:00
Erica Portnoy
1666e85118 Release version v0.37.1 2019-08-08 17:58:41 -07:00
Brad Warren
db522aa155 Release version v0.37.0 2019-08-07 11:46:22 -07:00
Brad Warren
31a8d086fc Merge pull request #7289 from certbot/fix-apache-parser-v2
Fix AppVeyor on the apache-parser-v2 branch
2019-08-02 11:22:53 -07:00
Brad Warren
d0d7521215 Release version v0.36.0 2019-07-24 15:48:45 -07:00
Brad Warren
2fc6f6e619 Make deploy.sh executable. (#3) 2019-07-23 10:32:43 +02:00
J0WI
d8ab321894 Upgrade to Alpine 3.10 (#2)
certbot/certbot#7250
2019-07-22 17:07:14 -07:00
Adrien Ferrand
62b054f265 Create the deployment logic (#1)
* Integrate original adferrand/certbot-docker

* Make build.sh more symmetric for readability

* Update README.md

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update README.md

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Add post_push hooks to update the latest tag

* Create error on build

* Revert "Create error on build"

This reverts commit d578d67130d3a0c4db7756209b0ade52953041a3.

* Update deploy.sh

* Fix deploy.sh with hotfixes versions

* Fix deploy.sh toward certbot version and release branch

* Enable push
2019-07-18 15:45:27 +02:00
Brad Warren
1d1c096067 add readme 2019-06-03 16:04:45 -07:00
Brad Warren
bcffaab602 add LICENSE.txt 2019-06-03 15:59:05 -07:00
Giles Thomas
b27e5804b9 Added change description to CHANGELOG.md 2019-02-05 18:43:03 +00:00
Giles Thomas
4ca03aec8d Don't verify existing certificate in HTTP01Response.simple_verify (certbot#6614) 2019-02-05 18:37:09 +00:00
953 changed files with 20176 additions and 14479 deletions

View File

@@ -69,12 +69,12 @@ Access can be defined for all or only selected repositories, which is nice.
```
- Redirected to Azure DevOps, select the account created in _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same than the targetted github repo)
- Select the organization, and click "Create a new project" (let's name it the same than the targeted github repo)
- The Visibility is public, to profit from 10 parallel jobs
```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in term of beeing able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
Azure Pipelines needs access to the GitHub account (in term of being able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
```
_Done. We can move to pipelines configuration._

View File

@@ -0,0 +1,14 @@
# Advanced pipeline for running our full test suite on demand.
trigger:
# When changing these triggers, please ensure the documentation under
# "Running tests in CI" is still correct.
- test-*
pr: none
variables:
# We don't publish our Docker images in this pipeline, but when building them
# for testing, let's use the nightly tag.
dockerTag: nightly
stages:
- template: templates/stages/test-and-package-stage.yml

View File

@@ -1,20 +0,0 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
- test-*
- '*.x'
pr:
- test-*
# This pipeline is also nightly run on master
schedules:
- cron: "0 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
jobs:
# Any addition here should be reflected in the release pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml

View File

@@ -1,12 +1,7 @@
trigger:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
trigger: none
pr:
- apache-parser-v2
- master
- '*.x'
jobs:
- template: templates/tests-suite.yml
- template: templates/jobs/standard-tests-jobs.yml

View File

@@ -0,0 +1,18 @@
# Nightly pipeline running each day for master.
trigger: none
pr: none
schedules:
- cron: "30 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
variables:
dockerTag: nightly
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml

View File

@@ -1,13 +1,18 @@
# Release pipeline to build and deploy Certbot for Windows for GitHub release tags
# Release pipeline to run our full test suite, build artifacts, and deploy them
# for GitHub release tags.
trigger:
tags:
include:
- v*
pr: none
jobs:
# Any addition here should be reflected in the advanced pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
- template: templates/changelog.yml
variables:
dockerTag: ${{variables['Build.SourceBranchName']}}
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
parameters:
snapReleaseChannel: beta
- template: templates/stages/notify-failure-stage.yml

View File

@@ -1,14 +0,0 @@
jobs:
- job: changelog
pool:
vmImage: vs2017-win2016
steps:
- bash: |
CERTBOT_VERSION="$(python -c "import certbot; print(certbot.__version__)")"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: changelog
displayName: Publish changelog

View File

@@ -1,32 +0,0 @@
jobs:
- job: installer
pool:
vmImage: vs2017-win2016
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.7
architecture: x86
addToPath: true
- script: python windows-installer/construct.py
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: windows-installer
displayName: Publish Windows installer
- script: $(Build.ArtifactStagingDirectory)\certbot-beta-installer-win32.exe /S
displayName: Install Certbot
- script: |
python -m venv venv
venv\Scripts\python tools\pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run integration tests

View File

@@ -0,0 +1,89 @@
jobs:
- job: extended_test
variables:
- name: IMAGE_NAME
value: ubuntu-18.04
- name: PYTHON_VERSION
value: 3.8
- group: certbot-common
strategy:
matrix:
linux-py36:
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py37:
PYTHON_VERSION: 3.7
TOXENV: py37
linux-py37-nopin:
PYTHON_VERSION: 3.7
TOXENV: py37
CERTBOT_NO_PIN: 1
linux-boulder-v1-integration-certbot-oldest:
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-certbot-oldest:
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-integration-nginx-oldest:
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-nginx-oldest:
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-py27-integration:
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py27-integration:
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
nginx-compat:
TOXENV: nginx_compat
le-auto-oraclelinux6:
TOXENV: le_auto_oraclelinux6
docker-dev:
TOXENV: docker_dev
macos-farmtest-apache2:
# We run one of these test farm tests on macOS to help ensure the
# tests continue to work on the platform.
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.8
TOXENV: test-farm-apache2
farmtest-leauto-upgrades:
PYTHON_VERSION: 3.7
TOXENV: test-farm-leauto-upgrades
farmtest-certonly-standalone:
PYTHON_VERSION: 3.7
TOXENV: test-farm-certonly-standalone
farmtest-sdists:
PYTHON_VERSION: 3.7
TOXENV: test-farm-sdists
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml

View File

@@ -0,0 +1,221 @@
jobs:
- job: docker_build
pool:
vmImage: ubuntu-18.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- bash: set -e && tools/docker/build.sh $(dockerTag) $DOCKER_ARCH
displayName: Build the Docker images
# We don't filter for the Docker Hub organization to continue to allow
# easy testing of these scripts on forks.
- bash: |
set -e
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}')
docker save --output images.tar $DOCKER_IMAGES
displayName: Save the Docker images
# If the name of the tar file or artifact changes, the deploy stage will
# also need to be updated.
- bash: set -e && mv images.tar $(Build.ArtifactStagingDirectory)
displayName: Prepare Docker artifact
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: docker_$(DOCKER_ARCH)
displayName: Store Docker artifact
- job: docker_run
dependsOn: docker_build
pool:
vmImage: ubuntu-18.04
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_amd64
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- bash: |
set -ex
DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
for DOCKER_IMAGE in ${DOCKER_IMAGES}
do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
done
displayName: Run integration tests for Docker images
- job: installer_build
pool:
vmImage: vs2017-win2016
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.7
architecture: x86
addToPath: true
- script: python windows-installer/construct.py
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
win2016:
imageName: vs2017-win2016
pool:
vmImage: $(imageName)
steps:
- powershell: |
if ($PSVersionTable.PSVersion.Major -ne 5) {
throw "Powershell version is not 5.x"
}
condition: eq(variables['imageName'], 'vs2017-win2016')
displayName: Check Powershell 5.x is used in vs2017-win2016
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
# pip 9.0 provided by pipstrap is not able to resolve properly the pywin32 dependency
# required by certbot-ci: as a temporary workaround until pipstrap is updated, we install
# a recent version of pip, but we also to disable the isolated feature as described in
# https://github.com/certbot/certbot/issues/8256
- script: |
python -m venv venv
venv\Scripts\python -m pip install pip==20.2.3 setuptools==50.3.0 wheel==0.35.1
venv\Scripts\python tools\pip_install.py -e certbot-ci
env:
PIP_NO_BUILD_ISOLATION: no
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe
displayName: Run windows installer integration tests
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run certbot integration tests
- job: snaps_build
pool:
vmImage: ubuntu-18.04
timeoutInMinutes: 0
variables:
# Do not run the heavy non-amd64 builds for test branches
${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
ARCHS: amd64 arm64 armhf
${{ if startsWith(variables['Build.SourceBranchName'], 'test-') }}:
ARCHS: amd64
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadSecureFile@1
name: credentials
inputs:
secureFile: launchpad-credentials
- script: |
set -e
git config --global user.email "$(Build.RequestedForEmail)"
git config --global user.name "$(Build.RequestedFor)"
mkdir -p ~/.local/share/snapcraft/provider/launchpad
cp $(credentials.secureFilePath) ~/.local/share/snapcraft/provider/launchpad/credentials
python3 tools/snap/build_remote.py ALL --archs ${ARCHS}
displayName: Build snaps
- script: |
set -e
mv *.snap $(Build.ArtifactStagingDirectory)
mv certbot-dns-*/*.snap $(Build.ArtifactStagingDirectory)
displayName: Prepare artifacts
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: snaps
displayName: Store snaps artifacts
- job: snap_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends nginx-light snapd
python3 -m venv venv
venv/bin/python letsencrypt-auto-source/pieces/pipstrap.py
venv/bin/python tools/pip_install.py -U tox
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
sudo snap install --dangerous --classic snap/certbot_*_amd64.snap
displayName: Install Certbot snap
- script: |
set -e
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
displayName: Run tox
- job: snap_dns_run
dependsOn: snaps_build
pool:
vmImage: ubuntu-18.04
steps:
- script: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
displayName: Install dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- script: |
set -e
python3 -m venv venv
venv/bin/python letsencrypt-auto-source/pieces/pipstrap.py
venv/bin/python tools/pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set -e
sudo -E venv/bin/pytest certbot-ci/snap_integration_tests/dns_tests --allow-persistent-changes --snap-folder $(Build.SourcesDirectory)/snap --snap-arch amd64
displayName: Test DNS plugins snaps


@@ -0,0 +1,75 @@
jobs:
- job: test
variables:
PYTHON_VERSION: 3.8
strategy:
matrix:
macos-py27:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 2.7
TOXENV: py27
macos-py38:
IMAGE_NAME: macOS-10.15
PYTHON_VERSION: 3.8
TOXENV: py38
windows-py36:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.6
TOXENV: py36
windows-py37-cover:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.7
TOXENV: py37-cover
windows-integration-certbot:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.7
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-18.04
TOXENV: py27-{acme,apache,apache-v2,certbot}-oldest
linux-oldest-tests-2:
IMAGE_NAME: ubuntu-18.04
TOXENV: py27-{dns,nginx}-oldest
linux-py27:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: py27
linux-py36:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py38-cover:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.8
TOXENV: py38-cover
linux-py37-lint:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.7
TOXENV: lint
linux-py36-mypy:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.6
TOXENV: mypy
linux-integration:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-18.04
TOXENV: apache_compat
le-auto-centos6:
IMAGE_NAME: ubuntu-18.04
TOXENV: le_auto_centos6
apacheconftest:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: apacheconftest-with-pebble
nginxroundtrip:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: nginxroundtrip
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml


@@ -0,0 +1,19 @@
stages:
- stage: Changelog
jobs:
- job: prepare
pool:
vmImage: vs2017-win2016
steps:
# If we change the output filename from `release_notes.md`, it should also be changed in tools/create_github_release.py
- bash: |
set -e
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
artifact: changelog
displayName: Publish changelog


@@ -0,0 +1,99 @@
parameters:
- name: snapReleaseChannel
type: string
default: edge
values:
- edge
- beta
stages:
- stage: Deploy
jobs:
# This job relies on credentials used to publish the Certbot snaps. This
# credential file was created by running:
#
# snapcraft logout
# snapcraft login (provide the shared snapcraft credentials when prompted)
# snapcraft export-login --channels=beta,edge snapcraft.cfg
#
# Then the file was added as a secure file in Azure pipelines
# with the name snapcraft.cfg by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops
# including authorizing the file in all pipelines as described at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/secure-files?view=azure-devops#how-do-i-authorize-a-secure-file-for-use-in-all-pipelines.
#
# This file has a maximum lifetime of one year and the current
# file will expire on 2021-07-28, which is also tracked by
# https://github.com/certbot/certbot/issues/7931. The file will
# need to be updated before then to prevent automated deploys
# from breaking.
#
# Revoking these credentials can be done by changing the password of the
# account used to generate the credentials. See
# https://forum.snapcraft.io/t/revoking-exported-credentials/19031 for
# more info.
- job: publish_snap
pool:
vmImage: ubuntu-18.04
variables:
- group: certbot-common
steps:
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snaps
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snaps
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
set -e
mkdir -p .snapcraft
ln -s $(snapcraftCfg.secureFilePath) .snapcraft/snapcraft.cfg
for SNAP_FILE in snap/*.snap; do
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
done
displayName: Publish to Snap store
- job: publish_docker
pool:
vmImage: ubuntu-18.04
strategy:
matrix:
amd64:
DOCKER_ARCH: amd64
arm32v6:
DOCKER_ARCH: arm32v6
arm64v8:
DOCKER_ARCH: arm64v8
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: docker_$(DOCKER_ARCH)
path: $(Build.SourcesDirectory)
displayName: Retrieve Docker images
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
displayName: Load Docker images
- task: Docker@2
inputs:
command: login
# The credentials used here are for the shared certbotbot account
# on Docker Hub. The credentials are stored in a service account
# which was created by following the instructions at
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
# The name given to this service account must match the value
# given to containerRegistry below. "Grant access to all
# pipelines" should also be checked. To revoke these
# credentials, we can change the password on the certbotbot
# Docker Hub account or remove the account from the
# Certbot organization on Docker Hub.
containerRegistry: docker-hub
displayName: Login to Docker Hub
- bash: set -e && tools/docker/deploy.sh $(dockerTag) $DOCKER_ARCH
displayName: Deploy the Docker images


@@ -0,0 +1,19 @@
stages:
- stage: On_Failure
jobs:
- job: notify_mattermost
variables:
- group: certbot-common
pool:
vmImage: ubuntu-latest
steps:
- bash: |
set -e
MESSAGE="\
---\n\
##### Azure Pipeline
*Repo* $(Build.Repository.ID) - *Pipeline* $(Build.DefinitionName) #$(Build.BuildNumber) - *Branch/PR* $(Build.SourceBranchName)\n\
:warning: __Pipeline has failed__: [Link to the build](https://dev.azure.com/$(Build.Repository.ID)/_build/results?buildId=$(Build.BuildId)&view=results)\n\n\
---"
curl -i -X POST --data-urlencode "payload={\"text\":\"${MESSAGE}\"}" "$(MATTERMOST_URL)"
condition: failed()


@@ -0,0 +1,6 @@
stages:
- stage: TestAndPackage
jobs:
- template: ../jobs/standard-tests-jobs.yml
- template: ../jobs/extended-tests-jobs.yml
- template: ../jobs/packaging-jobs.yml


@@ -0,0 +1,57 @@
steps:
- bash: |
set -e
brew install augeas
condition: startswith(variables['IMAGE_NAME'], 'macOS')
displayName: Install MacOS dependencies
- bash: |
set -e
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python-dev \
gcc \
libaugeas0 \
libssl-dev \
libffi-dev \
ca-certificates \
nginx-light \
openssl
sudo systemctl stop nginx
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
displayName: Install Linux dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv. The option "-I" is set so when CERTBOT_NO_PIN is also
# set, pip updates dependencies it thinks are already satisfied to avoid some
# problems with its lack of real dependency resolution.
- bash: |
set -e
python letsencrypt-auto-source/pieces/pipstrap.py
python tools/pip_install.py -I tox virtualenv
displayName: Install runtime dependencies
- task: DownloadSecureFile@1
name: testFarmPem
inputs:
secureFile: azure-test-farm.pem
condition: contains(variables['TOXENV'], 'test-farm')
- bash: |
set -e
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
if [[ "${TOXENV}" == *"oldest"* ]]; then
tools/run_oldest_tests.sh
else
python -m tox
fi
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
displayName: Run tox


@@ -1,38 +0,0 @@
jobs:
- job: test
pool:
vmImage: vs2017-win2016
strategy:
matrix:
py35:
PYTHON_VERSION: 3.5
TOXENV: py35
py37-cover:
PYTHON_VERSION: 3.7
TOXENV: py37-cover
integration-certbot:
PYTHON_VERSION: 3.7
TOXENV: integration-certbot
PYTEST_ADDOPTS: --numprocesses 4
variables:
- group: certbot-common
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
- script: python tools/pip_install.py -U tox coverage
displayName: Install dependencies
- script: python -m tox
displayName: Run tox
# We do not require the codecov report upload to succeed. To avoid breaking the pipeline if
# something goes wrong, each command is suffixed with a fallback that hides any non-zero exit
# code and echoes an informative message instead.
- bash: |
curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
env:
CODECOV_TOKEN: $(codecov_token)
displayName: Publish coverage


@@ -1,18 +0,0 @@
coverage:
status:
project:
default: off
linux:
flags: linux
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
threshold: 0.1
base: auto
windows:
flags: windows
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
threshold: 0.1
base: auto

.editorconfig Normal file (18 lines added)

@@ -0,0 +1,18 @@
# https://editorconfig.org/
root = true
[*]
insert_final_newline = true
trim_trailing_whitespace = true
end_of_line = lf
[*.py]
indent_style = space
indent_size = 4
charset = utf-8
max_line_length = 100
[*.yaml]
indent_style = space
indent_size = 2

.envrc Normal file (12 lines added)

@@ -0,0 +1,12 @@
# This file is just a nicety for developers who use direnv. When you cd under
# the Certbot repo, Certbot's virtual environment will be automatically
# activated and then deactivated when you cd elsewhere. Developers have to have
# direnv set up and run `direnv allow` to allow this file to execute on their
# system. You can find more information at https://direnv.net/.
. venv3/bin/activate
# direnv doesn't support modifying PS1 so we unset it to squelch the error
# it'll otherwise print about this being done in the activate script. See
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
# prompt to change like it normally does, see
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
unset PS1

.gitignore vendored (16 lines added)

@@ -26,6 +26,7 @@ tags
\#*#
.idea
.ropeproject
.vscode
# auth --cert-path --chain-path
/*.pem
@@ -34,6 +35,7 @@ tags
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/
tests/letstest/venv3/
.venv
@@ -49,3 +51,17 @@ tests/letstest/venv/
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
# snap files
.snapcraft
parts
prime
stage
*.snap
snap-constraints.txt
qemu-*
certbot-dns*/certbot-dns*_amd64*.txt
certbot-dns*/certbot-dns*_arm*.txt
/certbot_amd64*.txt
/certbot_arm*.txt
certbot-dns*/snap

.isort.cfg Normal file (7 lines added)

@@ -0,0 +1,7 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400


@@ -24,6 +24,11 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
[MESSAGES CONTROL]
@@ -41,10 +46,14 @@ load-plugins=linter_plugin
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
[REPORTS]


@@ -1,312 +0,0 @@
language: python
dist: xenial
cache:
directories:
- $HOME/.cache/pip
before_script:
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
# On Travis, the fastest parallelization for integration tests has proved to be 4.
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
# Use Travis retry feature for farm tests since they are flaky
- 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
- export TOX_TESTENV_PASSENV=TRAVIS
# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds up turnaround time on review since there
# is a cap on the number of simultaneous runs.
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
not-on-master: &not-on-master
if: NOT (type = push AND branch = master)
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
matrix:
include:
# Main test suite
- python: "2.7"
env: ACME_SERVER=pebble TOXENV=integration
<<: *not-on-master
# This job is always executed, including on master
- python: "2.7"
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
- python: "2.7"
env: TOXENV=lint
<<: *not-on-master
- python: "3.4"
env: TOXENV=mypy
<<: *not-on-master
- python: "3.5"
env: TOXENV=mypy
<<: *not-on-master
- python: "2.7"
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
<<: *not-on-master
- python: "3.4"
env: TOXENV=py34
<<: *not-on-master
- python: "3.7"
env: TOXENV=py37
<<: *not-on-master
- python: "3.8"
env: TOXENV=py38
<<: *not-on-master
- sudo: required
env: TOXENV=apache_compat
services: docker
before_install:
addons:
<<: *not-on-master
- sudo: required
env: TOXENV=le_auto_xenial
services: docker
<<: *not-on-master
- python: "2.7"
env: TOXENV=apacheconftest-with-pebble
<<: *not-on-master
- python: "2.7"
env: TOXENV=nginxroundtrip
<<: *not-on-master
# Extended test suite on cron jobs and pushes to tested branches other than master
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-apache2
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-leauto-upgrades
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
git:
depth: false # This is needed to have the history to checkout old versions of certbot-auto.
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-certonly-standalone
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-sdists
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "3.7"
env: TOXENV=py37 CERTBOT_NO_PIN=1
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: TOXENV=py34
<<: *extended-test-suite
- python: "3.5"
env: TOXENV=py35
<<: *extended-test-suite
- python: "3.6"
env: TOXENV=py36
<<: *extended-test-suite
- python: "3.7"
env: TOXENV=py37
<<: *extended-test-suite
- python: "3.8-dev"
env: TOXENV=py38
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.8-dev"
env: ACME_SERVER=boulder-v1 TOXENV=integration
<<: *extended-test-suite
- python: "3.8-dev"
env: ACME_SERVER=boulder-v2 TOXENV=integration
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_jessie
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_centos6
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=docker_dev
services: docker
addons:
apt:
packages: # don't install nginx and apache
- libaugeas0
<<: *extended-test-suite
- language: generic
env: TOXENV=py27
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
- augeas
- python2
<<: *extended-test-suite
- language: generic
env: TOXENV=py3
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
- augeas
- python3
<<: *extended-test-suite
# container-based infrastructure
sudo: false
addons:
apt:
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- gcc
- libaugeas0
- libssl-dev
- libffi-dev
- ca-certificates
# For certbot-nginx integration testing
- nginx-light
- openssl
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: 'tools/pip_install.py -U codecov tox virtualenv'
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
# script command. It is set only to `travis_retry` during farm tests, in
# order to trigger the Travis retry feature and compensate for the inherent
# flakiness of these specific tests.
script: '$TRAVIS_RETRY tox'
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
notifications:
email: false
irc:
channels:
# This is set to a secure variable to prevent forks from sending
# notifications. This value was created by installing
# https://github.com/travis-ci/travis.rb and running
# `travis encrypt "chat.freenode.net#certbot-devel"`.
- secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
on_cancel: never
on_success: never
on_failure: always


@@ -21,6 +21,7 @@ Authors
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [April King](https://github.com/april)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
@@ -35,7 +36,9 @@ Authors
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/BKreisel)
* [Brian Heim](https://github.com/brianlheim)
* [Cameron Steel](https://github.com/Tugzrida)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
@@ -58,6 +61,7 @@ Authors
* [Daniel Albers](https://github.com/AID)
* [Daniel Aleksandersen](https://github.com/da2x)
* [Daniel Convissor](https://github.com/convissor)
* [Daniel "Drex" Drexler](https://github.com/aeturnum)
* [Daniel Huang](https://github.com/dhuang)
* [Dave Guarino](https://github.com/daguar)
* [David cz](https://github.com/dave-cz)
@@ -82,6 +86,7 @@ Authors
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -100,7 +105,9 @@ Authors
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Hugo van Kemenade](https://github.com/hugovk)
* [Ingolf Becker](https://github.com/watercrossing)
* [Ivan Nejgebauer](https://github.com/inejge)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
@@ -124,6 +131,7 @@ Authors
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh McCullough](https://github.com/JoshMcCullough)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
@@ -146,6 +154,7 @@ Authors
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Maarten](https://github.com/mrtndwrd)
* [Mads Jensen](https://github.com/atombrella)
* [Maikel Martens](https://github.com/krukas)
* [Malte Janduda](https://github.com/MalteJ)
* [Mantas Mikulėnas](https://github.com/grawity)
@@ -197,6 +206,7 @@ Authors
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
@@ -204,6 +214,7 @@ Authors
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Dailey](https://github.com/pahrohfit)
* [Robert Habermann](https://github.com/frennkie)
* [Robert Xiao](https://github.com/nneonneo)
* [Roland Shoemaker](https://github.com/rolandshoemaker)
@@ -229,9 +240,11 @@ Authors
* [Spencer Bliven](https://github.com/sbliven)
* [Stacey Sheldon](https://github.com/solidgoldbomb)
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Ștefan Talpalaru](https://github.com/stefantalpalaru)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [taixx046](https://github.com/taixx046)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
@@ -263,5 +276,6 @@ Authors
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)

File diff suppressed because it is too large

CHANGELOG.md Symbolic link (1 line added)

@@ -0,0 +1 @@
certbot/CHANGELOG.md


@@ -11,7 +11,7 @@ to the Sphinx generated docs is provided below.
[1] https://github.com/blog/1184-contributing-guidelines
[2] http://docutils.sourceforge.net/docs/user/rst/quickref.html#hyperlink-targets
[2] https://docutils.sourceforge.io/docs/user/rst/quickref.html#hyperlink-targets
-->


@@ -6,16 +6,15 @@ EXPOSE 80 443
WORKDIR /opt/certbot/src
# TODO: Install Apache/Nginx for plugin development.
COPY . .
RUN apt-get update && \
apt-get install apache2 git nginx-light -y && \
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv" python tools/venv.py
RUN VENV_NAME="../venv3" python3 tools/venv3.py
ENV PATH /opt/certbot/venv/bin:$PATH
ENV PATH /opt/certbot/venv3/bin:$PATH


@@ -1,131 +0,0 @@
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
Certbot is part of EFF's effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identity of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let's Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let's Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there's no need to arrange payment.
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you'll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-administrator-privileges>`_ to your web server to run Certbot.
Certbot is meant to be run directly on your web server, not on your personal computer. If you're using a hosted service and don't have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issued by Let's Encrypt.
Certbot is a fully-featured, extensible client for the Let's
Encrypt CA (or any other CA that speaks the `ACME
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
protocol) that can automate the tasks of obtaining certificates and
configuring webservers to use them. This client runs on Unix-based operating
systems.
To see the changes made to Certbot between versions please refer to our
`changelog <https://github.com/certbot/certbot/blob/master/CHANGELOG.md>`_.
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
depending on install method. Instructions on the Internet, and some pieces of the
software, may still refer to this older name.
Contributing
------------
If you'd like to contribute to this project please read `Developer Guide
<https://certbot.eff.org/docs/contributing.html>`_.
This project is governed by `EFF's Public Projects Code of Conduct <https://www.eff.org/pages/eppcode>`_.
.. _installation:
How to run the client
---------------------
The easiest way to install and run Certbot is by visiting `certbot.eff.org`_,
where you can find the correct instructions for many web server and OS
combinations. For more information, see `Get Certbot
<https://certbot.eff.org/docs/install.html>`_.
.. _certbot.eff.org: https://certbot.eff.org/
Understanding the client in more depth
--------------------------------------
To understand what the client is doing in detail, it's important to
understand the way it uses plugins. Please see the `explanation of
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
the User Guide.
Links
=====
.. Do not modify this comment unless you know what you're doing. tag:links-begin
Documentation: https://certbot.eff.org/docs
Software project: https://github.com/certbot/certbot
Notes for developers: https://certbot.eff.org/docs/contributing.html
Main Website: https://certbot.eff.org
Let's Encrypt Website: https://letsencrypt.org
Community: https://community.letsencrypt.org
ACME spec: http://ietf-wg-acme.github.io/acme/
ACME working area in github: https://github.com/ietf-wg-acme/acme
|build-status| |coverage| |docs| |container|
.. |build-status| image:: https://travis-ci.com/certbot/certbot.svg?branch=master
:target: https://travis-ci.com/certbot/certbot
:alt: Travis CI status
.. |coverage| image:: https://codecov.io/gh/certbot/certbot/branch/master/graph/badge.svg
:target: https://codecov.io/gh/certbot/certbot
:alt: Coverage status
.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
:target: https://readthedocs.org/projects/letsencrypt/
:alt: Documentation status
.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
:target: https://quay.io/repository/letsencrypt/letsencrypt
:alt: Docker Repository on Quay.io
.. Do not modify this comment unless you know what you're doing. tag:links-end
System Requirements
===================
See https://certbot.eff.org/docs/install.html#system-requirements.
.. Do not modify this comment unless you know what you're doing. tag:intro-end
.. Do not modify this comment unless you know what you're doing. tag:features-begin
Current Features
=====================
* Supports multiple web servers:
- apache/2.x
- nginx/0.8.48+
- webroot (adds files to webroot directories in order to prove control of
domains and obtain certs)
- standalone (runs its own simple webserver to prove you control a domain)
- other server software via `third party plugins <https://certbot.eff.org/docs/using.html#third-party-plugins>`_
* The private key is generated locally on your system.
* Can talk to the Let's Encrypt CA or optionally to other ACME
compliant services.
* Can get domain-validated (DV) certificates.
* Can revoke certificates.
* Adjustable RSA key bit-length (2048 (default), 4096, ...).
* Can optionally install an http -> https redirect, so your site effectively
runs https only (Apache only)
* Fully automated.
* Configuration changes are logged and can be reverted.
* Supports an interactive text UI, or can be driven entirely from the
command line.
* Free and Open Source Software, made with Python.
.. Do not modify this comment unless you know what you're doing. tag:features-end
For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.

README.rst Symbolic link (1 line added)

@@ -0,0 +1 @@
certbot/README.rst


@@ -3,4 +3,6 @@ include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include acme/testdata *
recursive-include tests *
global-exclude __pycache__
global-exclude *.py[cod]


@@ -13,7 +13,6 @@ import warnings
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose
for mod in list(sys.modules):


@@ -1,15 +1,22 @@
"""ACME Identifier Validation Challenges."""
import abc
import codecs
import functools
import hashlib
import logging
import socket
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
import requests
import six
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import crypto
from acme import crypto_util
from acme import errors
from acme import fields
from acme.mixins import ResourceMixin, TypeMixin
logger = logging.getLogger(__name__)
@@ -28,7 +35,7 @@ class Challenge(jose.TypedJSONObjectWithFields):
return UnrecognizedChallenge.from_json(jobj)
class ChallengeResponse(jose.TypedJSONObjectWithFields):
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge response."""
TYPES = {} # type: dict
@@ -54,8 +61,7 @@ class UnrecognizedChallenge(Challenge):
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self):
# pylint: disable=no-member
return self.jobj
return self.jobj # pylint: disable=no-member
@classmethod
def from_json(cls, jobj):
@@ -113,7 +119,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
:rtype: bool
"""
parts = self.key_authorization.split('.') # pylint: disable=no-member
parts = self.key_authorization.split('.')
if len(parts) != 2:
logger.debug("Key authorization (%r) is not well formed",
self.key_authorization)
@@ -304,7 +310,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
uri = chall.uri(domain)
logger.debug("Verifying %s at %s...", chall.typ, uri)
try:
http_response = requests.get(uri)
http_response = requests.get(uri, verify=False)
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
@@ -363,29 +369,163 @@ class HTTP01(KeyAuthorizationChallenge):
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME TLS-ALPN-01 challenge response.
This class only allows initiating a TLS-ALPN-01 challenge returned from the
CA. Full support for responding to TLS-ALPN-01 challenges by generating and
serving the expected response certificate is not currently provided.
"""
"""ACME tls-alpn-01 challenge response."""
typ = "tls-alpn-01"
PORT = 443
"""Verification port as defined by the protocol.
@Challenge.register
You can override it (e.g. for testing) by passing ``port`` to
`simple_verify`.
"""
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
ACME_TLS_1_PROTOCOL = "acme-tls/1"
@property
def h(self):
"""Hash value stored in challenge certificate"""
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
def gen_cert(self, domain, key=None, bits=2048):
"""Generate tls-alpn-01 certificate.
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey key: Optional private key used in
certificate generation. If not provided (``None``), then
fresh key will be generated.
:param int bits: Number of bits for newly generated key.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
if key is None:
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, bits)
der_value = b"DER:" + codecs.encode(self.h, 'hex')
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
critical=True, value=der_value)
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
extensions=[acme_extension]), key
def probe_cert(self, domain, host=None, port=None):
"""Probe tls-alpn-01 challenge certificate.
:param unicode domain: domain being validated, required.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
"""
if host is None:
host = socket.gethostbyname(domain)
logger.debug('%s resolved to %s', domain, host)
if port is None:
port = self.PORT
return crypto_util.probe_sni(host=host, port=port, name=domain,
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
def verify_cert(self, domain, cert):
"""Verify tls-alpn-01 challenge certificate.
:param unicode domain: Domain name being validated.
:param OpensSSL.crypto.X509 cert: Challenge certificate.
:returns: Whether the certificate was successfully verified.
:rtype: bool
"""
# pylint: disable=protected-access
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), names)
if len(names) != 1 or names[0].lower() != domain.lower():
return False
for i in range(cert.get_extension_count()):
ext = cert.get_extension(i)
# FIXME: assume this is the ACME extension. Currently there is no
# way to get full OID of an unknown extension from pyopenssl.
if ext.get_short_name() == b'UNDEF':
data = ext.get_data()
return data == self.h
return False
# pylint: disable=too-many-arguments
def simple_verify(self, chall, domain, account_public_key,
cert=None, host=None, port=None):
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
probe tls-alpn-01 certificate and check using `verify_cert`.
:param .challenges.TLSALPN01 chall: Corresponding challenge.
:param str domain: Domain name being validated.
:param JWK account_public_key:
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
provided (``None``) certificate will be retrieved using
`probe_cert`.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:returns: ``True`` if and only if client's control of the domain has been verified.
:rtype: bool
"""
if not self.verify(chall, account_public_key):
logger.debug("Verification of key authorization in response failed")
return False
if cert is None:
try:
cert = self.probe_cert(domain=domain, host=host, port=port)
except errors.Error as error:
logger.debug(str(error), exc_info=True)
return False
return self.verify_cert(domain, cert)
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge.
This class simply allows parsing the TLS-ALPN-01 challenge returned from
the CA. Full TLS-ALPN-01 support is not currently provided.
"""
typ = "tls-alpn-01"
"""ACME tls-alpn-01 challenge."""
response_cls = TLSALPN01Response
typ = response_cls.typ
def validation(self, account_key, **kwargs):
"""Generate validation for the challenge."""
raise NotImplementedError()
"""Generate validation.
:param JWK account_key:
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey cert_key: Optional private key used
in certificate generation. If not provided (``None``), then
fresh key will be generated.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
return self.response(account_key).gen_cert(
key=kwargs.get('cert_key'),
domain=kwargs.get('domain'))
@staticmethod
def is_supported():
"""
Check if TLS-ALPN-01 challenge is supported on this machine.
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
or a recent cryptography version shipped with the OpenSSL library is installed.
:returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
:rtype: bool
"""
return (hasattr(SSL.Connection, "set_alpn_protos")
and hasattr(SSL.Context, "set_alpn_select_callback"))
@Challenge.register

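For orientation, here is a minimal, hypothetical sketch of how the tls-alpn-01 support added above might be exercised. It assumes chall is a challenges.TLSALPN01 received from an ACME server, account_key is a josepy JWK, and serving the generated certificate on port 443 over the acme-tls/1 ALPN protocol is handled elsewhere:

from acme import challenges

def prepare_tls_alpn_01(chall, account_key, domain):
    # Assumption: chall is a challenges.TLSALPN01, account_key a josepy JWK.
    if not challenges.TLSALPN01.is_supported():
        raise RuntimeError("the local OpenSSL/pyOpenSSL lacks ALPN support")
    # validation() builds a self-signed certificate carrying the acmeIdentifier
    # extension derived from the key authorization (see gen_cert above).
    cert, key = chall.validation(account_key, domain=domain)
    response = chall.response(account_key)
    # Sanity-check our own answer locally before asking the CA to validate it.
    if not response.simple_verify(chall, domain, account_key.public_key(), cert=cert):
        raise RuntimeError("self-verification of the tls-alpn-01 response failed")
    return response, cert, key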

@@ -5,25 +5,28 @@ import datetime
from email.utils import parsedate_tz
import heapq
import logging
import time
import re
import sys
import time
import six
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import OpenSSL
import requests
from requests.adapters import HTTPAdapter
from requests.utils import parse_header_links
from requests_toolbelt.adapters.source import SourceAddressAdapter
import six
from six.moves import http_client
from acme import crypto_util
from acme import errors
from acme import jws
from acme import messages
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict, List, Set, Text
from acme.magic_typing import Dict
from acme.magic_typing import List
from acme.magic_typing import Set
from acme.magic_typing import Text
from acme.mixins import VersionedLEACMEMixin
logger = logging.getLogger(__name__)
@@ -33,10 +36,9 @@ logger = logging.getLogger(__name__)
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
if sys.version_info < (2, 7, 9): # pragma: no cover
try:
# pylint: disable=no-member
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
except AttributeError:
import urllib3.contrib.pyopenssl # pylint: disable=import-error
import urllib3.contrib.pyopenssl
urllib3.contrib.pyopenssl.inject_into_urllib3()
DEFAULT_NETWORK_TIMEOUT = 45
@@ -279,7 +281,6 @@ class Client(ClientBase):
assert response.status_code == http_client.CREATED
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
return self._regr_from_response(response)
def query_registration(self, regr):
@@ -447,7 +448,7 @@ class Client(ClientBase):
heapq.heapify(waiting)
# mapping between original Authorization Resource and the most
# recently updated one
updated = dict((authzr, authzr) for authzr in authzrs)
updated = {authzr: authzr for authzr in authzrs}
while waiting:
# find the smallest Retry-After, and sleep if necessary
@@ -464,7 +465,6 @@ class Client(ClientBase):
updated[authzr] = updated_authzr
attempts[authzr] += 1
# pylint: disable=no-member
if updated_authzr.body.status not in (
messages.STATUS_VALID, messages.STATUS_INVALID):
if attempts[authzr] < max_attempts:
@@ -605,7 +605,6 @@ class ClientV2(ClientBase):
if response.status_code == 200 and 'Location' in response.headers:
raise errors.ConflictError(response.headers.get('Location'))
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
regr = self._regr_from_response(response)
self.net.account = regr
return regr
@@ -669,7 +668,7 @@ class ClientV2(ClientBase):
response = self._post(self.directory['newOrder'], order)
body = messages.Order.from_json(response.json())
authorizations = []
for url in body.authorizations: # pylint: disable=not-an-iterable
for url in body.authorizations:
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
return messages.OrderResource(
body=body,
@@ -729,17 +728,19 @@ class ClientV2(ClientBase):
for authzr in responses:
if authzr.body.status != messages.STATUS_VALID:
for chall in authzr.body.challenges:
if chall.error != None:
if chall.error is not None:
failed.append(authzr)
if failed:
raise errors.ValidationError(failed)
return orderr.update(authorizations=responses)
def finalize_order(self, orderr, deadline):
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:param bool fetch_alternative_chains: whether to also fetch alternative
certificate chains
:returns: finalized order
:rtype: messages.OrderResource
@@ -756,8 +757,13 @@ class ClientV2(ClientBase):
if body.error is not None:
raise errors.IssuanceError(body.error)
if body.certificate is not None:
certificate_response = self._post_as_get(body.certificate).text
return orderr.update(body=body, fullchain_pem=certificate_response)
certificate_response = self._post_as_get(body.certificate)
orderr = orderr.update(body=body, fullchain_pem=certificate_response.text)
if fetch_alternative_chains:
alt_chains_urls = self._get_links(certificate_response, 'alternate')
alt_chains = [self._post_as_get(url).text for url in alt_chains_urls]
orderr = orderr.update(alternative_fullchains_pem=alt_chains)
return orderr
raise errors.TimeoutError()
def revoke(self, cert, rsn):
@@ -779,29 +785,27 @@ class ClientV2(ClientBase):
def _post_as_get(self, *args, **kwargs):
"""
Send GET request using the POST-as-GET protocol if needed.
The request will be first issued using POST-as-GET for ACME v2. If the ACME CA servers do
not support this yet and return an error, request will be retried using GET.
For ACME v1, only GET request will be tried, as POST-as-GET is not supported.
Send GET request using the POST-as-GET protocol.
:param args:
:param kwargs:
:return:
"""
if self.acme_version >= 2:
# We add an empty payload for POST-as-GET requests
new_args = args[:1] + (None,) + args[1:]
try:
return self._post(*new_args, **kwargs)
except messages.Error as error:
if error.code == 'malformed':
logger.debug('Error during a POST-as-GET request, '
'your ACME CA server may not support it:\n%s', error)
logger.debug('Retrying request with GET.')
else: # pragma: no cover
raise
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)
# If POST-as-GET is not supported yet, we use a GET instead.
return self.net.get(*args, **kwargs)
def _get_links(self, response, relation_type):
"""
Retrieves all Link URIs of relation_type from the response.
:param requests.Response response: The requests HTTP response.
:param str relation_type: The relation type to filter by.
"""
# Can't use response.links directly because it drops multiple links
# of the same relation type, which is possible in RFC8555 responses.
if 'Link' not in response.headers:
return []
links = parse_header_links(response.headers['Link'])
return [l['url'] for l in links
if 'rel' in l and 'url' in l and l['rel'] == relation_type]
class BackwardsCompatibleClientV2(object):
@@ -881,11 +885,13 @@ class BackwardsCompatibleClientV2(object):
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
return self.client.new_order(csr_pem)
def finalize_order(self, orderr, deadline):
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:param bool fetch_alternative_chains: whether to also fetch alternative
certificate chains
:returns: finalized order
:rtype: messages.OrderResource
@@ -916,7 +922,7 @@ class BackwardsCompatibleClientV2(object):
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
return orderr.update(fullchain_pem=(cert + chain))
return self.client.finalize_order(orderr, deadline)
return self.client.finalize_order(orderr, deadline, fetch_alternative_chains)
def revoke(self, cert, rsn):
"""Revoke certificate.
@@ -961,7 +967,7 @@ class ClientNetwork(object):
:param messages.RegistrationResource account: Account object. Required if you are
planning to use .post() with acme_version=2 for anything other than
creating a new account; may be set later after registering.
:param josepy.JWASignature alg: Algoritm to use in signing JWS.
:param josepy.JWASignature alg: Algorithm to use in signing JWS.
:param bool verify_ssl: Whether to verify certificates on SSL connections.
:param str user_agent: String to send as User-Agent header.
:param float timeout: Timeout for requests.
@@ -1006,6 +1012,8 @@ class ClientNetwork(object):
:rtype: `josepy.JWS`
"""
if isinstance(obj, VersionedLEACMEMixin):
obj.le_acme_version = acme_version
jobj = obj.json_dumps(indent=2).encode() if obj else b''
logger.debug('JWS payload:\n%s', jobj)
kwargs = {
@@ -1041,6 +1049,9 @@ class ClientNetwork(object):
"""
response_ct = response.headers.get('Content-Type')
# Strip parameters from the media-type (rfc2616#section-3.7)
if response_ct:
response_ct = response_ct.split(';')[0].strip()
try:
# TODO: response.json() is called twice, once here, and
# once in _get and _post clients
@@ -1124,10 +1135,9 @@ class ClientNetwork(object):
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
m = re.match(err_regex, str(e))
if m is None:
raise # pragma: no cover
else:
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
raise # pragma: no cover
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
# If content is DER, log the base64 of it instead of raw bytes, to keep
# binary data out of the logs.
@@ -1137,8 +1147,8 @@ class ClientNetwork(object):
debug_content = response.content.decode("utf-8")
logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
response.status_code,
"\n".join(["{0}: {1}".format(k, v)
for k, v in response.headers.items()]),
"\n".join("{0}: {1}".format(k, v)
for k, v in response.headers.items()),
debug_content)
return response
@@ -1193,8 +1203,7 @@ class ClientNetwork(object):
if error.code == 'badNonce':
logger.debug('Retrying request after error:\n%s', error)
return self._post_once(*args, **kwargs)
else:
raise
raise
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
acme_version=1, **kwargs):

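As a rough usage sketch (not part of the diff), the new fetch_alternative_chains flag on finalize_order() could be used as follows; client is assumed to be an acme.client.ClientV2 and orderr an order whose authorizations are already valid:

import datetime

def finalize_with_alternatives(client, orderr, timeout=90):
    # Assumption: client is acme.client.ClientV2, orderr a messages.OrderResource.
    deadline = datetime.datetime.now() + datetime.timedelta(seconds=timeout)
    finalized = client.finalize_order(orderr, deadline, fetch_alternative_chains=True)
    # The default chain plus one PEM bundle per alternate chain offered by the CA.
    return [finalized.fullchain_pem] + list(finalized.alternative_fullchains_pem)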

@@ -6,15 +6,14 @@ import os
import re
import socket
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from acme import errors
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Callable, Union, Tuple, Optional
# pylint: enable=unused-import, no-name-in-module
from acme.magic_typing import Callable
from acme.magic_typing import Tuple
from acme.magic_typing import Union
logger = logging.getLogger(__name__)
@@ -28,19 +27,41 @@ logger = logging.getLogger(__name__)
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
class SSLSocket(object):
class _DefaultCertSelection(object):
def __init__(self, certs):
self.certs = certs
def __call__(self, connection):
server_name = connection.get_servername()
return self.certs.get(server_name, None)
class SSLSocket(object): # pylint: disable=too-few-public-methods
"""SSL wrapper for sockets.
:ivar socket sock: Original wrapped socket.
:ivar dict certs: Mapping from domain names (`bytes`) to
`OpenSSL.crypto.X509`.
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
:ivar alpn_selection: Hook to select negotiated ALPN protocol for
connection.
:ivar cert_selection: Hook to select certificate for connection. If given,
`certs` parameter would be ignored, and therefore must be empty.
"""
def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
def __init__(self, sock, certs=None,
method=_DEFAULT_SSL_METHOD, alpn_selection=None,
cert_selection=None):
self.sock = sock
self.certs = certs
self.alpn_selection = alpn_selection
self.method = method
if not cert_selection and not certs:
raise ValueError("Neither cert_selection or certs specified.")
if cert_selection and certs:
raise ValueError("Both cert_selection and certs specified.")
if cert_selection is None:
cert_selection = _DefaultCertSelection(certs)
self.cert_selection = cert_selection
def __getattr__(self, name):
return getattr(self.sock, name)
@@ -57,24 +78,25 @@ class SSLSocket(object):
:type connection: :class:`OpenSSL.Connection`
"""
server_name = connection.get_servername()
try:
key, cert = self.certs[server_name]
except KeyError:
logger.debug("Server name (%s) not recognized, dropping SSL",
server_name)
pair = self.cert_selection(connection)
if pair is None:
logger.debug("Certificate selection for server name %s failed, dropping SSL",
connection.get_servername())
return
key, cert = pair
new_context = SSL.Context(self.method)
new_context.set_options(SSL.OP_NO_SSLv2)
new_context.set_options(SSL.OP_NO_SSLv3)
new_context.use_privatekey(key)
new_context.use_certificate(cert)
if self.alpn_selection is not None:
new_context.set_alpn_select_callback(self.alpn_selection)
connection.set_context(new_context)
class FakeConnection(object):
"""Fake OpenSSL.SSL.Connection."""
# pylint: disable=missing-docstring
# pylint: disable=missing-function-docstring
def __init__(self, connection):
self._wrapped = connection
@@ -86,13 +108,15 @@ class SSLSocket(object):
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
return self._wrapped.shutdown()
def accept(self): # pylint: disable=missing-docstring
def accept(self): # pylint: disable=missing-function-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
@@ -108,8 +132,9 @@ class SSLSocket(object):
return ssl_sock, addr
def probe_sni(name, host, port=443, timeout=300,
method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-arguments
method=_DEFAULT_SSL_METHOD, source_address=('', 0),
alpn_protocols=None):
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -121,6 +146,8 @@ def probe_sni(name, host, port=443, timeout=300,
:param tuple source_address: Enables multi-path probing (selection
of source interface). See `socket.creation_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `list` of `bytes`
:raises acme.errors.Error: In case of any problems.
@@ -150,6 +177,8 @@ def probe_sni(name, host, port=443, timeout=300,
client_ssl = SSL.Connection(context, client)
client_ssl.set_connect_state()
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
if alpn_protocols is not None:
client_ssl.set_alpn_protos(alpn_protocols)
try:
client_ssl.do_handshake()
client_ssl.shutdown()
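As a usage sketch, the new `alpn_protocols` argument lets the probe request `acme-tls/1`; the host and port below are placeholders and assume a TLS responder is actually listening there.
from acme import crypto_util

cert = crypto_util.probe_sni(
    name=b'example.com', host='127.0.0.1', port=5001, timeout=5,
    alpn_protocols=[b'acme-tls/1'])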
@@ -240,12 +269,14 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
def gen_ss_cert(key, domains, not_before=None,
validity=(7 * 24 * 60 * 60), force_san=True):
validity=(7 * 24 * 60 * 60), force_san=True, extensions=None):
"""Generate new self-signed certificate.
:type domains: `list` of `unicode`
:param OpenSSL.crypto.PKey key:
:param bool force_san:
:param extensions: List of additional extensions to include in the cert.
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
If more than one domain is provided, all of the domains are put into the
``subjectAltName`` X.509 extension and the first domain is set as the
@@ -258,10 +289,13 @@ def gen_ss_cert(key, domains, not_before=None,
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
extensions = [
if extensions is None:
extensions = []
extensions.append(
crypto.X509Extension(
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
]
)
cert.get_subject().CN = domains[0]
# TODO: what to put into cert.get_subject()?
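A hedged sketch of the new `extensions` parameter; the keyUsage extension below is only an example of an extra `X509Extension`, which is appended after the default basicConstraints extension built by `gen_ss_cert`.
from OpenSSL import crypto

from acme import crypto_util

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
# Illustrative extra extension added on top of the defaults.
extra = [crypto.X509Extension(b'keyUsage', False, b'digitalSignature')]
cert = crypto_util.gen_ss_cert(key, [u'example.com'], extensions=extra)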
@@ -298,7 +332,6 @@ def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
def _dump_cert(cert):
if isinstance(cert, jose.ComparableX509):
# pylint: disable=protected-access
cert = cert.wrapped
return crypto.dump_certificate(filetype, cert)

View File

@@ -29,7 +29,12 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce, error, *args, **kwargs):
super(BadNonce, self).__init__(*args, **kwargs)
# MyPy complains here that there are too many arguments for the BaseException constructor.
# This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
# The fix is included in MyPy>=0.740, but upgrading would bring dozens of errors due to
# new type definitions. So we ignore the error until the code base is updated to match
# the MyPy>=0.740 type definitions.
super(BadNonce, self).__init__(*args, **kwargs) # type: ignore
self.nonce = nonce
self.error = error
@@ -48,7 +53,8 @@ class MissingNonce(NonceError):
"""
def __init__(self, response, *args, **kwargs):
super(MissingNonce, self).__init__(*args, **kwargs)
# See comment in BadNonce constructor above for an explanation of type: ignore here.
super(MissingNonce, self).__init__(*args, **kwargs) # type: ignore
self.response = response
def __str__(self):
@@ -83,6 +89,7 @@ class PollError(ClientError):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
@@ -91,9 +98,11 @@ class ValidationError(Error):
self.failed_authzrs = failed_authzrs
super(ValidationError, self).__init__()
class TimeoutError(Error):
class TimeoutError(Error): # pylint: disable=redefined-builtin
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
@@ -105,6 +114,7 @@ class IssuanceError(Error):
self.error = error
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.

View File

@@ -4,7 +4,6 @@ import logging
import josepy as jose
import pyrfc3339
logger = logging.getLogger(__name__)

View File

@@ -15,7 +15,7 @@ class Header(jose.Header):
url = jose.Field('url', omitempty=True)
@nonce.decoder
def nonce(value): # pylint: disable=missing-docstring,no-self-argument
def nonce(value): # pylint: disable=no-self-argument,missing-function-docstring
try:
return jose.decode_b64jose(value)
except jose.DeserializationError as error:
@@ -40,7 +40,7 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
__slots__ = jose.JWS._orig_slots
@classmethod
# pylint: disable=arguments-differ

View File

@@ -1,6 +1,7 @@
"""Shim class to not have to depend on typing module in prod."""
import sys
class TypingClass(object):
"""Ignore import errors by getting anything"""
def __getattr__(self, name):
@@ -9,8 +10,6 @@ class TypingClass(object):
try:
# mypy doesn't respect modifying sys.modules
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
# pylint: disable=unused-import
from typing import Collection, IO # type: ignore
# pylint: enable=unused-import
except ImportError:
sys.modules[__name__] = TypingClass()

View File

@@ -1,18 +1,22 @@
"""ACME protocol messages."""
import json
import six
try:
from collections.abc import Hashable # pylint: disable=no-name-in-module
except ImportError: # pragma: no cover
from collections import Hashable
import josepy as jose
import six
from acme import challenges
from acme import errors
from acme import fields
from acme import util
from acme import jws
from acme import util
from acme.mixins import ResourceMixin
try:
from collections.abc import Hashable
except ImportError: # pragma: no cover
from collections import Hashable
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
@@ -33,7 +37,7 @@ ERROR_CODES = {
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
'incorrectResponse': 'Response recieved didn\'t match the challenge\'s requirements',
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
@@ -143,7 +147,7 @@ class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj] # pylint: disable=unsubscriptable-object
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, self.name)
@@ -202,7 +206,7 @@ class Directory(jose.JSONDeSerializable):
external_account_required = jose.Field('externalAccountRequired', omitempty=True)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super(Directory.Meta, self).__init__(**kwargs)
@property
@@ -242,13 +246,13 @@ class Directory(jose.JSONDeSerializable):
try:
return self[name.replace('_', '-')]
except KeyError as error:
raise AttributeError(str(error) + ': ' + name)
raise AttributeError(str(error))
def __getitem__(self, name):
try:
return self._jobj[self._canon_key(name)]
except KeyError:
raise KeyError('Directory field not found')
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
def to_partial_json(self):
return self._jobj
@@ -311,6 +315,9 @@ class Registration(ResourceBody):
# on new-reg key server ignores 'key' and populates it based on
# JWS.signature.combined.jwk
key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
# Contact field implements special behavior to allow messages that clear existing
# contacts while not expecting the `contact` field when loading from json.
# This is implemented in the constructor and *_json methods.
contact = jose.Field('contact', omitempty=True, default=())
agreement = jose.Field('agreement', omitempty=True)
status = jose.Field('status', omitempty=True)
@@ -323,24 +330,73 @@ class Registration(ResourceBody):
@classmethod
def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
"""Create registration resource from contact details."""
"""
Create registration resource from contact details.
The `contact` keyword being passed to a Registration object is meaningful, so
this function represents empty iterables in its kwargs by passing on an empty
`tuple`.
"""
# Note if `contact` was in kwargs.
contact_provided = 'contact' in kwargs
# Pop `contact` from kwargs and add formatted email or phone numbers
details = list(kwargs.pop('contact', ()))
if phone is not None:
details.append(cls.phone_prefix + phone)
if email is not None:
details.extend([cls.email_prefix + mail for mail in email.split(',')])
kwargs['contact'] = tuple(details)
# Insert formatted contact information back into kwargs
# or insert an empty tuple if `contact` provided.
if details or contact_provided:
kwargs['contact'] = tuple(details)
if external_account_binding:
kwargs['external_account_binding'] = external_account_binding
return cls(**kwargs)
def __init__(self, **kwargs):
"""Note if the user provides a value for the `contact` member."""
if 'contact' in kwargs:
# Avoid the __setattr__ used by jose.TypedJSONObjectWithFields
object.__setattr__(self, '_add_contact', True)
super(Registration, self).__init__(**kwargs)
def _filter_contact(self, prefix):
return tuple(
detail[len(prefix):] for detail in self.contact # pylint: disable=not-an-iterable
if detail.startswith(prefix))
def _add_contact_if_appropriate(self, jobj):
"""
The `contact` member of Registration objects should not be required when
de-serializing (as it would be if the Fields' `omitempty` flag were `False`), but
it should be included in serializations if it was provided.
:param jobj: Dictionary containing this Registration's data
:type jobj: dict
:returns: Dictionary containing the Registration's data to transmit to the server
:rtype: dict
"""
if getattr(self, '_add_contact', False):
jobj['contact'] = self.encode('contact')
return jobj
def to_partial_json(self):
"""Modify josepy.JSONDeserializable.to_partial_json()"""
jobj = super(Registration, self).to_partial_json()
return self._add_contact_if_appropriate(jobj)
def fields_to_partial_json(self):
"""Modify josepy.JSONObjectWithFields.fields_to_partial_json()"""
jobj = super(Registration, self).fields_to_partial_json()
return self._add_contact_if_appropriate(jobj)
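A brief sketch of the resulting behavior, with illustrative contact values: the serialized registration carries a `contact` field only when contact details were supplied or explicitly cleared with an empty tuple.
from acme import messages

keep = messages.Registration.from_data(email='admin@example.com')
clear = messages.Registration.from_data(contact=())
plain = messages.Registration.from_data()
assert 'contact' in keep.to_partial_json()
assert 'contact' in clear.to_partial_json()
assert 'contact' not in plain.to_partial_json()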
@property
def phones(self):
"""All phones found in the ``contact`` field."""
@@ -353,13 +409,13 @@ class Registration(ResourceBody):
@Directory.register
class NewRegistration(Registration):
class NewRegistration(ResourceMixin, Registration):
"""New registration."""
resource_type = 'new-reg'
resource = fields.Resource(resource_type)
class UpdateRegistration(Registration):
class UpdateRegistration(ResourceMixin, Registration):
"""Update registration."""
resource_type = 'reg'
resource = fields.Resource(resource_type)
@@ -409,7 +465,7 @@ class ChallengeBody(ResourceBody):
omitempty=True, default=None)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
kwargs = {self._internal_name(k): v for k, v in kwargs.items()}
super(ChallengeBody, self).__init__(**kwargs)
def encode(self, name):
@@ -457,7 +513,6 @@ class ChallengeResource(Resource):
@property
def uri(self):
"""The URL of the challenge body."""
# pylint: disable=function-redefined,no-member
return self.body.uri
@@ -485,7 +540,7 @@ class Authorization(ResourceBody):
wildcard = jose.Field('wildcard', omitempty=True)
@challenges.decoder
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
def challenges(value): # pylint: disable=no-self-argument,missing-function-docstring
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
@@ -496,13 +551,13 @@ class Authorization(ResourceBody):
@Directory.register
class NewAuthorization(Authorization):
class NewAuthorization(ResourceMixin, Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource = fields.Resource(resource_type)
class UpdateAuthorization(Authorization):
class UpdateAuthorization(ResourceMixin, Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
@@ -520,7 +575,7 @@ class AuthorizationResource(ResourceWithURI):
@Directory.register
class CertificateRequest(jose.JSONObjectWithFields):
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar josepy.util.ComparableX509 csr:
@@ -546,7 +601,7 @@ class CertificateResource(ResourceWithURI):
@Directory.register
class Revocation(jose.JSONObjectWithFields):
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
"""Revocation message.
:ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
@@ -563,9 +618,11 @@ class Revocation(jose.JSONObjectWithFields):
class Order(ResourceBody):
"""Order Resource Body.
:ivar list of .Identifier: List of identifiers for the certificate.
:ivar identifiers: List of identifiers for the certificate.
:vartype identifiers: `list` of `.Identifier`
:ivar acme.messages.Status status:
:ivar list of str authorizations: URLs of authorizations.
:ivar authorizations: URLs of authorizations.
:vartype authorizations: `list` of `str`
:ivar str certificate: URL to download certificate as a fullchain PEM.
:ivar str finalize: URL to POST to in order to request issuance once all
authorizations have "valid" status.
@@ -582,7 +639,7 @@ class Order(ResourceBody):
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
@identifiers.decoder
def identifiers(value): # pylint: disable=missing-docstring,no-self-argument
def identifiers(value): # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
@@ -590,15 +647,20 @@ class OrderResource(ResourceWithURI):
:ivar acme.messages.Order body:
:ivar str csr_pem: The CSR this Order will be finalized with.
:ivar list of acme.messages.AuthorizationResource authorizations:
Fully-fetched AuthorizationResource objects.
:ivar authorizations: Fully-fetched AuthorizationResource objects.
:vartype authorizations: `list` of `acme.messages.AuthorizationResource`
:ivar str fullchain_pem: The fetched contents of the certificate URL
produced once the order was finalized, if it's present.
:ivar alternative_fullchains_pem: The fetched contents of alternative certificate
chain URLs produced once the order was finalized, if present and requested during
finalization.
:vartype alternative_fullchains_pem: `list` of `str`
"""
body = jose.Field('body', decoder=Order.from_json)
csr_pem = jose.Field('csr_pem', omitempty=True)
authorizations = jose.Field('authorizations')
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
alternative_fullchains_pem = jose.Field('alternative_fullchains_pem', omitempty=True)
@Directory.register
class NewOrder(Order):

65
acme/acme/mixins.py Normal file
View File

@@ -0,0 +1,65 @@
"""Useful mixins for Challenge and Resource objects"""
class VersionedLEACMEMixin(object):
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self):
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version):
# We need to use object.__setattr__ to not depend on the specific implementation of
# __setattr__ in the current class (e.g. jose.TypedJSONObjectWithFields raises
# AttributeError for any attempt to set an attribute, to make objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key, value):
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
else:
super(VersionedLEACMEMixin, self).__setattr__(key, value) # pragma: no cover
class ResourceMixin(VersionedLEACMEMixin):
"""
This mixin generates an RFC 8555 compliant JWS payload
by removing the `resource` field if needed (e.g. for the ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'to_partial_json', 'resource')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'fields_to_partial_json', 'resource')
class TypeMixin(VersionedLEACMEMixin):
"""
This mixin allows generation of an RFC 8555 compliant JWS payload
by removing the `type` field if needed (e.g. for the ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'to_partial_json', 'type')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance, jobj_method, uncompliant_field):
if hasattr(instance, jobj_method):
jobj = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
raise AttributeError('Method {0}() is not implemented.'.format(jobj_method)) # pragma: no cover
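A short sketch of the mixins in action, mirroring the RFC 8555 compliance tests added later in this changeset: once `le_acme_version` is set to 2, the legacy `resource` field is stripped from the JWS payload.
from acme import messages

msg = messages.NewAuthorization()  # NewAuthorization mixes in ResourceMixin
msg.le_acme_version = 2
assert msg.json_dumps() == '{}'  # empty payload, as RFC 8555 requires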

View File

@@ -5,20 +5,16 @@ import logging
import socket
import threading
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from six.moves import BaseHTTPServer # type: ignore
from six.moves import http_client
from six.moves import socketserver # type: ignore
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List
logger = logging.getLogger(__name__)
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
# pylint: disable=no-init
class TLSServer(socketserver.TCPServer):
"""Generic TLS Server."""
@@ -31,21 +27,27 @@ class TLSServer(socketserver.TCPServer):
self.address_family = socket.AF_INET
self.certs = kwargs.pop("certs", {})
self.method = kwargs.pop(
# pylint: disable=protected-access
"method", crypto_util._DEFAULT_SSL_METHOD)
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
socketserver.TCPServer.__init__(self, *args, **kwargs)
def _wrap_sock(self):
self.socket = crypto_util.SSLSocket(
self.socket, certs=self.certs, method=self.method)
self.socket, cert_selection=self._cert_selection,
alpn_selection=getattr(self, '_alpn_selection', None),
method=self.method)
def server_bind(self): # pylint: disable=missing-docstring
def _cert_selection(self, connection): # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
return self.certs.get(server_name, None)
def server_bind(self):
self._wrap_sock()
return socketserver.TCPServer.server_bind(self)
class ACMEServerMixin: # pylint: disable=old-style-class
class ACMEServerMixin:
"""ACME server common settings mixin."""
# TODO: c.f. #858
server_version = "ACME client standalone challenge solver"
@@ -106,7 +108,6 @@ class BaseDualNetworkedServers(object):
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
# pylint: disable=no-member
target=server.serve_forever)
thread.start()
self.threads.append(thread)
@@ -126,6 +127,40 @@ class BaseDualNetworkedServers(object):
self.threads = []
class TLSALPN01Server(TLSServer, ACMEServerMixin):
"""TLSALPN01 Server."""
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
def __init__(self, server_address, certs, challenge_certs, ipv6=False):
TLSServer.__init__(
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
def _cert_selection(self, connection):
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
# negotiation is done after cert selection.
# Therefore, currently we always return challenge cert, and terminate
# handshake in alpn_selection() if ALPN protos are not what we expect.
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs.get(server_name, None)
def _alpn_selection(self, _connection, alpn_protos):
"""Callback to select alpn protocol."""
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
return self.ACME_TLS_1_PROTOCOL
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
# Explicitly close the connection now, by returning an empty string.
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
return b""
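A minimal construction sketch with hypothetical (empty) certificate maps; as in the tests further below, both maps key on the SNI server name as bytes and map to (key, cert) pairs.
from acme import standalone

server = standalone.TLSALPN01Server(('localhost', 0), certs={},
                                    challenge_certs={})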
class HTTPServer(BaseHTTPServer.HTTPServer):
"""Generic HTTP Server."""
@@ -141,10 +176,10 @@ class HTTPServer(BaseHTTPServer.HTTPServer):
class HTTP01Server(HTTPServer, ACMEServerMixin):
"""HTTP01 Server."""
def __init__(self, server_address, resources, ipv6=False):
def __init__(self, server_address, resources, ipv6=False, timeout=30):
HTTPServer.__init__(
self, server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources), ipv6=ipv6)
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
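For example (mirroring the new test_timely_shutdown test added below), the timeout can be lowered so idle keep-alive connections do not delay server shutdown.
from acme import standalone

server = standalone.HTTP01Server(('', 0), resources=set(), timeout=5)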
class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
@@ -169,6 +204,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
def __init__(self, *args, **kwargs):
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
self.timeout = kwargs.pop('timeout', 30)
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
def log_message(self, format, *args): # pylint: disable=redefined-builtin
@@ -180,7 +216,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.log_message("Incoming request")
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
def do_GET(self): # pylint: disable=invalid-name,missing-docstring
def do_GET(self): # pylint: disable=invalid-name,missing-function-docstring
if self.path == "/":
self.handle_index()
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
@@ -218,7 +254,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.path)
@classmethod
def partial_init(cls, simple_http_resources):
def partial_init(cls, simple_http_resources, timeout):
"""Partially initialize this handler.
This is useful because `socketserver.BaseServer` takes
@@ -227,4 +263,18 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return functools.partial(
cls, simple_http_resources=simple_http_resources)
cls, simple_http_resources=simple_http_resources,
timeout=timeout)
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)

View File

@@ -4,4 +4,4 @@ import six
def map_keys(dikt, func):
"""Map dictionary keys."""
return dict((func(key), value) for key, value in six.iteritems(dikt))
return {func(key): value for key, value in six.iteritems(dikt)}
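A quick usage sketch: keys are transformed by `func`, values are left untouched.
from acme import util

assert util.map_keys({'a': 1, 'b': 2}, str.upper) == {'A': 1, 'B': 2}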

View File

@@ -9,7 +9,7 @@ BUILDDIR = _build
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from https://www.sphinx-doc.org/)
endif
# Internal variables.

View File

@@ -12,10 +12,8 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
import shlex
import sys
here = os.path.abspath(os.path.dirname(__file__))
@@ -42,7 +40,7 @@ extensions = [
]
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance', 'private-members']
autodoc_default_flags = ['show-inheritance']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -114,7 +112,7 @@ pygments_style = 'sphinx'
#keep_warnings = False
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
todo_include_todos = False
# -- Options for HTML output ----------------------------------------------
@@ -122,7 +120,7 @@ todo_include_todos = True
# The theme to use for HTML and HTML Help pages. See the documentation for
# a list of builtin themes.
# http://docs.readthedocs.org/en/latest/theme.html#how-do-i-use-this-locally-and-on-read-the-docs
# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd: # only import and set the theme if we're building docs locally

View File

@@ -65,7 +65,7 @@ if errorlevel 9009 (
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
echo.https://www.sphinx-doc.org/
exit /b 1
)

View File

@@ -26,8 +26,10 @@ Workflow:
- Deactivate Account
"""
from contextlib import contextmanager
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import challenges
@@ -36,7 +38,6 @@ from acme import crypto_util
from acme import errors
from acme import messages
from acme import standalone
import josepy as jose
# Constants:

View File

@@ -1,10 +1,10 @@
# readthedocs.org gives no way to change the install command to "pip
# install -e .[docs]" (that would in turn install documentation
# install -e acme[docs]" (that would in turn install documentation
# dependencies), but it allows to specify a requirements.txt file at
# https://readthedocs.org/dashboard/letsencrypt/advanced/ (c.f. #259)
# Although ReadTheDocs certainly doesn't need to install the project
# in --editable mode (-e), just "pip install .[docs]" does not work as
# expected and "pip install -e .[docs]" must be used instead
# in --editable mode (-e), just "pip install acme[docs]" does not work as
# expected and "pip install -e acme[docs]" must be used instead
-e acme[docs]

View File

@@ -1,9 +1,11 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
from distutils.version import LooseVersion
import sys
version = '1.0.0.dev0'
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
version = '1.10.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
@@ -14,9 +16,8 @@ install_requires = [
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
# Connection.set_tlsext_host_name (>=0.13)
'mock',
'PyOpenSSL>=0.13.1',
# Connection.set_tlsext_host_name (>=0.13) + matching Xenial requirements (>=0.15.1)
'PyOpenSSL>=0.15.1',
'pyrfc3339',
'pytz',
'requests[security]>=2.6.0', # security extras added in 2.4.1
@@ -25,6 +26,15 @@ install_requires = [
'six>=1.9.0', # needed for python_2_unicode_compatible
]
setuptools_known_environment_markers = (LooseVersion(setuptools_version) >= LooseVersion('36.2'))
if setuptools_known_environment_markers:
install_requires.append('mock ; python_version < "3.3"')
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Error, you are trying to build certbot wheels using an old version '
'of setuptools. Version 36.2+ of setuptools is required.')
elif sys.version_info < (3,3):
install_requires.append('mock')
dev_extras = [
'pytest',
'pytest-xdist',
@@ -36,22 +46,6 @@ docs_extras = [
'sphinx_rtd_theme',
]
class PyTest(TestCommand):
user_options = []
def initialize_options(self):
TestCommand.initialize_options(self)
self.pytest_args = ''
def run_tests(self):
import shlex
# import here, cause outside the eggs aren't loaded
import pytest
errno = pytest.main(shlex.split(self.pytest_args))
sys.exit(errno)
setup(
name='acme',
version=version,
@@ -60,7 +54,7 @@ setup(
author="Certbot Project",
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*, !=3.5.*',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
@@ -69,8 +63,6 @@ setup(
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
'Programming Language :: Python :: 3.8',
@@ -85,7 +77,4 @@ setup(
'dev': dev_extras,
'docs': docs_extras,
},
test_suite='acme',
tests_require=["pytest"],
cmdclass={"test": PyTest},
)

View File

@@ -2,12 +2,17 @@
import unittest
import josepy as jose
import mock
import OpenSSL
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves.urllib import parse as urllib_parse
from six.moves.urllib import parse as urllib_parse # pylint: disable=relative-import
from acme import errors
from acme import test_util
import test_util
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
@@ -19,7 +24,6 @@ class ChallengeTest(unittest.TestCase):
from acme.challenges import Challenge
from acme.challenges import UnrecognizedChallenge
chall = UnrecognizedChallenge({"type": "foo"})
# pylint: disable=no-member
self.assertEqual(chall, Challenge.from_json(chall.jobj))
@@ -183,7 +187,7 @@ class HTTP01ResponseTest(unittest.TestCase):
mock_get.return_value = mock.MagicMock(text=validation)
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_bad_validation(self, mock_get):
@@ -199,7 +203,7 @@ class HTTP01ResponseTest(unittest.TestCase):
HTTP01Response.WHITESPACE_CUTSET))
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_connection_error(self, mock_get):
@@ -258,30 +262,87 @@ class HTTP01Test(unittest.TestCase):
class TLSALPN01ResponseTest(unittest.TestCase):
def setUp(self):
from acme.challenges import TLSALPN01Response
self.msg = TLSALPN01Response(key_authorization=u'foo')
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
self.domain = u'example.com'
self.domain2 = u'example2.com'
self.response = self.chall.response(KEY)
self.jmsg = {
'resource': 'challenge',
'type': 'tls-alpn-01',
'keyAuthorization': u'foo',
'keyAuthorization': self.response.key_authorization,
}
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(token=(b'x' * 16))
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))
self.assertEqual(self.response, TLSALPN01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01Response
hash(TLSALPN01Response.from_json(self.jmsg))
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert(self.domain)
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(self.domain,
test_util.load_cert('cert.pem')))
def test_verify_bad_domain(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertFalse(self.response.verify_cert(self.domain2, cert))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, self.domain, mock.sentinel.cert)
@mock.patch('acme.challenges.socket.gethostbyname')
@mock.patch('acme.challenges.crypto_util.probe_sni')
def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
mock_gethostbyname.return_value = '127.0.0.1'
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host='127.0.0.1', port=self.response.PORT, name='foo.com',
alpn_protocols=['acme-tls/1'])
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host='8.8.8.8', port=mock.ANY, name='foo.com',
alpn_protocols=['acme-tls/1'])
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
mock_probe_cert.side_effect = errors.Error
self.assertFalse(self.response.simple_verify(
self.chall, self.domain, KEY.public_key()))
class TLSALPN01Test(unittest.TestCase):
@@ -311,8 +372,13 @@ class TLSALPN01Test(unittest.TestCase):
self.assertRaises(
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
def test_validation(self):
self.assertRaises(NotImplementedError, self.msg.validation, KEY)
@mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
domain=mock.sentinel.domain)
class DNSTest(unittest.TestCase):
@@ -415,5 +481,18 @@ class DNSResponseTest(unittest.TestCase):
self.msg.check_validation(self.chall, KEY.public_key()))
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_challenge_payload(self):
from acme.challenges import HTTP01Response
challenge_body = HTTP01Response()
challenge_body.le_acme_version = 2
jobj = challenge_body.json_dumps(indent=2).encode()
# RFC8555 states that challenge responses must have an empty payload.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -5,21 +5,22 @@ import datetime
import json
import unittest
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import OpenSSL
import requests
from six.moves import http_client # pylint: disable=import-error
from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme import messages_test
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.mixins import VersionedLEACMEMixin
import messages_test
import test_util
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
@@ -63,7 +64,7 @@ class ClientTestBase(unittest.TestCase):
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
reg = messages.Registration(
contact=self.contact, key=KEY.public_key())
the_arg = dict(reg) # type: Dict
the_arg = dict(reg) # type: Dict
self.new_reg = messages.NewRegistration(**the_arg)
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
@@ -262,7 +263,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.finalize_order(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline, False)
def test_revoke(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
@@ -841,6 +842,32 @@ class ClientV2Test(ClientTestBase):
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_alt_chains(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/',
)
updated_orderr = self.orderr.update(body=updated_order,
fullchain_pem=CERT_SAN_PEM,
alternative_fullchains_pem=[CERT_SAN_PEM,
CERT_SAN_PEM])
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
self.response.headers['Link'] = '<https://example.com/acme/cert/1>;rel="alternate", ' + \
'<https://example.com/dir>;rel="index", ' + \
'<https://example.com/acme/cert/2>;title="foo";rel="alternate"'
deadline = datetime.datetime(9999, 9, 9)
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.net.post.assert_any_call('https://example.com/acme/cert/1',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.net.post.assert_any_call('https://example.com/acme/cert/2',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.assertEqual(resp, updated_orderr)
del self.response.headers['Link']
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.assertEqual(resp, updated_orderr.update(alternative_fullchains_pem=[]))
def test_revoke(self):
self.client.revoke(messages_test.CERT, self.rsn)
self.net.post.assert_called_once_with(
@@ -887,21 +914,8 @@ class ClientV2Test(ClientTestBase):
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
self.client.net.get.assert_not_called()
class FakeError(messages.Error):
"""Fake error to reproduce a malformed request ACME error"""
def __init__(self): # pylint: disable=super-init-not-called
pass
@property
def code(self):
return 'malformed'
self.client.net.post.side_effect = FakeError()
self.client.poll(self.authzr2) # pylint: disable=protected-access
self.client.net.get.assert_called_once_with(self.authzr2.uri)
class MockJSONDeSerializable(jose.JSONDeSerializable):
class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
@@ -965,8 +979,8 @@ class ClientNetworkTest(unittest.TestCase):
def test_check_response_not_ok_jobj_error(self):
self.response.ok = False
self.response.json.return_value = messages.Error(
detail='foo', typ='serverInternal', title='some title').to_json()
self.response.json.return_value = messages.Error.with_code(
'serverInternal', detail='foo', title='some title').to_json()
# pylint: disable=protected-access
self.assertRaises(
messages.Error, self.net._check_response, self.response)
@@ -991,10 +1005,39 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.side_effect = ValueError
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access,no-value-for-parameter
# pylint: disable=protected-access
self.assertEqual(
self.response, self.net._check_response(self.response))
@mock.patch('acme.client.logger')
def test_check_response_ok_ct_with_charset(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
try:
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'application/json; charset=utf-8'
)
except AssertionError:
return
raise AssertionError('Expected Content-Type warning ' #pragma: no cover
'to not have been logged')
@mock.patch('acme.client.logger')
def test_check_response_ok_bad_ct(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'text/plain'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'text/plain'
)
def test_check_response_conflict(self):
self.response.ok = False
self.response.status_code = 409
@@ -1005,7 +1048,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.return_value = {}
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access,no-value-for-parameter
# pylint: disable=protected-access
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1130,8 +1173,8 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.response.headers = {}
self.response.links = {}
self.response.checked = False
self.acmev1_nonce_response = mock.MagicMock(ok=False,
status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response = mock.MagicMock(
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response.headers = {}
self.obj = mock.MagicMock()
self.wrapped_obj = mock.MagicMock()
@@ -1299,7 +1342,7 @@ class ClientNetworkSourceAddressBindingTest(unittest.TestCase):
# test should fail if the default adapter type is changed by requests
net = ClientNetwork(key=None, alg=None)
session = requests.Session()
for scheme in session.adapters.keys():
for scheme in session.adapters:
client_network_adapter = net.session.adapters.get(scheme)
default_adapter = session.adapters.get(scheme)
self.assertEqual(client_network_adapter.__class__, default_adapter.__class__)

View File

@@ -5,21 +5,18 @@ import threading
import time
import unittest
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
import six
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import errors
from acme import test_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
import test_util
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
def setUp(self):
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
@@ -33,13 +30,13 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
def server_bind(self): # pylint: disable=missing-docstring
self.socket = SSLSocket(socket.socket(), certs=certs)
self.socket = SSLSocket(socket.socket(),
certs)
socketserver.TCPServer.server_bind(self)
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
self.server_thread = threading.Thread(
# pylint: disable=no-member
target=self.server.handle_request)
def tearDown(self):
@@ -66,7 +63,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
def test_probe_connection_error(self):
# pylint has a hard time with six
self.server.server_close() # pylint: disable=no-member
self.server.server_close()
original_timeout = socket.getdefaulttimeout()
try:
socket.setdefaulttimeout(1)
@@ -75,6 +72,18 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
socket.setdefaulttimeout(original_timeout)
class SSLSocketTest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket."""
def test_ssl_socket_invalid_arguments(self):
from acme.crypto_util import SSLSocket
with self.assertRaises(ValueError):
_ = SSLSocket(None, {'sni': ('key', 'cert')},
cert_selection=lambda _: None)
with self.assertRaises(ValueError):
_ = SSLSocket(None)
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""

View File

@@ -1,7 +1,10 @@
"""Tests for acme.errors."""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
class BadNonceTest(unittest.TestCase):
@@ -35,7 +38,7 @@ class PollErrorTest(unittest.TestCase):
def setUp(self):
from acme.errors import PollError
self.timeout = PollError(
exhausted=set([mock.sentinel.AR]),
exhausted={mock.sentinel.AR},
updated={})
self.invalid = PollError(exhausted=set(), updated={
mock.sentinel.AR: mock.sentinel.AR2})

View File

@@ -2,6 +2,7 @@
import importlib
import unittest
class JoseTest(unittest.TestCase):
"""Tests for acme.jose shim."""
@@ -20,11 +21,10 @@ class JoseTest(unittest.TestCase):
# We use the imports below with eval, but pylint doesn't
# understand that.
# pylint: disable=eval-used,unused-variable
import acme
import josepy
acme_jose_mod = eval(acme_jose_path)
josepy_mod = eval(josepy_path)
import acme # pylint: disable=unused-import
import josepy # pylint: disable=unused-import
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))

View File

@@ -3,8 +3,7 @@ import unittest
import josepy as jose
from acme import test_util
import test_util
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))

View File

@@ -2,7 +2,10 @@
import sys
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
class MagicTypingTest(unittest.TestCase):
@@ -18,7 +21,7 @@ class MagicTypingTest(unittest.TestCase):
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
from acme.magic_typing import Text
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
@@ -31,7 +34,7 @@ class MagicTypingTest(unittest.TestCase):
sys.modules['typing'] = None
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
from acme.magic_typing import Text
self.assertTrue(Text is None)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing

View File

@@ -2,12 +2,13 @@
import unittest
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from acme import challenges
from acme import test_util
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import test_util
CERT = test_util.load_comparable_cert('cert.der')
CSR = test_util.load_comparable_csr('csr.der')
@@ -19,8 +20,7 @@ class ErrorTest(unittest.TestCase):
def setUp(self):
from acme.messages import Error, ERROR_PREFIX
self.error = Error(
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
self.error = Error.with_code('malformed', detail='foo', title='title')
self.jobj = {
'detail': 'foo',
'title': 'some title',
@@ -28,7 +28,6 @@ class ErrorTest(unittest.TestCase):
}
self.error_custom = Error(typ='custom', detail='bar')
self.empty_error = Error()
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
def test_default_typ(self):
from acme.messages import Error
@@ -43,8 +42,7 @@ class ErrorTest(unittest.TestCase):
hash(Error.from_json(self.error.to_json()))
def test_description(self):
self.assertEqual(
'The request message was malformed', self.error.description)
self.assertEqual('The request message was malformed', self.error.description)
self.assertTrue(self.error_custom.description is None)
def test_code(self):
@@ -54,17 +52,17 @@ class ErrorTest(unittest.TestCase):
self.assertEqual(None, Error().code)
def test_is_acme_error(self):
from acme.messages import is_acme_error
from acme.messages import is_acme_error, Error
self.assertTrue(is_acme_error(self.error))
self.assertFalse(is_acme_error(self.error_custom))
self.assertFalse(is_acme_error(Error()))
self.assertFalse(is_acme_error(self.empty_error))
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
def test_unicode_error(self):
from acme.messages import Error, ERROR_PREFIX, is_acme_error
arabic_error = Error(
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
title='title')
from acme.messages import Error, is_acme_error
arabic_error = Error.with_code(
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
self.assertTrue(is_acme_error(arabic_error))
def test_with_code(self):
@@ -110,11 +108,11 @@ class ConstantTest(unittest.TestCase):
def test_equality(self):
const_a_prime = self.MockConstant('a')
self.assertFalse(self.const_a == self.const_b)
self.assertTrue(self.const_a == const_a_prime)
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
self.assertTrue(self.const_a != self.const_b)
self.assertFalse(self.const_a != const_a_prime)
self.assertNotEqual(self.const_a, self.const_b)
self.assertEqual(self.const_a, const_a_prime)
class DirectoryTest(unittest.TestCase):
@@ -256,6 +254,19 @@ class RegistrationTest(unittest.TestCase):
from acme.messages import Registration
hash(Registration.from_json(self.jobj_from))
def test_default_not_transmitted(self):
from acme.messages import NewRegistration
empty_new_reg = NewRegistration()
new_reg_with_contact = NewRegistration(contact=())
self.assertEqual(empty_new_reg.contact, ())
self.assertEqual(new_reg_with_contact.contact, ())
self.assertTrue('contact' not in empty_new_reg.to_partial_json())
self.assertTrue('contact' not in empty_new_reg.fields_to_partial_json())
self.assertTrue('contact' in new_reg_with_contact.to_partial_json())
self.assertTrue('contact' in new_reg_with_contact.fields_to_partial_json())
class UpdateRegistrationTest(unittest.TestCase):
"""Tests for acme.messages.UpdateRegistration."""
@@ -305,8 +316,7 @@ class ChallengeBodyTest(unittest.TestCase):
from acme.messages import Error
from acme.messages import STATUS_INVALID
self.status = STATUS_INVALID
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
detail='Unable to communicate with DNS server')
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
self.challb = ChallengeBody(
uri='http://challb', chall=self.chall, status=self.status,
error=error)
@@ -458,6 +468,7 @@ class OrderResourceTest(unittest.TestCase):
'authorizations': None,
})
class NewOrderTest(unittest.TestCase):
"""Tests for acme.messages.NewOrder."""
@@ -472,5 +483,18 @@ class NewOrderTest(unittest.TestCase):
})
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_message_payload(self):
from acme.messages import NewAuthorization
new_order = NewAuthorization()
new_order.le_acme_version = 2
jobj = new_order.json_dumps(indent=2).encode()
# RFC8555 states that JWS bodies must not have a resource field.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -3,16 +3,20 @@ import socket
import threading
import unittest
import josepy as jose
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from acme import challenges
from acme import test_util
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme import crypto_util
from acme import errors
import test_util
class TLSServerTest(unittest.TestCase):
@@ -84,6 +88,81 @@ class HTTP01ServerTest(unittest.TestCase):
def test_http01_not_found(self):
self.assertFalse(self._test_http01(add=False))
def test_timely_shutdown(self):
from acme.standalone import HTTP01Server
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
client = socket.socket()
client.connect(('localhost', server.socket.getsockname()[1]))
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
self.assertFalse(is_hung, msg='Server shutdown should not be hung')
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
class TLSALPN01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSALPN01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
# Use different certificate for challenge.
self.challenge_certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa4096_key.pem'),
test_util.load_cert('rsa4096_cert.pem'),
)}
from acme.standalone import TLSALPN01Server
self.server = TLSALPN01Server(("localhost", 0), certs=self.certs,
challenge_certs=self.challenge_certs)
# pylint: disable=no-member
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
# TODO: This is not implemented yet, see comments in standalone.py
# def test_certs(self):
# host, port = self.server.socket.getsockname()[:2]
# cert = crypto_util.probe_sni(
# b'localhost', host=host, port=port, timeout=1)
# # Expect normal cert when connecting without ALPN.
# self.assertEqual(jose.ComparableX509(cert),
# jose.ComparableX509(self.certs[b'localhost'][1]))
def test_challenge_certs(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"acme-tls/1"])
# Expect challenge cert when connecting with ALPN.
self.assertEqual(
jose.ComparableX509(cert),
jose.ComparableX509(self.challenge_certs[b'localhost'][1])
)
def test_bad_alpn(self):
host, port = self.server.socket.getsockname()[:2]
with self.assertRaises(errors.Error):
crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"bad-alpn"])
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
@@ -139,7 +218,6 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))

View File

@@ -4,12 +4,12 @@
"""
import os
import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from OpenSSL import crypto
import pkg_resources
def load_vector(*names):
@@ -25,8 +25,7 @@ def _guess_loader(filename, loader_pem, loader_der):
return loader_pem
elif ext.lower() == '.der':
return loader_der
else: # pragma: no cover
raise ValueError("Loader could not be recognized based on extension")
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
def load_cert(*names):

View File

@@ -4,12 +4,14 @@ to use appropriate extension for vector filenames: .pem for PEM and
The following command has been used to generate test keys:
for x in 256 512 1024 2048; do openssl genrsa -out rsa${k}_key.pem $k; done
for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done
and for the CSR:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
and for the certificate:
and for the certificates:
openssl req -key rsa2047_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem

13
acme/tests/testdata/rsa1024_cert.pem vendored Normal file
View File

@@ -0,0 +1,13 @@
-----BEGIN CERTIFICATE-----
MIIB/TCCAWagAwIBAgIJAOyRIBs3QT8QMA0GCSqGSIb3DQEBCwUAMBYxFDASBgNV
BAMMC2V4YW1wbGUuY29tMB4XDTE4MDQyMzEwMzE0NFoXDTE4MDUyMzEwMzE0NFow
FjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJ
AoGBAJqJ87R8aVwByONxgQA9hwgvQd/QqI1r1UInXhEF2VnEtZGtUWLi100IpIqr
Mq4qusDwNZ3g8cUPtSkvJGs89djoajMDIJP7lQUEKUYnYrI0q755Tr/DgLWSk7iW
l5ezym0VzWUD0/xXUz8yRbNMTjTac80rS5SZk2ja2wWkYlRJAgMBAAGjUzBRMB0G
A1UdDgQWBBSsaX0IVZ4XXwdeffVAbG7gnxSYjTAfBgNVHSMEGDAWgBSsaX0IVZ4X
XwdeffVAbG7gnxSYjTAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4GB
ADe7SVmvGH2nkwVfONk8TauRUDkePN1CJZKFb2zW1uO9ANJ2v5Arm/OQp0BG/xnI
Djw/aLTNVESF89oe15dkrUErtcaF413MC1Ld5lTCaJLHLGqDKY69e02YwRuxW7jY
qarpt7k7aR5FbcfO5r4V/FK/Gvp4Dmoky8uap7SJIW6x
-----END CERTIFICATE-----

30
acme/tests/testdata/rsa4096_cert.pem vendored Normal file
View File

@@ -0,0 +1,30 @@
-----BEGIN CERTIFICATE-----
MIIFDTCCAvWgAwIBAgIUImqDrP53V69vFROsjP/gL0YtoA4wDQYJKoZIhvcNAQEL
BQAwFjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wHhcNMjAwNTI3MjMyNDE0WhcNMjAw
NjI2MjMyNDE0WjAWMRQwEgYDVQQDDAtleGFtcGxlLmNvbTCCAiIwDQYJKoZIhvcN
AQEBBQADggIPADCCAgoCggIBANY9LKLk9Dxn0MUMQFHwBoTN4ehDSWBws2KcytpF
mc8m9Mfk1wmb4fQSKYtK3wIFMfIyo9HQu0nKqMkkUw52o3ZXyOv+oWwF5qNy2BKu
lh5OMSkaZ0o13zoPpW42e+IUnyxvg70+0urD+sUue4cyTHh/nBIUjrM/05ZJ/ac8
HR0RK3H41YoqBjq69JjMZczZZhbNFit3s6p0R1TbVAgc3ckqbtX5BDyQMQQCP4Ed
m4DgbAFVqdcPUCC5W3F3fmuQiPKHiADzONZnXpy6lUvLDWqcd6loKp+nKHM6OkXX
8hmD7pE1PYMQo4hqOfhBR2IgMjAShwd5qUFjl1m2oo0Qm3PFXOk6i2ZQdS6AA/yd
B5/mX0RnM2oIdFZPb6UZFSmtEgs9sTzn+hMUyNSZQRE54px1ur1xws2R+vbsCyM5
+KoFVxDjVjU9TlZx3GvDvnqz/tbHjji6l8VHZYOBMBUXbKHu2U6pJFZ5Zp7k68/z
a3Fb9Pjtn3iRkXEyC0N5kLgqO4QTlExnxebV8aMvQpWd/qefnMn9qPYIZPEXSQAR
mEBIahkcACb60s+acG0WFFluwBPtBqEr8Q67XlSF0Ibf4iBiRzpPobhlWta1nrFg
4IWHMSoZ0PE75bhIGBEkhrpcXQCAxXmAfxfjKDH7jdJ1fRdnZ/9+OzwYGVX5GH/l
0QDtAgMBAAGjUzBRMB0GA1UdDgQWBBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAfBgNV
HSMEGDAWgBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAPBgNVHRMBAf8EBTADAQH/MA0G
CSqGSIb3DQEBCwUAA4ICAQAELoXz31oR9pdAwidlv9ZBOKiC7KBWy8VMqXNVkfTn
bVRxAUex7zleLFIOkWnqadsMesU9sIwrbLzBcZ8Q/vBY+z2xOPdXcgcAoAmdKWoq
YBQNiqng9r54sqlzB/77QZCf5fdktESe7NTxhCifgx5SAWq7IUQs/lm3tnMUSAfE
5ctuN6M+w8K54y3WDprcfMHpnc3ZHeSPhVQApHM0h/bDvXq0bRS7kmq27Hb153Qm
nH3TwYB5pPSWW38NbUc+s/a7mItO7S8ly8yGbA0j9c/IbN5lM+OCdk06asz3+c8E
uo8nuCBoYO5+6AqC2N7WJ3Tdr/pFA8jTbd6VNVlgCWTIR8ZosL5Fgkfv+4fUBrHt
zdVUqMUzvga5rvZnwnJ5Qfu/drHeAAo9MTNFQNe2QgDlYfWBh5GweolgmFSwrpkY
v/5wLtIyv/ASHKswybbqMIlpttcLTXjx5yuh8swttT6Wh+FQqqQ32KSRB3StiwyK
oH0ZhrwYHiFYNlPxecGX6XUta6rFtTlEdkBGSnXzgiTzL2l+Nc0as0V5B9RninZG
qJ+VOChSQ0OFvg1riSXv7tMvbLdGQnxwTRL3t6BMS8I4LA2m3ZfWUcuXT783ODTH
16f1Q1AgXd2csstTWO9cv+N/0fpX31nqrm6+CrGduSr2u4HjYYnlLIUhmdTvK3fX
Fg==
-----END CERTIFICATE-----

51
acme/tests/testdata/rsa4096_key.pem vendored Normal file
View File

@@ -0,0 +1,51 @@
-----BEGIN RSA PRIVATE KEY-----
MIIJKgIBAAKCAgEA1j0souT0PGfQxQxAUfAGhM3h6ENJYHCzYpzK2kWZzyb0x+TX
CZvh9BIpi0rfAgUx8jKj0dC7ScqoySRTDnajdlfI6/6hbAXmo3LYEq6WHk4xKRpn
SjXfOg+lbjZ74hSfLG+DvT7S6sP6xS57hzJMeH+cEhSOsz/Tlkn9pzwdHRErcfjV
iioGOrr0mMxlzNlmFs0WK3ezqnRHVNtUCBzdySpu1fkEPJAxBAI/gR2bgOBsAVWp
1w9QILlbcXd+a5CI8oeIAPM41mdenLqVS8sNapx3qWgqn6coczo6RdfyGYPukTU9
gxCjiGo5+EFHYiAyMBKHB3mpQWOXWbaijRCbc8Vc6TqLZlB1LoAD/J0Hn+ZfRGcz
agh0Vk9vpRkVKa0SCz2xPOf6ExTI1JlBETninHW6vXHCzZH69uwLIzn4qgVXEONW
NT1OVnHca8O+erP+1seOOLqXxUdlg4EwFRdsoe7ZTqkkVnlmnuTrz/NrcVv0+O2f
eJGRcTILQ3mQuCo7hBOUTGfF5tXxoy9ClZ3+p5+cyf2o9ghk8RdJABGYQEhqGRwA
JvrSz5pwbRYUWW7AE+0GoSvxDrteVIXQht/iIGJHOk+huGVa1rWesWDghYcxKhnQ
8TvluEgYESSGulxdAIDFeYB/F+MoMfuN0nV9F2dn/347PBgZVfkYf+XRAO0CAwEA
AQKCAgEA0hZdTkQtCYtYm9LexDsXeWYX8VcCfrMmBj7xYcg9A3oVMmzDPuYBVwH0
gWbjd6y2hOaJ5TfGYZ99kvmvBRDsTSHaoyopC7BhssjtAKz6Ay/0X3VH8usPQ3WS
aZi+NT65tK6KRqtz08ppgLGLa1G00bl5x/Um1rpxeACI4FU/y4BJ1VMJvJpnT3KE
Z86Qyagqx5NH+UpCApZSWPFX3zjHePzGgcfXErjniCHYOnpZQrFQ2KIzkfSvQ9fg
x01ByKOM2CB2C1B33TCzBAioXRH6zyAu7A59NeCK9ywTduhDvie1a+oEryFC7IQW
4s7I/H3MGX4hsf/pLXlHMy+5CZJOjRaC2h+pypfbbcuiXu6Sn64kHNpiI7SxI5DI
MIRjyG7MdUcrzq0Rt8ogwwpbCoRqrl/w3bhxtqmeZaEZtyxbjlm7reK2YkIFDgyz
JMqiJK5ZAi+9L/8c0xhjjAQQ0sIzrjmjA8U+6YnWL9jU5qXTVnBB8XQucyeeZGgk
yRHyMur71qOXN8z3UEva7MHkDTUBlj8DgTz6sEjqCipaWl0CXfDNa4IhHIXD5qiF
wplhq7OeS0v6EGG/UFa3Q/lFntxtrayxJX7uvvSccGzjPKXTjpWUELLi/FdnIsum
eXT3RgIEYozj4BibDXaBLfHTCVzxOr7AAEvKM9XWSUgLA0paSWECggEBAO9ZBeE1
GWzd1ejTTkcxBC9AK2rNsYG8PdNqiof/iTbuJWNeRqpG+KB/0CNIpjZ2X5xZd0tM
FDpHTFehlP26Roxuq50iRAFc+SN5KoiO0A3JuJAidreIgRTia1saUUrypHqWrYEA
VZVj2AI8Pyg3s1OkR2frFskY7hXBVb/pJNDP/m9xTXXIYiIXYkHYe+4RIJCnAxRv
q5YHKaX+0Ull9YCZJCxmwvcHat8sgu8qkiwUMEM6QSNEkrEbdnWYBABvC1AR6sws
7MP1h9+j22n4Zc/3D6kpFZEL9Erx8nNyhbOZ6q2Tdnf6YKVVjZdyVa8VyNnR0ROl
3BjkFaHb/bg4e4kCggEBAOUk8ZJS3qBeGCOjug384zbHGcnhUBYtYJiOz+RXBtP+
PRksbFtTkgk1sHuSGO8YRddU4Qv7Av1xL8o+DEsLBSD0YQ7pmLrR/LK+iDQ5N63O
Fve9uJH0ybxAOkiua7G24+lTsVUP//KWToL4Wh5zbHBBjL5D2Z9zoeVbcE87xhva
lImMVr4Ex252DqNP9wkZxBjudFyJ/C/TnXrjPcgwhxWTC7sLQMhE5p+490G7c4hX
PywkIKrANbu37KDiAvVS+dC66ZgpL/NUDkeloAmGNO08LGzbV6YKchlvDyWU/AvW
0hYjbL0FUq7K/wp1G9fumolB+fbI25K9c13X93STzUUCggEBAJDsNFUyk5yJjbYW
C/WrRj9d+WwH9Az77+uNPSgvn+O0usq6EMuVgYGdImfa21lqv2Wp/kOHY1AOT7lX
yyD+oyzw7dSNJOQ2aVwDR6+72Vof5DLRy1RBwPbmSd61xrc8yD658YCEtU1pUSe5
VvyBDYH9nIbdn8RP5gkiMUusXXBaIFNWJXLFzDWcNxBrhk6V7EPp/EFphFmpKJyr
+AkbRVWCZJbF+hMdWKadCwLJogwyhS6PnVU/dhrq6AU38GRa2Fy5HJRYN1xH1Oej
DX3Su8L6c28Xw0k6FcczTHx+wVoIPkKvYTIwVkiFzt/+iMckx6KsGo5tBSHFKRwC
WlQrTxECggEBALjUruLnY1oZ7AC7bTUhOimSOfQEgTQSUCtebsRxijlvhtsKYTDd
XRt+qidStjgN7S/+8DRYuZWzOeg5WnMhpXZqiOudcyume922IGl3ibjxVsdoyjs5
J4xohlrgDlBgBMDNWGoTqNGFejjcmNydH+gAh8VlN2INxJYbxqCyx17qVgwJHmLR
uggYxD/pHYvCs9GkbknCp5/wYsOgDtKuihfV741lS1D/esN1UEQ+LrfYIEW7snno
5q7Pcdhn1hkKYCWEzy2Ec4Aj2gzixQ9JqOF/OxpnZvCw1k47rg0TeqcWFYnz8x8Y
7xO8/DH0OoxXk2GJzVXJuItJs4gLzzfCjL0CggEAJFHfC9jisdy7CoWiOpNCSF1B
S0/CWDz77cZdlWkpTdaXGGp1MA/UKUFPIH8sOHfvpKS660+X4G/1ZBHmFb4P5kFF
Qy8UyUMKtSOEdZS6KFlRlfSCAMd5aSTmCvq4OSjYEpMRwUhU/iEJNkn9Z1Soehe0
U3dxJ8KiT1071geO6rRquSHoSJs6Y0WQKriYYQJOhh4Axs3PQihER2eyh+WGk8YJ
02m0mMsjntqnXtdc6IcdKaHp9ko+OpM9QZLsvt19fxBcrXj/i21uUXrzuNtKfO6M
JqGhsOrO2dh8lMhvodENvgKA0DmYDC9N7ogo7bxTNSedcjBF46FhJoqii8m70Q==
-----END RSA PRIVATE KEY-----

View File

@@ -1,7 +1,7 @@
include LICENSE.txt
include README.rst
recursive-include docs *
recursive-include certbot_apache/tests/testdata *
include certbot_apache/centos-options-ssl-apache.conf
include certbot_apache/options-ssl-apache.conf
recursive-include certbot_apache/augeas_lens *.aug
recursive-include tests *
recursive-include certbot_apache/_internal/augeas_lens *.aug
recursive-include certbot_apache/_internal/tls_configs *.conf
global-exclude __pycache__
global-exclude *.py[cod]

View File

@@ -0,0 +1 @@
"""Certbot Apache plugin."""

View File

@@ -0,0 +1,257 @@
""" Utility functions for certbot-apache plugin """
import binascii
import fnmatch
import logging
import re
import subprocess
import pkg_resources
from certbot import errors
from certbot import util
from certbot.compat import os
logger = logging.getLogger(__name__)
def get_mod_deps(mod_name):
"""Get known module dependencies.
.. note:: This does not need to be accurate in order for the client to
run. This simply keeps things clean if the user decides to revert
changes.
.. warning:: If all deps are not included, it may cause incorrect parsing
behavior, due to enable_mod's shortcut for updating the parser's
currently defined modules (`.ApacheParser.add_mod`).
This would only present a major problem in extremely atypical
configs that use ifmod for the missing deps.
"""
deps = {
"ssl": ["setenvif", "mime"]
}
return deps.get(mod_name, [])
def get_file_path(vhost_path):
"""Get file path from augeas_vhost_path.
Takes in Augeas path and returns the file name
:param str vhost_path: Augeas virtual host path
:returns: filename of vhost
:rtype: str
"""
if not vhost_path or not vhost_path.startswith("/files/"):
return None
return _split_aug_path(vhost_path)[0]
def get_internal_aug_path(vhost_path):
"""Get the Augeas path for a vhost with the file path removed.
:param str vhost_path: Augeas virtual host path
:returns: Augeas path to vhost relative to the containing file
:rtype: str
"""
return _split_aug_path(vhost_path)[1]
def _split_aug_path(vhost_path):
"""Splits an Augeas path into a file path and an internal path.
After removing "/files", this function splits vhost_path into the
file path and the remaining Augeas path.
:param str vhost_path: Augeas virtual host path
:returns: file path and internal Augeas path
:rtype: `tuple` of `str`
"""
# Strip off /files
file_path = vhost_path[6:]
internal_path = []
# Remove components from the end of file_path until it becomes valid
while not os.path.exists(file_path):
file_path, _, internal_path_part = file_path.rpartition("/")
internal_path.append(internal_path_part)
return file_path, "/".join(reversed(internal_path))
def parse_define_file(filepath, varname):
""" Parses Defines from a variable in configuration file
:param str filepath: Path of file to parse
:param str varname: Name of the variable
:returns: Dict of Define:Value pairs
:rtype: `dict`
"""
return_vars = {}
# Get list of words in the variable
a_opts = util.get_var_from_file(varname, filepath).split()
for i, v in enumerate(a_opts):
# Handle Define statements and make sure it has an argument
if v == "-D" and len(a_opts) >= i+2:
var_parts = a_opts[i+1].partition("=")
return_vars[var_parts[0]] = var_parts[2]
elif len(v) > 2 and v.startswith("-D"):
# Found var with no whitespace separator
var_parts = v[2:].partition("=")
return_vars[var_parts[0]] = var_parts[2]
return return_vars
def unique_id():
""" Returns an unique id to be used as a VirtualHost identifier"""
return binascii.hexlify(os.urandom(16)).decode("utf-8")
def included_in_paths(filepath, paths):
"""
Returns true if the filepath is included in the list of paths
that may contain full paths or wildcard paths that need to be
expanded.
:param str filepath: Filepath to check
:param list paths: List of paths to check against
:returns: True if included
:rtype: bool
"""
return any(fnmatch.fnmatch(filepath, path) for path in paths)
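included_in_paths() is a thin wrapper over fnmatch so callers can mix literal and wildcard paths; a quick usage sketch with made-up paths:

from fnmatch import fnmatch

parsed_paths = ["/etc/apache2/apache2.conf", "/etc/apache2/sites-enabled/*.conf"]
print(any(fnmatch("/etc/apache2/sites-enabled/example.conf", p) for p in parsed_paths))  # True
print(any(fnmatch("/etc/httpd/conf.d/ssl.conf", p) for p in parsed_paths))               # False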
def parse_defines(apachectl):
"""
Gets Defines from httpd process and returns a dictionary of
the defined variables.
:param str apachectl: Path to apachectl executable
:returns: dictionary of defined variables
:rtype: dict
"""
variables = {}
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
except ValueError:
return {}
for match in matches:
if match.count("=") > 1:
logger.error("Unexpected number of equal signs in "
"runtime config dump.")
raise errors.PluginError(
"Error parsing Apache runtime variables")
parts = match.partition("=")
variables[parts[0]] = parts[2]
return variables
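parse_defines() above shells out to `apachectl -t -D DUMP_RUN_CFG` and scrapes the `Define:` lines with a regex before dropping the DUMP_RUN_CFG marker. A self-contained sketch of the same parsing applied to sample output (the sample text is illustrative, not captured from a real apachectl run):

import re

sample_output = (
    'ServerRoot: "/etc/apache2"\n'
    "Define: DUMP_RUN_CFG\n"
    "Define: MODSEC=on\n"
    "Define: FOO\n"
)
matches = re.compile(r"Define: ([^ \n]*)").findall(sample_output)
matches.remove("DUMP_RUN_CFG")  # marker emitted by the dump itself
variables = {}
for match in matches:
    parts = match.partition("=")
    variables[parts[0]] = parts[2]
# variables == {"MODSEC": "on", "FOO": ""}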
def parse_includes(apachectl):
"""
Gets Include directives from httpd process and returns a list of
their values.
:param str apachectl: Path to apachectl executable
:returns: list of found Include directive values
:rtype: list of str
"""
inc_cmd = [apachectl, "-t", "-D",
"DUMP_INCLUDES"]
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
def parse_modules(apachectl):
"""
Get loaded modules from httpd process, and return the list
of loaded module names.
:param str apachectl: Path to apachectl executable
:returns: list of found LoadModule module names
:rtype: list of str
"""
mod_cmd = [apachectl, "-t", "-D",
"DUMP_MODULES"]
return parse_from_subprocess(mod_cmd, r"(.*)_module")
def parse_from_subprocess(command, regexp):
"""Get values from stdout of subprocess command
:param list command: Command to run
:param str regexp: Regexp for parsing
:returns: list parsed from command output
:rtype: list
"""
stdout = _get_runtime_cfg(command)
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(command):
"""
Get runtime configuration info.
:param command: Command to run
:returns: stdout from command
"""
try:
proc = subprocess.Popen(
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True,
env=util.env_no_snap_for_external_calls())
stdout, stderr = proc.communicate()
except (OSError, ValueError):
logger.error(
"Error running command %s for runtime parameters!%s",
command, os.linesep)
raise errors.MisconfigurationError(
"Error accessing loaded Apache parameters: {0}".format(
command))
# Small errors that do not impede
if proc.returncode != 0:
logger.warning("Error in checking parameter list: %s", stderr)
raise errors.MisconfigurationError(
"Apache is unable to check whether or not the module is "
"loaded because Apache is misconfigured.")
return stdout
def find_ssl_apache_conf(prefix):
"""
Find a TLS Apache config file in the dedicated storage.
:param str prefix: prefix of the TLS Apache config file to find
:return: the path the TLS Apache config file
:rtype: str
"""
return pkg_resources.resource_filename(
"certbot_apache",
os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))

View File

@@ -0,0 +1,172 @@
""" apacheconfig implementation of the ParserNode interfaces """
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parsernode_util as util
class ApacheParserNode(interfaces.ParserNode):
""" apacheconfig implementation of ParserNode interface.
Expects metadata `ac_ast` to be passed in, where `ac_ast` is the AST provided
by parsing the equivalent configuration text using the apacheconfig library.
"""
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self._raw = self.metadata["ac_ast"]
def save(self, msg): # pragma: no cover
pass
def find_ancestors(self, name): # pylint: disable=unused-variable
"""Find ancestor BlockNodes with a given name"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
class ApacheCommentNode(ApacheParserNode):
""" apacheconfig implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheCommentNode, self).__init__(**kwargs)
self.comment = comment
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata and
self.filepath == other.filepath)
return False
class ApacheDirectiveNode(ApacheParserNode):
""" apacheconfig implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(ApacheDirectiveNode, self).__init__(**kwargs)
self.name = name
self.parameters = parameters
self.enabled = enabled
self.include = None
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, _parameters): # pragma: no cover
"""Sets the parameters for DirectiveNode"""
return
class ApacheBlockNode(ApacheDirectiveNode):
""" apacheconfig implementation of BlockNode interface """
def __init__(self, **kwargs):
super(ApacheBlockNode, self).__init__(**kwargs)
self.children = ()
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
new_block = ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_block,)
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
new_dir = ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_dir,)
return new_dir
# pylint: disable=unused-argument
def add_child_comment(self, comment="", position=None): # pragma: no cover
"""Adds a new CommentNode to the sequence of children"""
new_comment = ApacheCommentNode(comment=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)
self.children += (new_comment,)
return new_comment
def find_blocks(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of BlockNodes from the sequence of children"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def find_directives(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
# pylint: disable=unused-argument
def find_comments(self, comment, exact=False): # pragma: no cover
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheCommentNode(comment=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def delete_child(self, child): # pragma: no cover
"""Deletes a ParserNode from the sequence of children"""
return
def unsaved_files(self): # pragma: no cover
"""Returns a list of unsaved filepaths"""
return [assertions.PASS]
def parsed_paths(self): # pragma: no cover
"""Returns a list of parsed configuration file paths"""
return [assertions.PASS]
interfaces.CommentNode.register(ApacheCommentNode)
interfaces.DirectiveNode.register(ApacheDirectiveNode)
interfaces.BlockNode.register(ApacheBlockNode)

View File

@@ -0,0 +1,142 @@
"""Dual parser node assertions"""
import fnmatch
from certbot_apache._internal import interfaces
PASS = "CERTBOT_PASS_ASSERT"
def assertEqual(first, second):
""" Equality assertion """
if isinstance(first, interfaces.CommentNode):
assertEqualComment(first, second)
elif isinstance(first, interfaces.DirectiveNode):
assertEqualDirective(first, second)
# Do an extra interface implementation assertion, as the contents were
# already checked for BlockNode in the assertEqualDirective
if isinstance(first, interfaces.BlockNode):
assert isinstance(second, interfaces.BlockNode)
# Skip tests if filepath includes the pass value. This is done
# because filepath is a variable of the base ParserNode interface, and
# unless the implementation is actually done, we cannot assume that the
# boolean assertion for dirty gives correct results.
if not isPass(first.filepath) and not isPass(second.filepath):
assert first.dirty == second.dirty
# We might want to disable this later if testing with two separate
# (but identical) directory structures.
assert first.filepath == second.filepath
def assertEqualComment(first, second): # pragma: no cover
""" Equality assertion for CommentNode """
assert isinstance(first, interfaces.CommentNode)
assert isinstance(second, interfaces.CommentNode)
if not isPass(first.comment) and not isPass(second.comment): # type: ignore
assert first.comment == second.comment # type: ignore
def _assertEqualDirectiveComponents(first, second): # pragma: no cover
""" Handles assertion for instance variables for DirectiveNode and BlockNode"""
# Enabled value cannot be asserted, because Augeas implementation
# is unable to figure that out.
# assert first.enabled == second.enabled
if not isPass(first.name) and not isPass(second.name):
assert first.name == second.name
if not isPass(first.parameters) and not isPass(second.parameters):
assert first.parameters == second.parameters
def assertEqualDirective(first, second):
""" Equality assertion for DirectiveNode """
assert isinstance(first, interfaces.DirectiveNode)
assert isinstance(second, interfaces.DirectiveNode)
_assertEqualDirectiveComponents(first, second)
def isPass(value): # pragma: no cover
"""Checks if the value is set to PASS"""
if isinstance(value, bool):
return True
return PASS in value
def isPassDirective(block):
""" Checks if BlockNode or DirectiveNode should pass the assertion """
if isPass(block.name):
return True
if isPass(block.parameters): # pragma: no cover
return True
if isPass(block.filepath): # pragma: no cover
return True
return False
def isPassComment(comment):
""" Checks if CommentNode should pass the assertion """
if isPass(comment.comment):
return True
if isPass(comment.filepath): # pragma: no cover
return True
return False
def isPassNodeList(nodelist): # pragma: no cover
""" Checks if a ParserNode in the nodelist should pass the assertion,
this function is used for results of find_* methods. Unimplemented find_*
methods should return a sequence containing a single ParserNode instance
with assertion pass string."""
try:
node = nodelist[0]
except IndexError:
node = None
if not node: # pragma: no cover
return False
if isinstance(node, interfaces.DirectiveNode):
return isPassDirective(node)
return isPassComment(node)
def assertEqualSimple(first, second):
""" Simple assertion """
if not isPass(first) and not isPass(second):
assert first == second
def isEqualVirtualHost(first, second):
"""
Checks that two VirtualHost objects are similar. There are some built-in
differences between the implementations: a VirtualHost created by the ParserNode
implementation doesn't have "path" defined, as that was used for the Augeas path
and obviously cannot be used in the future. Similarly, the legacy version lacks
the "node" variable, which holds a reference to the BlockNode for the
VirtualHost.
"""
return (
first.name == second.name and
first.aliases == second.aliases and
first.filep == second.filep and
first.addrs == second.addrs and
first.ssl == second.ssl and
first.enabled == second.enabled and
first.modmacro == second.modmacro and
first.ancestor == second.ancestor
)
def assertEqualPathsList(first, second): # pragma: no cover
"""
Checks that the two lists of file paths match. This assertion allows for wildcard
paths.
"""
if any(isPass(path) for path in first):
return
if any(isPass(path) for path in second):
return
for fpath in first:
assert any([fnmatch.fnmatch(fpath, spath) for spath in second])
for spath in second:
assert any([fnmatch.fnmatch(fpath, spath) for fpath in first])
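These helpers let the dual-parser tests compare the legacy and ParserNode implementations while one of them is still unimplemented: any value carrying the CERTBOT_PASS_ASSERT sentinel is simply skipped. A small illustration of that behaviour (assumes the certbot-apache package is importable):

from certbot_apache._internal import assertions

# A value produced by an unimplemented code path carries the sentinel,
# so the comparison is skipped and no AssertionError is raised.
assertions.assertEqualSimple(assertions.PASS, "anything goes")
# Two real values are compared normally.
assertions.assertEqualSimple("ServerName", "ServerName")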

View File

@@ -6,7 +6,7 @@ Authors:
Raphael Pinson <raphink@gmail.com>
About: Reference
Online Apache configuration manual: http://httpd.apache.org/docs/trunk/
Online Apache configuration manual: https://httpd.apache.org/docs/trunk/
About: License
This file is licensed under the LGPL v2+.

View File

@@ -0,0 +1,538 @@
"""
Augeas implementation of the ParserNode interfaces.
Augeas works internally by using XPATH notation. The following is a short example
of how this works, to better understand what's going on under the hood.
A configuration file /etc/apache2/apache2.conf with the following content:
# First comment line
# Second comment line
WhateverDirective whatevervalue
<ABlock>
DirectiveInABlock dirvalue
</ABlock>
SomeDirective somedirectivevalue
<ABlock>
AnotherDirectiveInABlock dirvalue
</ABlock>
# Yet another comment
Translates over to Augeas path notation (of immediate children), when calling
for example: aug.match("/files/etc/apache2/apache2.conf/*")
[
"/files/etc/apache2/apache2.conf/#comment[1]",
"/files/etc/apache2/apache2.conf/#comment[2]",
"/files/etc/apache2/apache2.conf/directive[1]",
"/files/etc/apache2/apache2.conf/ABlock[1]",
"/files/etc/apache2/apache2.conf/directive[2]",
"/files/etc/apache2/apache2.conf/ABlock[2]",
"/files/etc/apache2/apache2.conf/#comment[3]"
]
Regardless of a directive's name, its key in the Augeas tree is always "directive",
with an index where needed. Comments work similarly, while blocks
have their own key in the Augeas XPATH notation.
It's important to note that all of the unique keys have their own indices.
Augeas paths are case sensitive, while Apache configuration is case insensitive.
It looks like this:
<block>
directive value
</block>
<Block>
Directive Value
</Block>
<block>
directive value
</block>
<bLoCk>
DiReCtiVe VaLuE
</bLoCk>
Translates over to:
[
"/files/etc/apache2/apache2.conf/block[1]",
"/files/etc/apache2/apache2.conf/Block[1]",
"/files/etc/apache2/apache2.conf/block[2]",
"/files/etc/apache2/apache2.conf/bLoCk[1]",
]
"""
from acme.magic_typing import Set
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parser
from certbot_apache._internal import parsernode_util as util
class AugeasParserNode(interfaces.ParserNode):
""" Augeas implementation of ParserNode interface """
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self.parser = self.metadata.get("augeasparser")
try:
if self.metadata["augeaspath"].endswith("/"):
raise errors.PluginError(
"Augeas path: {} has a trailing slash".format(
self.metadata["augeaspath"]
)
)
except KeyError:
raise errors.PluginError("Augeas path is required")
def save(self, msg):
self.parser.save(msg)
def find_ancestors(self, name):
"""
Searches for ancestor BlockNodes with a given name.
:param str name: Name of the BlockNode parent to search for
:returns: List of matching ancestor nodes.
:rtype: list of AugeasBlockNode
"""
ancestors = []
parent = self.metadata["augeaspath"]
while True:
# Get the path of ancestor node
parent = parent.rpartition("/")[0]
# Root of the tree
if not parent or parent == "/files":
break
anc = self._create_blocknode(parent)
if anc.name.lower() == name.lower():
ancestors.append(anc)
return ancestors
def _create_blocknode(self, path):
"""
Helper function to create a BlockNode from an Augeas path. This is used by
AugeasParserNode.find_ancestors and AugeasBlockNode.find_blocks.
"""
name = self._aug_get_name(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasBlockNode(name=name,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_get_name(self, path):
"""
Helper function to get the name of a configuration block or variable from an Augeas path.
"""
# Remove the ending slash if any
if path[-1] == "/": # pragma: no cover
path = path[:-1]
# Get the block name
name = path.split("/")[-1]
# remove [...], it's not allowed in Apache configuration and is used
# for indexing within Augeas
name = name.split("[")[0]
return name
class AugeasCommentNode(AugeasParserNode):
""" Augeas implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasCommentNode, self).__init__(**kwargs)
# self.comment = comment
self.comment = comment
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.filepath == other.filepath and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
class AugeasDirectiveNode(AugeasParserNode):
""" Augeas implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(AugeasDirectiveNode, self).__init__(**kwargs)
self.name = name
self.enabled = enabled
if parameters:
self.set_parameters(parameters)
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, parameters):
"""
Sets parameters of a DirectiveNode or BlockNode object.
:param list parameters: List of all parameters for the node to set.
"""
orig_params = self._aug_get_params(self.metadata["augeaspath"])
# Clear out old parameters
for _ in orig_params:
# When the first parameter is removed, the indices get updated
param_path = "{}/arg[1]".format(self.metadata["augeaspath"])
self.parser.aug.remove(param_path)
# Insert new ones
for pi, param in enumerate(parameters):
param_path = "{}/arg[{}]".format(self.metadata["augeaspath"], pi+1)
self.parser.aug.set(param_path, param)
@property
def parameters(self):
"""
Fetches the parameters from Augeas tree, ensuring that the sequence always
represents the current state.
:returns: Tuple of parameters for this DirectiveNode
:rtype: tuple
"""
return tuple(self._aug_get_params(self.metadata["augeaspath"]))
def _aug_get_params(self, path):
"""Helper function to get parameters for DirectiveNodes and BlockNodes"""
arg_paths = self.parser.aug.match(path + "/arg")
return [self.parser.get_arg(apath) for apath in arg_paths]
class AugeasBlockNode(AugeasDirectiveNode):
""" Augeas implementation of BlockNode interface """
def __init__(self, **kwargs):
super(AugeasBlockNode, self).__init__(**kwargs)
self.children = ()
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
name,
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new block
self.parser.aug.insert(insertpath, name, before)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
# Parameters will be set at the initialization of the new object
new_block = AugeasBlockNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
if not parameters:
raise errors.PluginError("Directive requires parameters and none were set.")
insertpath, realpath, before = self._aug_resolve_child_position(
"directive",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new directive
self.parser.aug.insert(insertpath, "directive", before)
# Set the directive key
self.parser.aug.set(realpath, name)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
new_dir = AugeasDirectiveNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_dir
def add_child_comment(self, comment="", position=None):
"""Adds a new CommentNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
"#comment",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new comment
self.parser.aug.insert(insertpath, "#comment", before)
# Set the comment content
self.parser.aug.set(realpath, comment)
new_comment = AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_comment
def find_blocks(self, name, exclude=True):
"""Recursive search of BlockNodes from the sequence of children"""
nodes = []
paths = self._aug_find_blocks(name)
if exclude:
paths = self.parser.exclude_dirs(paths)
for path in paths:
nodes.append(self._create_blocknode(path))
return nodes
def find_directives(self, name, exclude=True):
"""Recursive search of DirectiveNodes from the sequence of children"""
nodes = []
ownpath = self.metadata.get("augeaspath")
directives = self.parser.find_dir(name, start=ownpath, exclude=exclude)
already_parsed = set() # type: Set[str]
for directive in directives:
# Remove the /arg part from the Augeas path
directive = directive.partition("/arg")[0]
# find_dir returns an object for each _parameter_ of a directive
# so we need to filter out duplicates.
if directive not in already_parsed:
nodes.append(self._create_directivenode(directive))
already_parsed.add(directive)
return nodes
def find_comments(self, comment):
"""
Recursive search of CommentNodes from the sequence of children.
:param str comment: Comment content to search for.
"""
nodes = []
ownpath = self.metadata.get("augeaspath")
comments = self.parser.find_comments(comment, start=ownpath)
for com in comments:
nodes.append(self._create_commentnode(com))
return nodes
def delete_child(self, child):
"""
Deletes a ParserNode from the sequence of children, and raises an
exception if it's unable to do so.
:param AugeasParserNode child: The node to delete.
"""
if not self.parser.aug.remove(child.metadata["augeaspath"]):
raise errors.PluginError(
("Could not delete child node, the Augeas path: {} doesn't " +
"seem to exist.").format(child.metadata["augeaspath"])
)
def unsaved_files(self):
"""Returns a list of unsaved filepaths"""
return self.parser.unsaved_files()
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""
res_paths = []
paths = self.parser.existing_paths
for directory in paths:
for filename in paths[directory]:
res_paths.append(os.path.join(directory, filename))
return res_paths
def _create_commentnode(self, path):
"""Helper function to create a CommentNode from Augeas path"""
comment = self.parser.aug.get(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Because of the dynamic nature of AugeasParser and the fact that we're
# not populating the complete node tree, the ancestor has a dummy value
return AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _create_directivenode(self, path):
"""Helper function to create a DirectiveNode from Augeas path"""
name = self.parser.get_arg(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasDirectiveNode(name=name,
ancestor=assertions.PASS,
enabled=enabled,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_find_blocks(self, name):
"""Helper function to perform a search to Augeas DOM tree to search
configuration blocks with a given name"""
# The code here is modified from configurator.get_virtual_hosts()
blk_paths = set()
for vhost_path in list(self.parser.parser_paths):
paths = self.parser.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
(vhost_path, parser.case_i(name))))
blk_paths.update([path for path in paths if
name.lower() in os.path.basename(path).lower()])
return blk_paths
def _aug_resolve_child_position(self, name, position):
"""
Helper function that iterates through the immediate children and figures
out the insertion path for a new AugeasParserNode.
Augeas also generalizes indices for directives and comments, simply by
using "directive" or "comment" respectively as their names.
This function iterates over the existing children of the AugeasBlockNode,
returning their insertion path, resulting Augeas path and if the new node
should be inserted before or after the returned insertion path.
Note: while Apache is case insensitive, Augeas is not, and blocks like
Nameofablock and NameOfABlock have different indices.
:param str name: Name of the AugeasBlockNode to insert, "directive" for
AugeasDirectiveNode or "comment" for AugeasCommentNode
:param int position: The position to insert the child AugeasParserNode to
:returns: Tuple of insert path, resulting path and a boolean if the new
node should be inserted before it.
:rtype: tuple of str, str, bool
"""
# Default to appending
before = False
all_children = self.parser.aug.match("{}/*".format(
self.metadata["augeaspath"])
)
# Calculate resulting_path
# Augeas indices start at 1. We use counter to calculate the index to
# be used in resulting_path.
counter = 1
for i, child in enumerate(all_children):
if position is not None and i >= position:
# We're not going to insert the new node to an index after this
break
childname = self._aug_get_name(child)
if name == childname:
counter += 1
resulting_path = "{}/{}[{}]".format(
self.metadata["augeaspath"],
name,
counter
)
# Form the correct insert_path
# Inserting the only child and appending as the last child work
# similarly in Augeas.
append = not all_children or position is None or position >= len(all_children)
if append:
insert_path = "{}/*[last()]".format(
self.metadata["augeaspath"]
)
elif position == 0:
# Insert as the first child, before the current first one.
insert_path = all_children[0]
before = True
else:
insert_path = "{}/*[{}]".format(
self.metadata["augeaspath"],
position
)
return (insert_path, resulting_path, before)
interfaces.CommentNode.register(AugeasCommentNode)
interfaces.DirectiveNode.register(AugeasDirectiveNode)
interfaces.BlockNode.register(AugeasBlockNode)

View File

@@ -1,5 +1,7 @@
"""Apache Configurator."""
# pylint: disable=too-many-lines
from collections import defaultdict
from distutils.version import LooseVersion
import copy
import fnmatch
import logging
@@ -7,34 +9,38 @@ import re
import socket
import time
from collections import defaultdict
import pkg_resources
import six
import zope.component
import zope.interface
try:
import apacheconfig
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
from acme import challenges
from acme.magic_typing import DefaultDict, Dict, List, Set, Union # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import DefaultDict
from acme.magic_typing import Dict
from acme.magic_typing import List
from acme.magic_typing import Set
from acme.magic_typing import Union
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge # pylint: disable=unused-import
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot.plugins.util import path_surgery
from certbot.plugins.enhancements import AutoHSTSEnhancement
from certbot_apache import apache_util
from certbot_apache import constants
from certbot_apache import display_ops
from certbot_apache import http_01
from certbot_apache import obj
from certbot_apache import parser
from certbot.plugins.util import path_surgery
from certbot_apache._internal import apache_util
from certbot_apache._internal import assertions
from certbot_apache._internal import constants
from certbot_apache._internal import display_ops
from certbot_apache._internal import dualparser
from certbot_apache._internal import http_01
from certbot_apache._internal import obj
from certbot_apache._internal import parser
logger = logging.getLogger(__name__)
@@ -77,11 +83,11 @@ class ApacheConfigurator(common.Installer):
:type config: :class:`~certbot.interfaces.IConfig`
:ivar parser: Handles low level parsing
:type parser: :class:`~certbot_apache.parser`
:type parser: :class:`~certbot_apache._internal.parser`
:ivar tup version: version of Apache
:ivar list vhosts: All vhosts found in the configuration
(:class:`list` of :class:`~certbot_apache.obj.VirtualHost`)
(:class:`list` of :class:`~certbot_apache._internal.obj.VirtualHost`)
:ivar dict assoc: Mapping between domains and vhosts
@@ -109,14 +115,30 @@ class ApacheConfigurator(common.Installer):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
bin=None
)
def option(self, key):
"""Get a value from options"""
return self.options.get(key)
def pick_apache_config(self, warn_on_no_mod_ssl=True):
"""
Pick the appropriate TLS Apache configuration file for current version of Apache and OS.
:param bool warn_on_no_mod_ssl: True if we should warn if mod_ssl is not found.
:return: the path to the TLS Apache configuration file to use
:rtype: str
"""
# Disabling TLS session tickets is supported by Apache 2.4.11+ and OpenSSL 1.0.2l+.
# So for old versions of Apache we pick a configuration without this option.
openssl_version = self.openssl_version(warn_on_no_mod_ssl)
if self.version < (2, 4, 11) or not openssl_version or\
LooseVersion(openssl_version) < LooseVersion('1.0.2l'):
return apache_util.find_ssl_apache_conf("old")
return apache_util.find_ssl_apache_conf("current")
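pick_apache_config() gates TLS session-ticket handling on Apache 2.4.11+ and OpenSSL 1.0.2l+; a quick illustration of the LooseVersion comparison it relies on:

from distutils.version import LooseVersion

print(LooseVersion("1.0.2k") < LooseVersion("1.0.2l"))  # True  -> use the "old" config
print(LooseVersion("1.1.1f") < LooseVersion("1.0.2l"))  # False -> use the "current" config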
def _prepare_options(self):
"""
Set the values possibly changed by command line parameters to
@@ -124,7 +146,7 @@ class ApacheConfigurator(common.Installer):
"""
opts = ["enmod", "dismod", "le_vhost_ext", "server_root", "vhost_root",
"logs_root", "challenge_location", "handle_modules", "handle_sites",
"ctl"]
"ctl", "bin"]
for o in opts:
# Config options use dashes instead of underscores
if self.conf(o.replace("_", "-")) is not None:
@@ -173,6 +195,8 @@ class ApacheConfigurator(common.Installer):
"(Only Ubuntu/Debian currently)")
add("ctl", default=DEFAULTS["ctl"],
help="Full path to Apache control script")
add("bin", default=DEFAULTS["bin"],
help="Full path to apache2/httpd binary")
def __init__(self, *args, **kwargs):
"""Initialize an Apache Configurator.
@@ -182,26 +206,34 @@ class ApacheConfigurator(common.Installer):
"""
version = kwargs.pop("version", None)
use_parsernode = kwargs.pop("use_parsernode", False)
openssl_version = kwargs.pop("openssl_version", None)
super(ApacheConfigurator, self).__init__(*args, **kwargs)
# Add name_server association dict
self.assoc = dict() # type: Dict[str, obj.VirtualHost]
self.assoc = {} # type: Dict[str, obj.VirtualHost]
# Outstanding challenges
self._chall_out = set() # type: Set[KeyAuthorizationAnnotatedChallenge]
# List of vhosts configured per wildcard domain on this run.
# used by deploy_cert() and enhance()
self._wildcard_vhosts = dict() # type: Dict[str, List[obj.VirtualHost]]
self._wildcard_vhosts = {} # type: Dict[str, List[obj.VirtualHost]]
# Maps enhancements to vhosts we've enabled the enhancement for
self._enhanced_vhosts = defaultdict(set) # type: DefaultDict[str, Set[obj.VirtualHost]]
# Temporary state for AutoHSTS enhancement
self._autohsts = {} # type: Dict[str, Dict[str, Union[int, float]]]
# Reverter save notes
self.save_notes = ""
# Should we use ParserNode implementation instead of the old behavior
self.USE_PARSERNODE = use_parsernode
# Saves the list of file paths that were parsed initially, and
# not added to parser tree by self.conf("vhost-root") for example.
self.parsed_paths = [] # type: List[str]
# These will be set in the prepare function
self._prepared = False
self.parser = None
self.parser_root = None
self.version = version
self._openssl_version = openssl_version
self.vhosts = None
self.options = copy.deepcopy(self.OS_DEFAULTS)
self._enhance_func = {"redirect": self._enable_redirect,
@@ -218,6 +250,59 @@ class ApacheConfigurator(common.Installer):
"""Full absolute path to digest of updated SSL configuration file."""
return os.path.join(self.config.config_dir, constants.UPDATED_MOD_SSL_CONF_DIGEST)
def _open_module_file(self, ssl_module_location):
"""Extract the open lines of openssl_version for testing purposes"""
try:
with open(ssl_module_location, mode="rb") as f:
contents = f.read()
except IOError as error:
logger.debug(str(error), exc_info=True)
return None
return contents
def openssl_version(self, warn_on_no_mod_ssl=True):
"""Lazily retrieve openssl version
:param bool warn_on_no_mod_ssl: `True` if we should warn if mod_ssl is not found. Set to
`False` when we know we'll try to enable mod_ssl later. This is currently debian/ubuntu,
when called from `prepare`.
:return: the OpenSSL version as a string, or None.
:rtype: str or None
"""
if self._openssl_version:
return self._openssl_version
# Step 1. Determine the location of ssl_module
try:
ssl_module_location = self.parser.modules['ssl_module']
except KeyError:
if warn_on_no_mod_ssl:
logger.warning("Could not find ssl_module; not disabling session tickets.")
return None
if ssl_module_location:
# Possibility A: ssl_module is a DSO
ssl_module_location = self.parser.standard_path_from_server_root(ssl_module_location)
else:
# Possibility B: ssl_module is statically linked into Apache
if self.option("bin"):
ssl_module_location = self.option("bin")
else:
logger.warning("ssl_module is statically linked but --apache-bin is "
"missing; not disabling session tickets.")
return None
# Step 2. Grep in the binary for openssl version
contents = self._open_module_file(ssl_module_location)
if not contents:
logger.warning("Unable to read ssl_module file; not disabling session tickets.")
return None
# looks like: OpenSSL 1.0.2s 28 May 2019
matches = re.findall(br"OpenSSL ([0-9]\.[^ ]+) ", contents)
if not matches:
logger.warning("Could not find OpenSSL version; not disabling session tickets.")
return None
self._openssl_version = matches[0].decode('UTF-8')
return self._openssl_version
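openssl_version() locates mod_ssl (either the DSO or the httpd binary itself, via --apache-bin) and greps its bytes for a version banner. A tiny sketch of the same regex against illustrative file contents (the byte string below is made up):

import re

contents = b"\x00OpenSSL 1.0.2s  28 May 2019\x00"  # illustrative excerpt of a mod_ssl binary
matches = re.findall(br"OpenSSL ([0-9]\.[^ ]+) ", contents)
print(matches[0].decode("UTF-8"))  # "1.0.2s"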
def prepare(self):
"""Prepare the authenticator/installer.
@@ -250,14 +335,26 @@ class ApacheConfigurator(common.Installer):
# Perform the actual Augeas initialization to be able to react
self.parser = self.get_parser()
# Set up ParserNode root
pn_meta = {"augeasparser": self.parser,
"augeaspath": self.parser.get_root_augpath(),
"ac_ast": None}
if self.USE_PARSERNODE:
self.parser_root = self.get_parsernode_root(pn_meta)
self.parsed_paths = self.parser_root.parsed_paths()
# Check for errors in parsing files with Augeas
self.parser.check_parsing_errors("httpd.aug")
# Get all of the available vhosts
self.vhosts = self.get_virtual_hosts()
# We may try to enable mod_ssl later. If so, we shouldn't warn if we can't find it now.
# This is currently only true for debian/ubuntu.
warn_on_no_mod_ssl = not self.option("handle_modules")
self.install_ssl_options_conf(self.mod_ssl_conf,
self.updated_mod_ssl_conf_digest)
self.updated_mod_ssl_conf_digest,
warn_on_no_mod_ssl)
# Prevent two Apache plugins from modifying a config at once
try:
@@ -345,6 +442,28 @@ class ApacheConfigurator(common.Installer):
self.option("server_root"), self.conf("vhost-root"),
self.version, configurator=self)
def get_parsernode_root(self, metadata):
"""Initializes the ParserNode parser root instance."""
if HAS_APACHECONFIG:
apache_vars = {}
apache_vars["defines"] = apache_util.parse_defines(self.option("ctl"))
apache_vars["includes"] = apache_util.parse_includes(self.option("ctl"))
apache_vars["modules"] = apache_util.parse_modules(self.option("ctl"))
metadata["apache_vars"] = apache_vars
with open(self.parser.loc["root"]) as f:
with apacheconfig.make_loader(writable=True,
**apacheconfig.flavors.NATIVE_APACHE) as loader:
metadata["ac_ast"] = loader.loads(f.read())
return dualparser.DualBlockNode(
name=assertions.PASS,
ancestor=None,
filepath=self.parser.loc["root"],
metadata=metadata
)
def _wildcard_domain(self, domain):
"""
Checks if domain is a wildcard domain
@@ -391,7 +510,7 @@ class ApacheConfigurator(common.Installer):
counterpart, should one get created
:returns: List of VirtualHosts or None
:rtype: `list` of :class:`~certbot_apache.obj.VirtualHost`
:rtype: `list` of :class:`~certbot_apache._internal.obj.VirtualHost`
"""
if self._wildcard_domain(domain):
@@ -439,7 +558,7 @@ class ApacheConfigurator(common.Installer):
# Go through the vhosts, making sure that we cover all the names
# present, but preferring the SSL vhosts
filtered_vhosts = dict()
filtered_vhosts = {}
for vhost in vhosts:
for name in vhost.get_names():
if vhost.ssl:
@@ -450,7 +569,7 @@ class ApacheConfigurator(common.Installer):
filtered_vhosts[name] = vhost
# Only unique VHost objects
dialog_input = set([vhost for vhost in filtered_vhosts.values()])
dialog_input = set(filtered_vhosts.values())
# Ask the user which of names to enable, expect list of names back
dialog_output = display_ops.select_vhost_multiple(list(dialog_input))
@@ -465,7 +584,7 @@ class ApacheConfigurator(common.Installer):
# Make sure we create SSL vhosts for the ones that are HTTP only
# if requested.
return_vhosts = list()
return_vhosts = []
for vhost in dialog_output:
if not vhost.ssl:
return_vhosts.append(self.make_vhost_ssl(vhost))
@@ -486,6 +605,11 @@ class ApacheConfigurator(common.Installer):
# cert_key... can all be parsed appropriately
self.prepare_server_https("443")
# If we haven't managed to enable mod_ssl by this point, error out
if "ssl_module" not in self.parser.modules:
raise errors.MisconfigurationError("Could not find ssl_module; "
"not installing certificate.")
# Add directives and remove duplicates
self._add_dummy_ssl_directives(vhost.path)
self._clean_vhost(vhost)
@@ -500,21 +624,6 @@ class ApacheConfigurator(common.Installer):
path["chain_path"] = self.parser.find_dir(
"SSLCertificateChainFile", None, vhost.path)
# Handle errors when certificate/key directives cannot be found
if not path["cert_path"]:
logger.warning(
"Cannot find an SSLCertificateFile directive in %s. "
"VirtualHost was not modified", vhost.path)
raise errors.PluginError(
"Unable to find an SSLCertificateFile directive")
elif not path["cert_key"]:
logger.warning(
"Cannot find an SSLCertificateKeyFile directive for "
"certificate in %s. VirtualHost was not modified", vhost.path)
raise errors.PluginError(
"Unable to find an SSLCertificateKeyFile directive for "
"certificate")
logger.info("Deploying Certificate to VirtualHost %s", vhost.filep)
if self.version < (2, 4, 8) or (chain_path and not fullchain_path):
@@ -566,7 +675,7 @@ class ApacheConfigurator(common.Installer):
counterpart, should one get created
:returns: vhost associated with name
:rtype: :class:`~certbot_apache.obj.VirtualHost`
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
:raises .errors.PluginError: If no vhost is available or chosen
@@ -601,9 +710,9 @@ class ApacheConfigurator(common.Installer):
"in the Apache config.",
target_name)
raise errors.PluginError("No vhost selected")
elif temp:
if temp:
return vhost
elif not vhost.ssl:
if not vhost.ssl:
addrs = self._get_proposed_addrs(vhost, "443")
# TODO: Conflicts is too conservative
if not any(vhost.enabled and vhost.conflicts(addrs) for
@@ -669,7 +778,7 @@ class ApacheConfigurator(common.Installer):
:param str target_name: domain handled by the desired vhost
:param vhosts: vhosts to consider
:type vhosts: `collections.Iterable` of :class:`~certbot_apache.obj.VirtualHost`
:type vhosts: `collections.Iterable` of :class:`~certbot_apache._internal.obj.VirtualHost`
:param bool filter_defaults: whether a vhost with a _default_
addr is acceptable
@@ -761,7 +870,7 @@ class ApacheConfigurator(common.Installer):
return util.get_filtered_names(all_names)
def get_name_from_ip(self, addr): # pylint: disable=no-self-use
def get_name_from_ip(self, addr):
"""Returns a reverse dns name if available.
:param addr: IP Address
@@ -811,7 +920,7 @@ class ApacheConfigurator(common.Installer):
"""Helper function for get_virtual_hosts().
:param host: In progress vhost whose names will be added
:type host: :class:`~certbot_apache.obj.VirtualHost`
:type host: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
@@ -830,7 +939,7 @@ class ApacheConfigurator(common.Installer):
:param str path: Augeas path to virtual host
:returns: newly created vhost
:rtype: :class:`~certbot_apache.obj.VirtualHost`
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
addrs = set()
@@ -869,9 +978,32 @@ class ApacheConfigurator(common.Installer):
return vhost
def get_virtual_hosts(self):
"""
Temporary wrapper for the legacy and ParserNode versions of
get_virtual_hosts. This should be replaced with the ParserNode
implementation when ready.
"""
v1_vhosts = self.get_virtual_hosts_v1()
if self.USE_PARSERNODE and HAS_APACHECONFIG:
v2_vhosts = self.get_virtual_hosts_v2()
for v1_vh in v1_vhosts:
found = False
for v2_vh in v2_vhosts:
if assertions.isEqualVirtualHost(v1_vh, v2_vh):
found = True
break
if not found:
raise AssertionError("Equivalent for {} was not found".format(v1_vh.path))
return v2_vhosts
return v1_vhosts
def get_virtual_hosts_v1(self):
"""Returns list of virtual hosts found in the Apache configuration.
:returns: List of :class:`~certbot_apache.obj.VirtualHost`
:returns: List of :class:`~certbot_apache._internal.obj.VirtualHost`
objects found in configuration
:rtype: list
@@ -921,6 +1053,80 @@ class ApacheConfigurator(common.Installer):
vhs.append(new_vhost)
return vhs
def get_virtual_hosts_v2(self):
"""Returns list of virtual hosts found in the Apache configuration using
ParserNode interface.
:returns: List of :class:`~certbot_apache.obj.VirtualHost`
objects found in configuration
:rtype: list
"""
vhs = []
vhosts = self.parser_root.find_blocks("VirtualHost", exclude=False)
for vhblock in vhosts:
vhs.append(self._create_vhost_v2(vhblock))
return vhs
def _create_vhost_v2(self, node):
"""Used by get_virtual_hosts_v2 to create vhost objects using ParserNode
interfaces.
:param interfaces.BlockNode node: The BlockNode object of VirtualHost block
:returns: newly created vhost
:rtype: :class:`~certbot_apache.obj.VirtualHost`
"""
addrs = set()
for param in node.parameters:
addrs.add(obj.Addr.fromstring(param))
is_ssl = False
# Exclusion to match the behavior in get_virtual_hosts_v2
sslengine = node.find_directives("SSLEngine", exclude=False)
if sslengine:
for directive in sslengine:
if directive.parameters[0].lower() == "on":
is_ssl = True
break
# "SSLEngine on" might be set outside of <VirtualHost>
# Treat vhosts with port 443 as ssl vhosts
for addr in addrs:
if addr.get_port() == "443":
is_ssl = True
enabled = apache_util.included_in_paths(node.filepath, self.parsed_paths)
macro = False
# Check if the VirtualHost is contained in a mod_macro block
if node.find_ancestors("Macro"):
macro = True
vhost = obj.VirtualHost(
node.filepath, None, addrs, is_ssl, enabled, modmacro=macro, node=node
)
self._populate_vhost_names_v2(vhost)
return vhost
def _populate_vhost_names_v2(self, vhost):
"""Helper function that populates the VirtualHost names.
:param host: In progress vhost whose names will be added
:type host: :class:`~certbot_apache.obj.VirtualHost`
"""
servername_match = vhost.node.find_directives("ServerName",
exclude=False)
serveralias_match = vhost.node.find_directives("ServerAlias",
exclude=False)
servername = None
if servername_match:
servername = servername_match[-1].parameters[-1]
if not vhost.modmacro:
for alias in serveralias_match:
for serveralias in alias.parameters:
vhost.aliases.add(serveralias)
vhost.name = servername
def is_name_vhost(self, target_addr):
"""Returns if vhost is a name based vhost
@@ -928,7 +1134,7 @@ class ApacheConfigurator(common.Installer):
now NameVirtualHosts. If version is earlier than 2.4, check if addr
has a NameVirtualHost directive in the Apache config
:param certbot_apache.obj.Addr target_addr: vhost address
:param certbot_apache._internal.obj.Addr target_addr: vhost address
:returns: Success
:rtype: bool
@@ -946,19 +1152,18 @@ class ApacheConfigurator(common.Installer):
"""Adds NameVirtualHost directive for given address.
:param addr: Address that will be added as NameVirtualHost directive
:type addr: :class:`~certbot_apache.obj.Addr`
:type addr: :class:`~certbot_apache._internal.obj.Addr`
"""
loc = parser.get_aug_path(self.parser.loc["name"])
if addr.get_port() == "443":
path = self.parser.add_dir_to_ifmodssl(
self.parser.add_dir_to_ifmodssl(
loc, "NameVirtualHost", [str(addr)])
else:
path = self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
msg = ("Setting %s to be NameBasedVirtualHost\n"
"\tDirective added to %s\n" % (addr, path))
msg = "Setting {0} to be NameBasedVirtualHost\n".format(addr)
logger.debug(msg)
self.save_notes += msg
@@ -1114,6 +1319,14 @@ class ApacheConfigurator(common.Installer):
self.enable_mod("socache_shmcb", temp=temp)
if "ssl_module" not in self.parser.modules:
self.enable_mod("ssl", temp=temp)
# Make sure we're not throwing away any unwritten changes to the config
self.parser.ensure_augeas_state()
self.parser.aug.load()
self.parser.reset_modules() # Reset to load the new ssl_module path
# Call again because now we can gate on openssl version
self.install_ssl_options_conf(self.mod_ssl_conf,
self.updated_mod_ssl_conf_digest,
warn_on_no_mod_ssl=True)
def make_vhost_ssl(self, nonssl_vhost):
"""Makes an ssl_vhost version of a nonssl_vhost.
@@ -1125,10 +1338,10 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param nonssl_vhost: Valid VH that doesn't have SSLEngine on
:type nonssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:type nonssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: SSL vhost
:rtype: :class:`~certbot_apache.obj.VirtualHost`
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
:raises .errors.PluginError: If more than one virtual host is in
the file or if plugin is unable to write/read vhost files.
@@ -1249,7 +1462,7 @@ class ApacheConfigurator(common.Installer):
if not line.lower().lstrip().startswith("rewriterule"):
return False
# According to: http://httpd.apache.org/docs/2.4/rewrite/flags.html
# According to: https://httpd.apache.org/docs/2.4/rewrite/flags.html
# The syntax of a RewriteRule is:
# RewriteRule pattern target [Flag1,Flag2,Flag3]
# i.e. target is required, so it must exist.
@@ -1366,12 +1579,9 @@ class ApacheConfigurator(common.Installer):
result.append(comment)
sift = True
result.append('\n'.join(
['# ' + l for l in chunk]))
continue
result.append('\n'.join('# ' + l for l in chunk))
else:
result.append('\n'.join(chunk))
continue
return result, sift
def _get_vhost_block(self, vhost):
@@ -1499,7 +1709,7 @@ class ApacheConfigurator(common.Installer):
https://httpd.apache.org/docs/2.2/mod/core.html#namevirtualhost
:param vhost: New virtual host that was recently created.
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
need_to_save = False
@@ -1509,7 +1719,7 @@ class ApacheConfigurator(common.Installer):
for addr in vhost.addrs:
# In Apache 2.2, when a NameVirtualHost directive is not
# set, "*" and "_default_" will conflict when sharing a port
addrs = set((addr,))
addrs = {addr,}
if addr.get_addr() in ("*", "_default_"):
addrs.update(obj.Addr((a, addr.get_port(),))
for a in ("*", "_default_"))
@@ -1534,7 +1744,7 @@ class ApacheConfigurator(common.Installer):
:param str id_str: Id string for matching
:returns: The matched VirtualHost or None
:rtype: :class:`~certbot_apache.obj.VirtualHost` or None
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost` or None
:raises .errors.PluginError: If no VirtualHost is found
"""
@@ -1551,7 +1761,7 @@ class ApacheConfigurator(common.Installer):
used for keeping track of VirtualHost directive over time.
:param vhost: Virtual host to add the id
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: The unique ID or None
:rtype: str or None
@@ -1573,7 +1783,7 @@ class ApacheConfigurator(common.Installer):
If ID already exists, returns that instead.
:param vhost: Virtual host to add or find the id
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: The unique ID for vhost
:rtype: str or None
@@ -1602,7 +1812,7 @@ class ApacheConfigurator(common.Installer):
######################################################################
# Enhancements
######################################################################
def supported_enhancements(self): # pylint: disable=no-self-use
def supported_enhancements(self):
"""Returns currently supported enhancements."""
return ["redirect", "ensure-http-header", "staple-ocsp"]
@@ -1651,7 +1861,7 @@ class ApacheConfigurator(common.Installer):
"""Increase the AutoHSTS max-age value
:param vhost: Virtual host object to modify
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:param str id_str: The unique ID string of VirtualHost
@@ -1700,7 +1910,7 @@ class ApacheConfigurator(common.Installer):
try:
self._autohsts = self.storage.fetch("autohsts")
except KeyError:
self._autohsts = dict()
self._autohsts = {}
def _autohsts_save_state(self):
"""
@@ -1735,13 +1945,13 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:param unused_options: Not currently used
:type unused_options: Not Available
:returns: Success, general_vhost (HTTP vhost)
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
"""
min_apache_ver = (2, 3, 3)
@@ -1791,14 +2001,14 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:param header_substring: string that uniquely identifies a header.
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
:type header_substring: str
:returns: Success, general_vhost (HTTP vhost)
:rtype: (bool, :class:`~certbot_apache.obj.VirtualHost`)
:rtype: (bool, :class:`~certbot_apache._internal.obj.VirtualHost`)
:raises .errors.PluginError: If no viable HTTP host can be created or
set with header header_substring.
@@ -1822,11 +2032,11 @@ class ApacheConfigurator(common.Installer):
ssl_vhost.filep)
def _verify_no_matching_http_header(self, ssl_vhost, header_substring):
"""Checks to see if an there is an existing Header directive that
"""Checks to see if there is an existing Header directive that
contains the string header_substring.
:param ssl_vhost: vhost to check
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:param header_substring: string that uniquely identifies a header.
e.g: Strict-Transport-Security, Upgrade-Insecure-Requests.
@@ -1863,7 +2073,7 @@ class ApacheConfigurator(common.Installer):
.. note:: This function saves the configuration
:param ssl_vhost: Destination of traffic, an ssl enabled vhost
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:param unused_options: Not currently used
:type unused_options: Not Available
@@ -1948,7 +2158,7 @@ class ApacheConfigurator(common.Installer):
delete certbot's old rewrite rules and set the new one instead.
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:raises errors.PluginEnhancementAlreadyPresent: When the exact
certbot redirection WriteRule exists in virtual host.
@@ -1990,7 +2200,7 @@ class ApacheConfigurator(common.Installer):
"""Checks if there exists a RewriteRule directive in vhost
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: True if a RewriteRule directive exists.
:rtype: bool
@@ -2004,7 +2214,7 @@ class ApacheConfigurator(common.Installer):
"""Checks if a RewriteEngine directive is on
:param vhost: vhost to check
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
rewrite_engine_path_list = self.parser.find_dir("RewriteEngine", "on",
@@ -2021,10 +2231,10 @@ class ApacheConfigurator(common.Installer):
"""Creates an http_vhost specifically to redirect for the ssl_vhost.
:param ssl_vhost: ssl vhost
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost`
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:returns: tuple of the form
(`success`, :class:`~certbot_apache.obj.VirtualHost`)
(`success`, :class:`~certbot_apache._internal.obj.VirtualHost`)
:rtype: tuple
"""
@@ -2150,7 +2360,7 @@ class ApacheConfigurator(common.Installer):
of this method where available.
:param vhost: vhost to enable
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
:raises .errors.NotSupportedError: If filesystem layout is not
supported.
@@ -2168,7 +2378,7 @@ class ApacheConfigurator(common.Installer):
vhost.enabled = True
return
def enable_mod(self, mod_name, temp=False): # pylint: disable=unused-argument
def enable_mod(self, mod_name, temp=False):
"""Enables module in Apache.
Both enables and reloads Apache so module is active.
@@ -2226,7 +2436,7 @@ class ApacheConfigurator(common.Installer):
error = str(err)
raise errors.MisconfigurationError(error)
def config_test(self): # pylint: disable=no-self-use
def config_test(self):
"""Check the configuration of Apache for errors.
:raises .errors.MisconfigurationError: If config_test fails
@@ -2261,7 +2471,7 @@ class ApacheConfigurator(common.Installer):
if len(matches) != 1:
raise errors.PluginError("Unable to find Apache version")
return tuple([int(i) for i in matches[0].split(".")])
return tuple(int(i) for i in matches[0].split("."))
def more_info(self):
"""Human-readable string to help understand the module"""
@@ -2276,7 +2486,7 @@ class ApacheConfigurator(common.Installer):
###########################################################################
# Challenges Section
###########################################################################
def get_chall_pref(self, unused_domain): # pylint: disable=no-self-use
def get_chall_pref(self, unused_domain):
"""Return list of challenge preferences."""
return [challenges.HTTP01]
@@ -2330,14 +2540,19 @@ class ApacheConfigurator(common.Installer):
self.restart()
self.parser.reset_modules()
def install_ssl_options_conf(self, options_ssl, options_ssl_digest):
"""Copy Certbot's SSL options file into the system's config dir if required."""
def install_ssl_options_conf(self, options_ssl, options_ssl_digest, warn_on_no_mod_ssl=True):
"""Copy Certbot's SSL options file into the system's config dir if required.
:param bool warn_on_no_mod_ssl: True if we should warn if mod_ssl is not found.
"""
# XXX if we ever try to enforce a local privilege boundary (eg, running
# certbot for unprivileged users via setuid), this function will need
# to be modified.
return common.install_version_controlled_file(options_ssl, options_ssl_digest,
self.option("MOD_SSL_CONF_SRC"), constants.ALL_SSL_OPTIONS_HASHES)
apache_config_path = self.pick_apache_config(warn_on_no_mod_ssl)
return common.install_version_controlled_file(
options_ssl, options_ssl_digest, apache_config_path, constants.ALL_SSL_OPTIONS_HASHES)
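# A simplified sketch of the version-controlled install relied on above; the
# function and argument names here are illustrative, not the actual
# certbot.plugins.common implementation. The idea: the shipped options file
# only overwrites the copy in the config dir when that copy is missing or
# still matches the digest of a previously shipped version, i.e. was not
# edited by the user.
import hashlib

def _install_if_unmodified_sketch(src, dest, known_hashes):
    try:
        with open(dest, "rb") as dest_file:
            current = hashlib.sha256(dest_file.read()).hexdigest()
    except IOError:
        current = None
    if current is None or current in known_hashes:
        with open(src, "rb") as src_file, open(dest, "wb") as new_dest:
            new_dest.write(src_file.read())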
def enable_autohsts(self, _unused_lineage, domains):
"""
@@ -2347,7 +2562,7 @@ class ApacheConfigurator(common.Installer):
:type _unused_lineage: certbot._internal.storage.RenewableCert
:param domains: List of domains in certificate to enhance
:type domains: str
:type domains: `list` of `str`
"""
self._autohsts_fetch_state()
@@ -2387,7 +2602,7 @@ class ApacheConfigurator(common.Installer):
"""Do the initial AutoHSTS deployment to a vhost
:param ssl_vhost: The VirtualHost object to deploy the AutoHSTS
:type ssl_vhost: :class:`~certbot_apache.obj.VirtualHost` or None
:type ssl_vhost: :class:`~certbot_apache._internal.obj.VirtualHost` or None
:raises errors.PluginEnhancementAlreadyPresent: When already enhanced
@@ -2514,4 +2729,4 @@ class ApacheConfigurator(common.Installer):
self._autohsts_save_state()
AutoHSTSEnhancement.register(ApacheConfigurator) # pylint: disable=no-member
AutoHSTSEnhancement.register(ApacheConfigurator)
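# For context, a minimal sketch of the ABC virtual-subclass registration used
# by the register() call above (assuming AutoHSTSEnhancement is an
# abc.ABCMeta-based interface, which register() implies; the names below are
# illustrative): registration makes the configurator pass isinstance/issubclass
# checks against the enhancement interface without actually inheriting from it.
import abc

class _EnhancementSketch(abc.ABC):
    @abc.abstractmethod
    def enable_autohsts(self, lineage, domains):
        ...

class _ConfiguratorSketch:
    def enable_autohsts(self, lineage, domains):
        return None

_EnhancementSketch.register(_ConfiguratorSketch)
assert issubclass(_ConfiguratorSketch, _EnhancementSketch)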

View File

@@ -1,6 +1,7 @@
"""Apache plugin constants."""
import pkg_resources
from certbot.compat import os
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""
@@ -23,11 +24,14 @@ ALL_SSL_OPTIONS_HASHES = [
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
'5cc003edd93fb9cd03d40c7686495f8f058f485f75b5e764b789245a386e6daf',
'007cd497a56a3bb8b6a2c1aeb4997789e7e38992f74e44cc5d13a625a738ac73',
'34783b9e2210f5c4a23bced2dfd7ec289834716673354ed7c7abf69fe30192a3',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
AUGEAS_LENS_DIR = pkg_resources.resource_filename(
"certbot_apache", "augeas_lens")
"certbot_apache", os.path.join("_internal", "augeas_lens"))
"""Path to the Augeas lens directory"""
REWRITE_HTTPS_ARGS = [

View File

@@ -3,10 +3,10 @@ import logging
import zope.component
import certbot.display.util as display_util
from certbot import errors
from certbot import interfaces
from certbot.compat import os
import certbot.display.util as display_util
logger = logging.getLogger(__name__)
@@ -21,7 +21,7 @@ def select_vhost_multiple(vhosts):
:rtype: :class:`list` of type `~obj.Vhost`
"""
if not vhosts:
return list()
return []
tags_list = [vhost.display_repr()+"\n" for vhost in vhosts]
# Remove the extra newline from the last entry
if tags_list:
@@ -37,7 +37,7 @@ def select_vhost_multiple(vhosts):
def _reversemap_vhosts(names, vhosts):
"""Helper function for select_vhost_multiple for mapping string
representations back to actual vhost objects"""
return_vhosts = list()
return_vhosts = []
for selection in names:
for vhost in vhosts:
@@ -77,7 +77,7 @@ def _vhost_menu(domain, vhosts):
if free_chars < 2:
logger.debug("Display size is too small for "
"certbot_apache.display_ops._vhost_menu()")
"certbot_apache._internal.display_ops._vhost_menu()")
# This runs the edge off the screen, but it doesn't cause an "error"
filename_size = 1
disp_name_size = 1

View File

@@ -0,0 +1,306 @@
""" Dual ParserNode implementation """
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser
from certbot_apache._internal import apacheparser
class DualNodeBase(object):
""" Dual parser interface for in development testing. This is used as the
base class for dual parser interface classes. This class handles runtime
attribute value assertions."""
def save(self, msg): # pragma: no cover
""" Call save for both parsers """
self.primary.save(msg)
self.secondary.save(msg)
def __getattr__(self, aname):
""" Attribute value assertion """
firstval = getattr(self.primary, aname)
secondval = getattr(self.secondary, aname)
exclusions = [
# Metadata will inherently be different, as ApacheParserNode does
# not have Augeas paths and so on.
aname == "metadata",
callable(firstval)
]
if not any(exclusions):
assertions.assertEqualSimple(firstval, secondval)
return firstval
def find_ancestors(self, name):
""" Traverses the ancestor tree and returns ancestors matching name """
return self._find_helper(DualBlockNode, "find_ancestors", name)
def _find_helper(self, nodeclass, findfunc, search, **kwargs):
"""A helper for find_* functions. The function specific attributes should
be passed as keyword arguments.
:param interfaces.ParserNode nodeclass: The node class for results.
:param str findfunc: Name of the find function to call
:param str search: The search term
"""
primary_res = getattr(self.primary, findfunc)(search, **kwargs)
secondary_res = getattr(self.secondary, findfunc)(search, **kwargs)
# The order of search results for Augeas implementation cannot be
# assured.
pass_primary = assertions.isPassNodeList(primary_res)
pass_secondary = assertions.isPassNodeList(secondary_res)
new_nodes = []
if pass_primary and pass_secondary:
# Both unimplemented
new_nodes.append(nodeclass(primary=primary_res[0],
secondary=secondary_res[0])) # pragma: no cover
elif pass_primary:
for c in secondary_res:
new_nodes.append(nodeclass(primary=primary_res[0],
secondary=c))
elif pass_secondary:
for c in primary_res:
new_nodes.append(nodeclass(primary=c,
secondary=secondary_res[0]))
else:
assert len(primary_res) == len(secondary_res)
matches = self._create_matching_list(primary_res, secondary_res)
for p, s in matches:
new_nodes.append(nodeclass(primary=p, secondary=s))
return new_nodes
class DualCommentNode(DualNodeBase):
""" Dual parser implementation of CommentNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of CommentNode objects as well as creating a DualCommentNode object
using precreated or fetched CommentNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.CommentNode:
:param CommentNode primary: Primary pre-created CommentNode, mainly
used when creating new DualParser nodes using add_* methods.
:param CommentNode secondary: Secondary pre-created CommentNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasCommentNode(**kwargs)
self.secondary = apacheparser.ApacheCommentNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
class DualDirectiveNode(DualNodeBase):
""" Dual parser implementation of DirectiveNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of DirectiveNode objects as well as creating a DualDirectiveNode object
using precreated or fetched DirectiveNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.DirectiveNode:
:param DirectiveNode primary: Primary pre-created DirectiveNode, mainly
used when creating new DualParser nodes using add_* methods.
:param DirectiveNode secondary: Secondary pre-created DirectiveNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasDirectiveNode(**kwargs)
self.secondary = apacheparser.ApacheDirectiveNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
def set_parameters(self, parameters):
""" Sets parameters and asserts that both implementation successfully
set the parameter sequence """
self.primary.set_parameters(parameters)
self.secondary.set_parameters(parameters)
assertions.assertEqual(self.primary, self.secondary)
class DualBlockNode(DualNodeBase):
""" Dual parser implementation of BlockNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of BlockNode objects as well as creating a DualBlockNode object
using precreated or fetched BlockNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.BlockNode:
:param BlockNode primary: Primary pre-created BlockNode, mainly
used when creating new DualParser nodes using add_* methods.
:param BlockNode secondary: Secondary pre-created BlockNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasBlockNode(**kwargs)
self.secondary = apacheparser.ApacheBlockNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
def add_child_block(self, name, parameters=None, position=None):
""" Creates a new child BlockNode, asserts that both implementations
did it in a similar way, and returns a newly created DualBlockNode object
encapsulating both of the newly created objects """
primary_new = self.primary.add_child_block(name, parameters, position)
secondary_new = self.secondary.add_child_block(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_block = DualBlockNode(primary=primary_new, secondary=secondary_new)
return new_block
def add_child_directive(self, name, parameters=None, position=None):
""" Creates a new child DirectiveNode, asserts that both implementations
did it in a similar way, and returns a newly created DualDirectiveNode
object encapsulating both of the newly created objects """
primary_new = self.primary.add_child_directive(name, parameters, position)
secondary_new = self.secondary.add_child_directive(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_dir = DualDirectiveNode(primary=primary_new, secondary=secondary_new)
return new_dir
def add_child_comment(self, comment="", position=None):
""" Creates a new child CommentNode, asserts that both implementations
did it in a similar way, and returns a newly created DualCommentNode
object encapsulating both of the newly created objects """
primary_new = self.primary.add_child_comment(comment, position)
secondary_new = self.secondary.add_child_comment(comment, position)
assertions.assertEqual(primary_new, secondary_new)
new_comment = DualCommentNode(primary=primary_new, secondary=secondary_new)
return new_comment
def _create_matching_list(self, primary_list, secondary_list):
""" Matches the list of primary_list to a list of secondary_list and
returns a list of tuples. This is used to create results for find_
methods.
This helper function exists, because we cannot ensure that the list of
search results returned by primary.find_* and secondary.find_* are ordered
in a same way. The function pairs the same search results from both
implementations to a list of tuples.
"""
matched = []
for p in primary_list:
match = None
for s in secondary_list:
try:
assertions.assertEqual(p, s)
match = s
break
except AssertionError:
continue
if match:
matched.append((p, match))
else:
raise AssertionError("Could not find a matching node.")
return matched
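# A minimal demonstration of why the pairing above is needed (illustrative
# data, not Certbot objects): the two parsers may return the same results in
# different orders, and the pairing realigns them.
_primary = ["Listen 80", "ServerName a", "ServerName b"]
_secondary = ["ServerName b", "Listen 80", "ServerName a"]
_pairs = [(p, next(s for s in _secondary if s == p)) for p in _primary]
assert _pairs == [("Listen 80", "Listen 80"),
                  ("ServerName a", "ServerName a"),
                  ("ServerName b", "ServerName b")]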
def find_blocks(self, name, exclude=True):
"""
Performs a search for BlockNodes using both implementations and does simple
checks for results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualBlockNode
instances that encapsulate the pairs of returned BlockNode objects.
"""
return self._find_helper(DualBlockNode, "find_blocks", name,
exclude=exclude)
def find_directives(self, name, exclude=True):
"""
Performs a search for DirectiveNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualDirectiveNode
instances that encapsulate the pairs of returned DirectiveNode objects.
"""
return self._find_helper(DualDirectiveNode, "find_directives", name,
exclude=exclude)
def find_comments(self, comment):
"""
Performs a search for CommentNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualCommentNode
instances that encapsulate the pairs of returned CommentNode objects.
"""
return self._find_helper(DualCommentNode, "find_comments", comment)
def delete_child(self, child):
"""Deletes a child from the ParserNode implementations. The actual
ParserNode implementations are used here directly in order to be able
to match a child to the list of children."""
self.primary.delete_child(child.primary)
self.secondary.delete_child(child.secondary)
def unsaved_files(self):
""" Fetches the list of unsaved file paths and asserts that the lists
match """
primary_files = self.primary.unsaved_files()
secondary_files = self.secondary.unsaved_files()
assertions.assertEqualSimple(primary_files, secondary_files)
return primary_files
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""
primary_paths = self.primary.parsed_paths()
secondary_paths = self.secondary.parsed_paths()
assertions.assertEqualPathsList(primary_paths, secondary_paths)
return primary_paths

View File

@@ -1,18 +1,15 @@
""" Entry point for Apache Plugin """
# Pylint does not like distutils.version when running inside a venv.
# See: https://github.com/PyCQA/pylint/issues/73
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error
from distutils.version import LooseVersion
from certbot import util
from certbot_apache import configurator
from certbot_apache import override_arch
from certbot_apache import override_fedora
from certbot_apache import override_darwin
from certbot_apache import override_debian
from certbot_apache import override_centos
from certbot_apache import override_gentoo
from certbot_apache import override_suse
from certbot_apache._internal import configurator
from certbot_apache._internal import override_arch
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_darwin
from certbot_apache._internal import override_debian
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_gentoo
from certbot_apache._internal import override_suse
OVERRIDE_CLASSES = {
"arch": override_arch.ArchConfigurator,

View File

@@ -1,15 +1,15 @@
"""A class that performs HTTP-01 challenges for Apache"""
import logging
import errno
from acme.magic_typing import List, Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List
from acme.magic_typing import Set
from certbot import errors
from certbot.compat import os
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot_apache.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache.parser import get_aug_path
from certbot_apache._internal.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache._internal.parser import get_aug_path
logger = logging.getLogger(__name__)
@@ -169,7 +169,15 @@ class ApacheHttp01(common.ChallengePerformer):
def _set_up_challenges(self):
if not os.path.isdir(self.challenge_dir):
filesystem.makedirs(self.challenge_dir, 0o755)
old_umask = filesystem.umask(0o022)
try:
filesystem.makedirs(self.challenge_dir, 0o755)
except OSError as exception:
if exception.errno not in (errno.EEXIST, errno.EISDIR):
raise errors.PluginError(
"Couldn't create root for http-01 challenge")
finally:
filesystem.umask(old_umask)
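# A hedged sketch of why the umask is pinned around makedirs above
# (illustrative helper, not Certbot code): the OS applies the process umask to
# the requested mode, so a strict umask such as 0o077 would otherwise leave the
# challenge directories unreadable by the web server's worker processes.
import os

def _make_world_readable_dir_sketch(path, mode=0o755):
    old_umask = os.umask(0o022)
    try:
        os.makedirs(path, mode, exist_ok=True)
    finally:
        os.umask(old_umask)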
responses = []
for achall in self.achalls:
@@ -195,8 +203,8 @@ class ApacheHttp01(common.ChallengePerformer):
if vhost not in self.moded_vhosts:
logger.debug(
"Adding a temporary challenge validation Include for name: %s " +
"in: %s", vhost.name, vhost.filep)
"Adding a temporary challenge validation Include for name: %s in: %s",
vhost.name, vhost.filep)
self.configurator.parser.add_dir_beginning(
vhost.path, "Include", self.challenge_conf_pre)
self.configurator.parser.add_dir(

Some files were not shown because too many files have changed in this diff.