Compare commits


278 Commits

Author SHA1 Message Date
Brad Warren
41b99eba79 Release 1.6.0 2020-07-07 10:33:13 -07:00
Brad Warren
de39a42e6a Update changelog for 1.6.0 release 2020-07-07 10:13:21 -07:00
Adrien Ferrand
183ccc64b1 Some improvements (#8132)
Short PR to improve some things during snap builds:
* clean up snapcraft assets before a build, in order to avoid some weird errors when two builds are executed consecutively without cleanup
* use python3 explicitly in `tools/simple_http_server.py` because on several recent distributions the `python` binary is no longer exposed, only `python2` or `python3`.
2020-07-06 16:04:59 -07:00
Brad Warren
6bca930752 Remove unnecessary symlink (#8135)
This isn't needed anymore thanks to the line:
```
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
```
2020-07-06 15:31:24 -07:00
Brad Warren
cd993cdfb1 Remove grade devel from Certbot snap. (#8128)
If you go to a URL like https://snapcraft.io/certbot/releases and try to move the Certbot snap into the candidate or stable channels, you cannot do so. There is a tooltip which says that revisions with the grade devel cannot be promoted to candidate or stable channels.

The documentation for `grade` can be found at https://snapcraft.io/docs/snapcraft-yaml-reference where it says the value is optional and

> Defines the quality grade of the snap.
Type: enum
Can be either devel (i.e. a development version of the snap, so not to be published to the stable or candidate channels) or stable (i.e. a stable release or release candidate, which can be released to all channels)
Example: [stable or devel]

I'm working on a proposal for our next steps for snaps which involves moving the Certbot snap to the stable channel. I of course won't make those changes without giving others a chance to share their opinion, but I'd like to avoid the situation where we're technically unable to move the Certbot 1.6.0 snap to the stable channel despite wanting to do so.

I started to make the same changes to the DNS plugins, but I personally think it's too soon to propose stable versions of those yet and `grade` is a simple way to ensure we don't accidentally promote something there.

You can see the snap being built and run successfully with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=2246&view=results.
2020-07-06 12:31:55 -07:00
Brad Warren
9f994d7a50 Run at 4:30 UTC to have Azure reparse YAML file. (#8133) 2020-07-06 20:41:26 +02:00
Brad Warren
4f3dc8862d Switch build status to nightly pipeline. (#8127)
The advanced pipeline no longer exists.
2020-07-02 16:05:28 -07:00
Brad Warren
48139f382d Do not build pushes to master. (#8126) 2020-07-03 01:00:35 +02:00
Adrien Ferrand
8a3a8c7097 Migrate the CI pipeline from Travis to Azure Pipeline (#8098)
Fixes #8071 and fixes https://github.com/certbot/certbot/issues/8110.

This PR migrates every job from Travis to Azure Pipelines.

This PR essentially converts the Travis jobs to Azure Pipelines with identical functionality (unless I made a mistake). The jobs are added to the relevant existing pipelines (`main`, `nightly`, `advanced-test`, `release`). A global refactoring using the templating system greatly reduces the verbosity of the pipeline descriptions.

A specific feature (not present in Travis) is added: the `On_Failure` stage. Using the Mattermost API directly, it reports pipeline failures to a Mattermost channel with a link to the failed pipelines, without the need to authenticate to Microsoft.
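For illustration only, a notification of this kind might look roughly like the Python sketch below; the use of an incoming webhook, the URL, and the payload shape are assumptions rather than the actual pipeline code:

```
# Hypothetical sketch: post a failure notice to a Mattermost incoming webhook.
# The webhook URL and message text are placeholders, not the real pipeline values.
import json
import urllib.request

def notify_failure(webhook_url: str, pipeline_url: str) -> None:
    payload = {"text": f"Pipeline failed: {pipeline_url}"}
    request = urllib.request.Request(
        webhook_url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)
```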

See https://github.com/certbot/certbot/pull/8098#issuecomment-649873641 for the post merge actions to do at the end of this work.
2020-07-02 15:01:21 -07:00
ohemorange
cb3ff9ef18 Set up CentOS 8 test farm tests (#8122)
Fixes #7420.

* Set up CentOS 8 test farm tests

* Don't add to apache2_targets until 7273 is resolved

* Start upgrade test from a version that works on centos 8

* remove when possible from targets
2020-07-01 17:07:41 -07:00
alexzorin
f743dbec3a certbot: add --preferred-chain (#8080)
* acme: add support for alternative cert. chains

* certbot: add --preferred-chain

* remove support for issuer SKI matching

* show --preferred-chain in "run" help

* warn if no chain matched and it's not a dry-run

* fix existing failing tests

* add unit, integration tests

* bump acme dependency to dev version

* simplify test to avoid py2.7 recursion bug

* add preferred_chain to STR_CONFIG_ITEMS

* reduce preferred_chain warning to info level

* acme: fix some docstrings in .messages

* certbot: fix docstring in crypto_util

* try to fix certbot-nginx acme dep problem
2020-06-30 17:45:39 -07:00
ohemorange
2af297d72f Make each DNS plugin respect EXCLUDE_CERTBOT_DEPS (#8117)
* Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set

* import os
2020-06-29 16:58:26 -07:00
Brad Warren
95ef53e5d5 Add missing spaces to manual plugin help. (#8116) 2020-06-29 13:34:24 -07:00
Brad Warren
24c5fab8b6 Add awscli to requirements.txt (#8113) 2020-06-25 16:52:56 -07:00
ohemorange
713b91495b Fix paths when calling out to programs outside of snap (#8108)
Fixes #8093.

This PR modifies and audits all uses of `subprocess` and `Popen` outside of tests, `certbot-ci/`, `certbot-compatibility-test/`, `letsencrypt-auto-source/`, `tools/`, and `windows-installer/`. Calls to outside programs have their `env` modified to remove the `SNAP` components of paths, if they exist. This includes any calls made from hooks, calls to `apachectl` and `nginx`, and to `openssl` from `ocsp.py`.
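A rough Python sketch of that idea follows; it is not Certbot's exact `env_no_snap_for_external_calls` implementation (which lives in `certbot.util`), just an illustration of stripping snap-owned entries from `PATH` before calling out:

```
# Rough sketch: build a copy of os.environ whose PATH no longer contains
# entries under the snap's own prefix, so external programs (apachectl,
# nginx, openssl, hooks) resolve the host's binaries instead.
import os

def env_without_snap_paths():
    env = os.environ.copy()
    snap_prefix = env.get("SNAP")
    if snap_prefix:
        env["PATH"] = os.pathsep.join(
            p for p in env.get("PATH", "").split(os.pathsep)
            if not p.startswith(snap_prefix)
        )
    return env
```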

For testing manually, rsync flags will look something like:

```
rsync -avzhe ssh root@focal.domain:/home/certbot/certbot/certbot_*_amd64.snap .
rsync -avzhe ssh certbot_*_amd64.snap root@centos7.domain:/root/certbot/
```

With these modifications, `certbot plugins --prepare` now passes on CentOS 7.

If I'm wrong and we package the `openssl` binary, the modifications should be removed from `ocsp.py`, and `env` should be passed into `run_script` rather than set internally in its calls from nginx and apache.

One caveat with this approach is the disconnect between why it's a problem (packaging) and where it's solved (internal to Certbot). I considered a wrapping approach, but we'd still have to audit specific calls. I think the best way to address this is robust testing; specifically, running the snap on other systems.

For hooks, all calls will remove the snap paths if they exist. This is probably fine, because even if the hook intends to call back into certbot, it can do that; it'll just create a new snap.

I'm not sure if we need these modifications for the Mac OS X/Darwin calls, but they can't hurt.

* Add method to plugins util to get env without snap paths

* Use modified environment in Nginx plugin

* Pass through env to certbot.util.run_script

* Use modified environment in Apache plugin

* move env_no_snap_for_external_calls to certbot.util

* Set env internally to run_script, since we use that only to call out

* Add env to mac subprocess calls in certbot.util

* Add env to openssl call in ocsp.py

* Add env for hooks calls in certbot.compat.misc.

* Pass env into execute_command to avoid circular dependency

* Update hook test to assert called with env

* Fix mypy type hint to account for new param

* Change signature to include Optional

* go back to using CERTBOT_PLUGIN_PATH

* no need to modify PYTHONPATH in env

* robustly detect when we're in a snap

* Improve env util fxn docstring

* Update changelog

* Add unit tests for env_no_snap_for_external_calls

* Import compat.os
2020-06-25 15:36:29 -07:00
dmmortimer
0f4c31c9c7 Generalize renewal rate limit UI warning message (#3456) (#8061)
- Old text hard-codes the rate limit
- Let's Encrypt CA might change its rate limit
- Other CAs might have different rate limits

Update CHANGELOG.md
2020-06-25 11:43:08 -07:00
dkp
b9a8248541 Remove SSL Labs From Certbot Output (#8109)
The Apache plugin expects clients to support SNI, but
SSL Labs tries without SNI and includes the results
in their score.

Closes certbot/certbot#7728
2020-06-25 11:42:07 -07:00
Brad Warren
8027430625 Correct plugin constraints. (#8104) 2020-06-23 14:16:41 -07:00
ohemorange
bce14ae65f Make DNS plugin snaps use core20 (#8106)
Fixes #8103.

* Update the DNS plugin generator script to core20 syntax

* Generate new snapcraft.yamls for the DNS plugins

* Update certbot.wrapper to search for python3.8 paths
2020-06-23 09:31:08 -07:00
Adrien Ferrand
25d1977d4f Add script and generated snapcraft.yaml files (#8096)
This PR adds a proper snapcraft.yaml file for each DNS plugin, and provides a shell script to generate them.
2020-06-22 17:07:08 -07:00
Brad Warren
46eb4ec7e3 Remove unneeded step to create constraints file. (#8102) 2020-06-22 16:48:50 -07:00
Brad Warren
3ae8fa640b Remove snap-plugin from README (#8101) 2020-06-22 15:49:18 -07:00
Adrien Ferrand
035b6514db Build the constraints file inside snapcraft (#8097)
Fixes #7957

This PR makes snapcraft generate the dependency constraints file itself during the snap build, instead of having an external script do it before calling snapcraft.
2020-06-22 13:54:18 -07:00
Brad Warren
f151099342 Do not unstage generated cffi augeas (#8094)
This line seems to refer to [augeas.py](https://github.com/hercules-team/python-augeas/blob/v0.5.0/augeas.py) from the version of Augeas we normally have pinned. This was necessary when we were installing each Certbot component in separate parts and only combining them later to ensure that the Augeas fork (which uses cffi) was used instead of the unmodified pinned version of Augeas.

Since everything is installed in one part and we're removing the Augeas pinning now though, this line is no longer necessary. You can see the snap being built and tested successfully with this change at https://travis-ci.com/github/certbot/certbot/builds/172134518.

* Do not unstage generated cffi augeas

* Add comment about deleted line.
2020-06-22 12:11:38 -07:00
Florian Klink
25e79e4aca tree-wide: use LooseVersion instead of StrictVersion (#8081)
According to `distutils/version.py`, StrictVersion is pretty strict in
what version numbers to accept:

> A version number consists of two or three dot-separated numeric
> components, with an optional "pre-release" tag on the end.  The
> pre-release tag consists of the letter 'a' or 'b' followed by a number.

This assumption already fails for some pretty basic Python libraries
themselves, like setuptools, which is also available as `46.1.3.post20200610`, a
completely valid version number according to
https://www.python.org/dev/peps/pep-0440/#post-releases.

There doesn't seem to be a particular reason why StrictVersion was
used here, so let's use LooseVersion to be compatible with these
versions.
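For illustration, the difference is easy to reproduce:

```
# StrictVersion rejects PEP 440 post-release versions that LooseVersion accepts.
from distutils.version import LooseVersion, StrictVersion

print(LooseVersion("46.1.3.post20200610") > LooseVersion("46.1.3"))  # True
StrictVersion("46.1.3.post20200610")  # raises ValueError: invalid version number
```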

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-06-19 17:11:35 +02:00
Brad Warren
db064a4109 Use certbot.wrapper during renewal. (#8095) 2020-06-18 16:41:12 -07:00
Adrien Ferrand
860af81fef Deferred EFF subscription until the first certificate is successfully issued (#8076)
* Base logic

* Various controls when email is None

* Adapt eff tests

* Forward compatibility

* Also for csr

* Explicit regr or meta updates in account objects

* Adapt logic to ask for eff subscription during registering

* Adapt tests

* Move dry-run control

* Add some relevant controls on handle_subscription call checks
2020-06-18 15:58:19 -07:00
ohemorange
70c8481fd8 Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set in plugins (#8091)
This will allow DNS plugin snaps to build if they rely on unreleased acme/certbot, and removes the extra copy of Certbot from externally snapped plugins. Fixes #8064 and fixes #7946. Implementation is based on the design [here](https://github.com/certbot/certbot/issues/8064#issuecomment-645513120).
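A hedged sketch of the setup.py pattern this design implies (the package names and structure below are illustrative, not the actual diff): skip the acme/certbot requirements when `EXCLUDE_CERTBOT_DEPS` is set, so the snapped plugin reuses the certbot snap's copies instead of bundling its own.

```
# Illustrative setup.py fragment, not the real certbot-dns-* diff.
import os

install_requires = [
    "dns-lexicon",  # illustrative third-party dependency
]
if not os.environ.get("EXCLUDE_CERTBOT_DEPS"):
    install_requires.extend([
        "acme",
        "certbot",
    ])
```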

To test, see reverted commit 8632064. Steps taken:

- added changes to setup.py and snapcraft.yaml
- successfully snapped, connected, ran `sudo certbot plugins --prepare`
- added temporary changes to have both certbot and certbot-dns-dnsimple use DNSAuthenticator2
- snapped and installed certbot, `certbot plugins` failed as expected.
- snapped and installed certbot-dns-dnsimple, `sudo certbot plugins --prepare` succeeded
- Inspected dns plugin's `bin` and `lib`; no `certbot` or `acme`, as expected.
```
$ ls /snap/certbot-dns-dnsimple/current/lib/python3.6/site-packages/
OpenSSL                                         future-0.18.2.dist-info        requests_file.py
PyYAML-5.3.1.dist-info                          idna                           setuptools
_cffi_backend.cpython-36m-x86_64-linux-gnu.so   idna-2.9.dist-info             setuptools-47.3.1.dist-info
certbot_dns_dnsimple                            lexicon                        six-1.15.0.dist-info
certbot_dns_dnsimple-1.6.0.dev0-py3.6.egg-info  libfuturize                    six.py
certifi                                         libpasteurize                  tldextract
certifi-2020.4.5.1.dist-info                    past                           tldextract-2.2.2.dist-info
cffi                                            pip                            urllib3
cffi-1.14.0.dist-info                           pip-20.1.1.dist-info           urllib3-1.25.9.dist-info
chardet                                         pkg_resources                  wheel
chardet-3.0.4.dist-info                         pyOpenSSL-19.1.0.dist-info     wheel-0.34.2.dist-info
cryptography                                    pycparser                      yaml
cryptography-2.8.dist-info                      pycparser-2.20.dist-info       zope
dns_lexicon-3.3.26.dist-info                    requests                       zope.interface-5.1.0-py3.6-nspkg.pth
easy_install.py                                 requests-2.23.0.dist-info      zope.interface-5.1.0.dist-info
future                                          requests_file-1.5.1.dist-info
$ ls /snap/certbot-dns-dnsimple/current/bin/
chardetect  futurize  lexicon  pasteurize  tldextract
```
- reset to HEAD^
- snapped and installed certbot to not have the DNSAuthenticator2 changes, `certbot plugins` failed as expected.

* Don't include certbot deps when EXCLUDE_CERTBOT_DEPS is set

* Set EXCLUDE_CERTBOT_DEPS in certbot-dns-dnsimple/snap/snapcraft.yaml
2020-06-18 15:24:10 -07:00
ohemorange
c5e5594ac3 Switch to using snap-constraints in dns plugin so the .gitignore catches it. (#8092) 2020-06-18 15:15:41 -07:00
ohemorange
ad8ffc1bf0 Merge pull request #8089 from certbot/squashed-snap-plugin
Merge the snap-plugin branch into master
2020-06-18 12:59:04 -07:00
Erica Portnoy
29a23d3148 Update README instructions for merged master
Use Focal when following README instructions

Discard changes to disco.py

bring README up to date with current knowledge

Move README to snap/local/
2020-06-18 12:49:58 -07:00
Brad Warren
44b1bd8e0e Update README
fix branch name

grammar

remove readme link

Remove links to Robie's repo.

say you cant use docker

Commands not command

Update publishing permissions section.

Have Certbot trust plugins.

Do not run snapcraft with sudo.
2020-06-18 12:20:57 -07:00
Robie Basak
0bebdedcbc Initial revision
Fix headings

Fix error in build instructions
2020-06-18 12:20:56 -07:00
Brad Warren
8ccc96bbdd update certbot-dns-dnsimple snapcraft.yml.
update dnsimple snapcraft.yml
2020-06-18 12:20:53 -07:00
Robie Basak
4a2618d415 Add CERTBOT_PLUGIN_PATH support
Initial revision

Fix interface export path

Switch to strict confinement

Normalise slot parameters
2020-06-18 12:20:18 -07:00
ohemorange
bcb3554836 Merge pull request #8086 from certbot/core20-squashed
Upgrade snap to be based on core20

This PR makes several changes to be built on top of the core20 base snap. Fixes #7854.

The main changes are to `snapcraft.yaml`. With the Snapcraft 4.0/core20 base, the python plugin is a thin wrapper, basically creating a `venv` and installing the packages from the source. The trouble with this is that it generates pycache files, creating caches that conflict between the different parts. So to solve that, we put everything in a single part. Other changes include:

- We use classic confinement, so we need to specify a bunch of python packages to `stage-packages`, as mentioned [here](https://forum.snapcraft.io/t/trouble-bundling-python-with-classic-confinement-in-core20-4-0-4/18234/2).
- The certbot executable now lives in `bin`, so specify running `certbot/bin`.
- Since `python-augeas` is now being pulled into the single part, remove the pinning from constraints so we can use the latest version directly from github.
- Precompile our `cryptography` and `cffi` wheels to be based on python3.8.

Separately, we had to upgrade the snapcraft docker image to be based on focal, due to the thin wrapper situation. This was accomplished [here](https://github.com/adferrand/snapcraft/pull/1).
2020-06-17 17:20:46 -07:00
Erica Portnoy
0f53e8ad4e Upgrade snap to be based on core20
* Get rid of a whole bunch of error message
* Remove some more overlaps
* don't use certbot from nginx and apache
* use python3 from bin
* certbot needs to be in bin
* try to exclude just the certbot folder
* try a couple things to use the python from the venv bin
* play around with which versions of things we want from each package
* ok, certbot-nginx does need to stage bin
* certbot needs to not stage bin. why does certbot not put certbot in bin?
* fail to inspect more versions of things in the container shell
* take cffi backend from python-augeas
* if we use certbot from bin things should work?
* why is bin not in path? no idea, but let's get it compiled then inspect things in the snap shell
* use snap.certbot instead of bin/certbot
* it does require bin/certbot. I don't know why.
* let's see if we can stick it all in one step
* try installing local subdirectories
* move python-augeas into the single part
* remove after
* put back python-augeas part for now; ERROR: Could not satisfy constraints for 'python-augeas': installation from path or url cannot be constrained to a version
* how was this previously working without git installed? install git.
* maybe it needs to already have python3-wheel installed
* maybe wheel will install first if I change it to -e
* no -e
* maybe try a different python3 package to stage
* this last change wasn't necessary
* remove the bin/ from renew
* nope, it does need bin/certbot
* back to wget
* stage a bare python3
* add all necessary python packages to stage-packages
* pretty sure we don't actually need wheel. let's try removing it!
* remove python-augeas, since we have it pinned to an older version in cb-auto that might work
* stage augeas
* still need libaugeas-dev
* ok let's try building
* combining into one part works! just make sure to unpin python-augeas when generating snap-constraints.txt
* change our scripts to unpin python-augeas
* Use ubuntu 20 in compile_native_wheels.sh
* .travis.yml should use python3-dev instead of python-dev
* jk! we don't need python3-dev in travis
* Update cffi and cryptography wheels for ubuntu20 version of python
* looks like we need python3-dev to build things
* Remove deprecated i386 wheels
2020-06-17 16:57:51 -07:00
Brad Warren
2a18ae6d57 Trivially upgrade to core20? 2020-06-17 16:52:17 -07:00
Brad Warren
3c4b922197 Remove the need for TRAVIS to be set. (#8084)
I initially added this when the script was doing things like migrating all LXD containers to the snap. I think the external side effects are now pretty minimal though, so we can remove the need for this environment variable, which makes the script easier to use outside of CI for manual testing.
2020-06-17 14:41:11 -07:00
Adrien Ferrand
0bb1f0b2ce Drop i386 architecture on snap build (#8083)
This PR removes the i386 architecture from the snap build process, because the base snap `core20` is not available for that architecture.
2020-06-16 15:57:05 -07:00
Brad Warren
ba192f321d Remove on_cancel which isn't recognized by Travis. (#8077) 2020-06-15 13:27:23 -07:00
Cameron Steel
961c573864 dns-cloudflare: Update docs and error messages to reflect new API permissions (#8015)
* Tweaks for improved Cloudflare API

* Update docs for dns-cloudflare

* Update tests and changelog

* Fix bad merge

* Fix error code for record add

* Improve error message

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-12 20:38:13 +02:00
Brad Warren
fb39de7d01 add changelog entry (#8067) 2020-06-11 10:19:51 +02:00
Brad Warren
97fcfd40d1 starts not stats (#8066) 2020-06-10 16:58:46 -07:00
Adrien Ferrand
9ac476e87b Certbot snap multiarch build (#8016)
This PR proposes a way to build the certbot snap for several architectures, using a QEMU-based emulation approach, and several optimizations to keep the build time of each snap below 20 minutes.

Most of the reasoning for the approach proposed here is described in the original PR: https://github.com/basak/certbot-snap-build/pull/27

On top of that, I added a docker pull of a pre-compiled snapcraft Docker image, instead of compiling it during the Travis pipeline, in order to save 5 to 7 more minutes on each snap build. The snapcraft images are built and stored here: https://hub.docker.com/repository/docker/adferrand/snapcraft. Depending on when the PR is reviewed, we can:
* continue to use `adferrand/snapcraft`
* move its logic to certbot scope and use something like `certbot/snapcraft`
* wait for https://github.com/snapcore/snapcraft/pull/3144 to be merged, and use `snapcore/snapcraft`.

* Backport https://github.com/basak/certbot-snap-build/pull/27 into Certbot project

* Fix build deps

* Integrate proactively #8012 to fix builds on non-amd64 archs

* Configure jobs on Travis

* Focus on snap builds. Disable temporarily some jobs. Disable deploy actions by security.

* Specify TARGET_ARCH during snap build

* Do not do anything if TOXENV is not set

* Various optimizations

* Use a recent version of Ubuntu to get correct snap features out of the box

* Add up to date wheels

* Organizing scripts

* Set dest dir

* Get back original configuration for Travis

* Add comments

* Update common_libs.sh

* Use adferrand/snapcraft

* Test build

* Stable snapcraft

* Update build_and_install.sh

* Move back snap builds to the cron/release pipeline

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/compile_native_wheels.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Update snap/local/build_and_install.sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Enable i386 builds, various optimizations

* Update dependencies

* Configure a simple http server to serve the pre compiled wheels

* Fix wheels compilation

* Relax permissions

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-10 14:33:02 -07:00
ohemorange
8d776fb7ac Make Certbot find externally snapped plugins (#8054)
* Add snap plugin support

Switch to a PoC branch of certbot that supports the new
CERTBOT_PLUGIN_PATH and wrap the snap to set this variable correctly
based on the content interfaces connected.

(cherry picked from commit 7076a55fd82116d068e2aca7239209b7203917d2)

* Modify certbot.wrapper to append to PYTHONPATH instead of separate CERTBOT_PLUGIN_PATH variable

* Update certbot-wrapper to python3.6 version

* add source field

* Update changelog

* Use bash instead of sh

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Exit if something goes wrong

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* No leading : when modifying empty PYTHONPATH

* Improve bash handling of PYTHONPATH manipulation

Co-authored-by: Robie Basak <robie.basak@canonical.com>
Co-authored-by: Brad Warren <bmw@eff.org>
Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-10 13:52:56 -07:00
Adrien Ferrand
50fa04ba0c Implement umask for Windows (#7967)
This PR stems from an observation I made on the current version of Certbot (1.3.0): the `renewal-hooks` directory in the Certbot configuration directory is created on Windows with write permissions for everybody.

I thought it was a critical bug, since this directory contains hooks that are executed by Certbot, and you certainly do not want this folder to be open to malicious hooks that could be inserted by anyone and then executed with administrator privileges by Certbot.

It turns out that for this specific problem the bug is not critical for the hooks, because the scripts are expected to be in subdirectories of `renewal-hooks` (namely `pre`, `post`, and `deploy`), and these subdirectories have proper permissions because we set them explicitly when Certbot starts.

Still, there is a divergence here between Linux and Windows: on Linux all Certbot directories without explicit permissions have at maximum `0o755` permissions by default, while on Windows it is a `0o777` equivalent. It is not an immediate security risk, but it is definitely error-prone, unexpected, and so a potential breach in the future if we forget about it.

The root cause is that umask does not exist on Windows. Under Linux, the umask defines the default permissions when you create a file or a directory. Python takes that into account, with an API for `os.open` and `os.mkdir` that exposes a `mode` parameter with a default value of `0o777`. In practice the result is never `0o777` (whether you set the `mode` explicitly or leave the default) because the effective mode is masked by the current umask value of the system: on Linux it is `0o022`, so files/directories have a maximum mode of `0o755` if you did not set the umask explicitly, and that is what is observed for Certbot.

However on Windows, the `mode` value passed (or taken from the default) to `open` and `mkdir` of the `certbot.compat.filesystem` module is taken verbatim, since umask does not exist, and is then used to calculate the DACL of the newly created file/directory. So if the mode is not set explicitly, we end up with files and directories with `0o777` permissions.

This PR fixes this problem by implementing umask behavior in the `certbot.compat.filesystem` module, which will be applied to any file or directory created by Certbot since we forbid using the `os` module directly.

The implementation is quite straightforward. For Linux the behavior is not changed. On Windows a `mask` parameter is added to the function that calculates the DACL, to be invoked appropriately when a file or directory is created. The actual value of the mask is taken from an internal class of the `filesystem` module: its default value is `0o755` to match default umasks on Linux, and it can be changed with the new method `umask`, which has the same behavior as the original `os.umask`. Of course `os.umask` becomes a forbidden function and `filesystem.umask` must be used instead.

Existing code that is impacted has been updated, and new unit tests have been created for this new function.
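A simplified Python sketch of what the umask emulation means for the effective mode; the names and default below are illustrative, not the actual `certbot.compat.filesystem` internals:

```
# Illustrative only: how a stored mask turns a requested mode into the
# effective mode used to derive a Windows DACL.
_current_umask = 0o022  # the usual Linux default, yielding 0o755 directories

def umask(mask: int) -> int:
    """Mimic os.umask: install the new mask and return the previous one."""
    global _current_umask
    previous, _current_umask = _current_umask, mask
    return previous

def effective_mode(requested_mode: int = 0o777) -> int:
    """Mode actually applied to a newly created file or directory."""
    return requested_mode & ~_current_umask

print(oct(effective_mode()))  # 0o755 with the default mask, matching Linux
```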

* Implement umask for Windows

* Set umask at the beginning of tests

* Fix lint, update local oldest requirements

* Update certbot-apache/setup.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Improve tests

* Adapt filesystem.makedirs for Windows

* Fix

* Update certbot-apache/setup.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Changelog entries

* Fix lint

* Update certbot/CHANGELOG.md

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-09 17:08:22 -07:00
Brad Warren
e31834a6cd Stop running snapcraft with sudo (#8063)
* Do not use sudo when building the snap.

* add user to lxd

* Run with lxd group.
2020-06-09 14:46:11 -07:00
Brad Warren
340a4280ea Merge pull request #8053 from certbot/upgrade-acmev1
Read acmev1 Let's Encrypt server URL from renewal config as acmev2 URL
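A minimal sketch of the upgrade this merge describes; the mapping of the well-known Let's Encrypt ACMEv1 endpoint to its ACMEv2 equivalent is shown for illustration, and the exact handling in Certbot may differ:

```
# Illustrative mapping of the known ACMEv1 directory URL to the ACMEv2 one.
V1_TO_V2 = {
    "https://acme-v01.api.letsencrypt.org/directory":
        "https://acme-v02.api.letsencrypt.org/directory",
}

def upgraded_server(server: str) -> str:
    return V1_TO_V2.get(server, server)
```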
2020-06-09 11:43:06 -07:00
Brian Heim
cc07722b3e Fix certbot.compat.filesystem documentation (#8058)
* Fix bad rst docstrings

* AUTHORS.md: add Brian Heim

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-06-08 14:00:16 -07:00
Rasesh Patel
67bcf0f6bd Remove documentation for configuring ciphersuites (#8027) (#8056)
Issue #1123 discusses a feature that allows users to set the cipher
security level. That feature wasn't built. It didn't provide enough
user value to justify the corresponding increase in complexity. The
feature request and the associated discussion threads were closed.
However, the proposed API spec and the TODO section remained in the
cipher docs. They're a vestige of that issue from olden days and this PR
removes those last living traces...

Fixes #8027.
2020-06-08 12:15:44 -07:00
Brad Warren
1b2328f18b Add comment about pyca's use of tools script (#8044) 2020-06-08 12:14:02 -07:00
Brian Heim
560b9e5012 AUTHORS.md: fix GH url for Brandon Kreisel (#8059) 2020-06-08 12:13:29 -07:00
Lloyd Parkes
2f6fbe9987 Add support for NetBSD (#8033)
* Add support for NetBSD by telling certbot-nginx where the nginx
configuration directory is.

* Update the CHANGELOG.

* Pass the right type of sequence to "in". Thanks lint.

* Adjust the CHANGELOG.md entry following feedback from ohemorange.

Co-authored-by: Lloyd Parkes <lloyd@must-have-coffee.gen.nz>
2020-06-08 12:06:38 -07:00
Erica Portnoy
bebcad0588 update changelog 2020-06-04 15:57:34 -07:00
Erica Portnoy
92f26367eb Merge remote-tracking branch 'alexzorin/7979_restore_v1_as_v2' into upgrade-acmev1 2020-06-04 14:29:56 -07:00
alexzorin
d135e6140b apache: handle statically linked mod_ssl (#8007)
In #7771, the Apache configurator gained the ability to identify what
version of OpenSSL Apache's ssl_module is linked against. However, the
detection was only functional if the module was built as a DSO (which is
almost always the case).

This commit covers the case where the ssl_module is statically linked
within the Apache binary. It requires the user to specify the path to
the binary (with --apache-bin) and emits a warning if static linking is
detected but no path has been provided.
2020-06-04 10:34:10 -07:00
Adrien Ferrand
010b38fa10 Upgrade Certbot dependencies (#8036)
This PR upgrades Certbot pinned dependencies through `letsencrypt-auto-source/rebuild_dependencies.py` while taking into account the problems detected in https://github.com/certbot/certbot/pull/8035:
* `cryptography` is pinned to `2.8` to continue to support OpenSSL 1.0.1 on non-x86 ancient Linux distributions (RHEL 6 + Debian 8)
* `parsedatetime` is pinned to `2.5` because of an incompatibility with Python 2.7 (see https://github.com/bear/parsedatetime/issues/246)
* `letsencrypt-auto-source/rebuild_dependencies.py` now takes into account the environment markers that are added to `AUTHORITATIVE_CONSTRAINTS`: this is used for the `enum34` dependency, so that it is not installed on Python 3.6+ and does not break the distribution by swapping out the built-in `enum` module during the setup of the Certbot venv.

Fixes #8030

* Pin cryptography and parsedatetime

* Upgrade dependencies

* Remove authoritative constraint

* Upgrade dependencies

* Rebuild certbot-auto

* Update letsencrypt-auto-source/rebuild_dependencies.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Honor specific requirements in the AUTHORITATIVE_CONSTRAINTS

* Fix injection

* Update dependencies

* Update rebuild_dependencies.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-06-04 08:59:45 -07:00
ohemorange
8c8d3fab91 Merge pull request #8040 from certbot/candidate-1.5.0
Release 1.5.0
2020-06-02 12:19:39 -07:00
Brad Warren
baf69d210b Bump version to 1.6.0 2020-06-02 10:32:41 -07:00
Brad Warren
beea2d2208 Add contents to certbot/CHANGELOG.md for next version 2020-06-02 10:32:40 -07:00
Brad Warren
4938273e0f Release 1.5.0 2020-06-02 10:32:38 -07:00
Brad Warren
466b4fbf71 Update changelog for 1.5.0 release 2020-06-02 10:12:33 -07:00
Brad Warren
95ae5f69f5 Make rebuild_dependencies.py executable (#8039) 2020-06-02 19:11:47 +02:00
ohemorange
2acc1dcc89 Fix TLS-ALPN tests with newer versions of OpenSSL (#8026)
Fixes #7988. As described there, the steps involved are:

1. Update our tests so they fail due to this problem.
2. Update the keys used in the tests so they pass with the new changes.

For 1, see a [failing travis run](https://travis-ci.com/github/certbot/certbot/jobs/340710511) with the included change. And for the full output to confirm that this is what is failing, see a [run on debian 10](https://github.com/certbot/certbot/files/4692350/debian_run_log.txt).

This PR adds `rsa4096_key.pem` and `rsa4096_cert.pem`, updates the `TLS-ALPN` test to use those keys in place of the 1024-bit versions, and fixes the README in that `testdata` folder with correct instructions to generate these files.
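For reference, a hedged sketch of generating the kind of 4096-bit key the updated README describes; the README presumably uses openssl commands, so this is just the equivalent using the cryptography library:

```
# Generate a 4096-bit RSA key and write it out in PEM form (illustrative).
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

key = rsa.generate_private_key(public_exponent=65537, key_size=4096,
                               backend=default_backend())
with open("rsa4096_key.pem", "wb") as f:
    f.write(key.private_bytes(
        serialization.Encoding.PEM,
        serialization.PrivateFormat.TraditionalOpenSSL,
        serialization.NoEncryption(),
    ))
```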

* export PIP_NO_BINARY in pip install subshell in test_sdists.sh

* set environment variable on the line that installs most packages

* Generate 4096-bit rsa key and cert, and fix README instructions to do so.

* Update TLS_ALPN test to use 4096-bit key instead of 1024-bit key.

* Update changelog

* Older versions of Python have an error when both VIRTUAL_NO_DOWNLOAD and PIP_NO_BINARY are set, so only apply the latter at the install phase.

* Add enum34 constraint manually, since rebuild_dependencies.py seems to be broken.

* only delete key if it exists

* Check OpenSSL version before trying to set PIP_NO_BINARY

* Add comment explaining why we only set PIP_NO_BINARY at the install step
2020-06-01 15:18:38 -07:00
Brad Warren
fa55b468c8 Revert "Upgrade pinned certbot dependencies (#8012)" (#8035)
This reverts commit 6b97ac3344.
2020-06-01 20:17:26 +02:00
Brad Warren
cd27dcc32c Add the content interface to Certbot (#8009)
* Add the content interface to Certbot

This commit contains a subset of the changes from 7076a55fd82116d068e2aca7239209b7203917d2.

* Normalise slot parameters

(cherry picked from commit 810941979bcf609c1e0be18e9263abf046b90e82)

Co-authored-by: Robie Basak <robie.basak@canonical.com>
2020-05-27 13:59:08 -07:00
Adrien Ferrand
6b97ac3344 Upgrade pinned certbot dependencies (#8012)
* Upgrade certbot dependencies

* Rebuild letsencrypt-auto
2020-05-26 15:19:10 -07:00
ohemorange
332def46da Require explicit confirmation of snap plugin permissions before connecting (#8013)
Fixes #7667.

Implements the plan described in #7667.

Here's a terminal log showing that it does so:

```
# sudo snap connect certbot:plugin certbot-dns-dnsimple
error: cannot perform the following tasks:
- Run hook prepare-plug-plugin of snap "certbot" (run hook "prepare-plug-plugin": 
-----
Only connect this interface if you trust the plugin author to have root on the system
Run `snap set certbot trust-plugin-with-root=ok` to acknowledge this and then run this command again to perform the connection
-----)
# snap set certbot trust-plugin-with-root=ok
# sudo snap connect certbot:plugin certbot-dns-dnsimple
# sudo snap disconnect certbot:plugin certbot-dns-dnsimple:certbot
# sudo snap connect certbot:plugin certbot-dns-dnsimple
error: cannot perform the following tasks:
- Run hook prepare-plug-plugin of snap "certbot" (run hook "prepare-plug-plugin": 
-----
Only connect this interface if you trust the plugin author to have root on the system
Run `snap set certbot trust-plugin-with-root=ok` to acknowledge this and then run this command again to perform the connection
-----)
```

* Add plugin connection hook to accept root trust

* snapctl requires a configure hook to set options

* Add sh notice

* Update changelog
2020-05-26 12:02:33 -07:00
Adrien Ferrand
b42e24178a Consistent directory mode apply when makedirs is called (#8010)
Fixes #7993 

This PR uses `os.umask()` during the `certbot.compat.filesystem.makedirs()` call to ensure that all directories, and not only the leaf one, have the provided `mode` when created. This ensures safe and consistent behavior independently of the Python version, since the behavior of `os.makedirs` changed in that respect with Python 3.7.

* Implement logic to apply the same permission on all dirs created by makedirs

* Add a test

* Add comment

* Update certbot/certbot/compat/filesystem.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-05-21 15:29:06 -07:00
Alex Zorin
8a15bd7927 renewal: disregard acme-v01 in renewal configs
Fixes #7979
2020-05-21 23:01:53 +10:00
ohemorange
3ea5170647 Error out earlier in apache installer when mod_ssl is not available (#7984)
* Error out in apache installer when mod_ssl is not available

* Update to MisconfigurationError and add/fix tests

* Remove error cases we no longer hit and associated test

* mock out function to have consistent error across machines

* improve changelog message

* only check key in modules list, not value
2020-05-19 15:34:21 -07:00
schoen
0b53c0d476 Merge pull request #7952 from ntkme/allow-empty-existing-dir
Allow existing but empty archive and live dir to be used when creating new lineage
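A minimal sketch of the relaxed check this change implies (the helper name and exact behavior are assumptions): an archive/live directory may be reused for a new lineage when it does not exist, or exists but is empty.

```
# Illustrative check, not Certbot's actual storage code.
import os

def usable_for_new_lineage(path: str) -> bool:
    return not os.path.exists(path) or not os.listdir(path)
```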
2020-05-19 15:25:32 -07:00
Brad Warren
4eb9a71a4c remove quay cruft (#8003)
Our README still has links to our old quay.io builds, which we shut down a while ago. See #4343. This PR simply removes the old stray links.
2020-05-19 14:29:28 -07:00
Brad Warren
96e003d1a3 mention python3-venv in docs (#8006)
The error message from `python3 -m venv` when you don't have `python3-venv` installed is pretty good, but let's skip the failure and make sure it is installed the first time.
2020-05-19 14:28:41 -07:00
schoen
7a7c6737cc Merge pull request #8000 from certbot/make-exit-message-red
Print cause of exit in red text
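An illustrative Python sketch (not necessarily the exact Certbot helper) of printing the final exit cause in red using ANSI escape codes, only when attached to a terminal:

```
# Illustrative only: color the exit cause when stderr is a TTY.
import sys

RED, RESET = "\033[31m", "\033[0m"

def print_exit_cause(message: str) -> None:
    if sys.stderr.isatty():
        message = f"{RED}{message}{RESET}"
    print(message, file=sys.stderr)
```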
2020-05-18 17:13:03 -07:00
Brad Warren
0e59c6ba1b handle more cases 2020-05-18 10:17:32 -07:00
Brad Warren
d230dcafeb Print cause of exit in red text. 2020-05-18 09:31:15 -07:00
alexzorin
bcf33c6659 ocsp: add support for public key hash ResponderIDs (#7989)
For both cases where the response is signed by the issuer, or by a
delegated OCSP signer.

Resolves #7986
2020-05-13 15:55:35 -07:00
Brad Warren
71e3d82e47 Remove passthrough because it's no longer needed (#7956) 2020-05-06 14:28:15 -07:00
Adrien Ferrand
bb6a660b21 Platform-agnostic method to copy owner and permissions between files (#7968)
Related to #7649 since @joohoi needs a method to copy owner and permissions together from a source file to a destination file.

This PR creates the method `copy_ownership_and_mode()` in the `certbot.compat.filesystem` module to achieve this goal. Its behavior is consistent across Linux and Windows with respect to the security model that has been defined for Certbot on Windows.

The method behaves broadly the same as `copy_ownership_and_apply_mode`, but this time the permissions are extracted from the source file. For Windows this means that the DACL is copied from the source to the destination with the same content.
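Hypothetical usage of the new helper (the argument order and paths are assumptions for illustration): copy both the owner and the permissions/DACL from an existing file onto a newly written one.

```
# Illustrative usage; paths are placeholders.
from certbot.compat import filesystem

filesystem.copy_ownership_and_mode("/etc/letsencrypt/keys/old_key.pem",
                                   "/etc/letsencrypt/keys/new_key.pem")
```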

* Create copy_ownership_and_mode to copy both owner and mode from src to dst

* Update certbot/tests/compat/filesystem_test.py

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>

* Fix docstring

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-05-06 12:33:50 -07:00
Brad Warren
c35054fd11 Merge pull request #7974 from certbot/candidate-1.4.0
Release 1.4.0
2020-05-05 17:06:31 -07:00
Erica Portnoy
b224b49986 Bump version to 1.5.0 2020-05-05 13:44:23 -07:00
Erica Portnoy
1a72cdecf2 Add contents to certbot/CHANGELOG.md for next version 2020-05-05 13:44:23 -07:00
Erica Portnoy
5586ae071a Release 1.4.0 2020-05-05 13:44:21 -07:00
Erica Portnoy
ca60ad52b9 Update changelog for 1.4.0 release 2020-05-05 12:37:33 -07:00
Brad Warren
9154e7965f drop min certbot coverage (#7972)
`tox -e cover` fails for me on macOS. This is due to differences between the code that runs on Linux and on other platforms in `certbot.util` and its tests. Diffing my local coverage with Travis, the only difference in lines missing test coverage is:
```
$ diff travis.txt local.txt
< certbot/certbot/util.py                                        238     24    90%   132-134, 210, 275-281, 318, 330-340, 347, 381-382
> certbot/certbot/util.py                                        239     34    86%   30, 132-134, 210, 275-281, 301, 305, 316-318, 330-340, 347, 367-373, 381-382
< certbot/tests/util_test.py                                     392      0   100%
> certbot/tests/util_test.py                                     375     26    93%   483-487, 492-502, 507-514, 548-550, 555-557
```
I think tests on `master` should not be failing locally for people.

While there would be other ways to fix this by adding `# pragma: no cover` lines or writing mocked-out tests for other platforms, I personally just think dropping the minimum coverage by one percentage point is fine, at least for now.
2020-05-05 09:38:20 -07:00
Brad Warren
ac2d691ade Add warning about ignoring our own warnings (#7971)
Coming out of the conversation at #7863 in the linked Google Doc, we should always have at least 1 release between updating one of our plugins to stop using a deprecated acme/certbot API and removing it from acme/certbot. Doing this gives the plugin changes time to propagate rather than potentially having the plugin break because Certbot was updated before the plugin had made the necessary changes.

This comment here should help ensure this.

* Add pytest warnings warning.

* clarify comment
2020-05-04 16:54:09 -07:00
schoen
5536c91223 Merge pull request #7938 from taixx046/1618-awkward-language-during-email-problems
Fixed #1618 awkward language during email problems
2020-04-27 18:33:07 -07:00
Adrien Ferrand
9cbb13ef04 Run hooks with Powershell on Windows (#7800)
Fixes #7713.

As discussed in #7713, providing a PowerShell script as a hook for Certbot does not currently work. This is because hooks are run in a `cmd` environment, which recognizes only `.bat` files as valid scripts that can be run by their bare name on the command line.

On the other hand, PowerShell recognizes both `.bat` and `.ps1` scripts as valid scripts.

This PR makes hook commands be executed by PowerShell, instead of `cmd` as `Popen` does by default when `shell=True` is used. It also modifies the tests to handle this new environment, in particular in terms of encoding (UTF-16 LE is the default in PowerShell).
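A rough Python sketch of that change (an assumption for illustration, not Certbot's actual hook runner): run the hook command through PowerShell rather than the cmd.exe shell that `Popen` would use.

```
# Illustrative only: execute a hook command via PowerShell on Windows.
import subprocess

def run_hook(shell_cmd: str) -> subprocess.CompletedProcess:
    return subprocess.run(
        ["powershell.exe", "-Command", shell_cmd],
        capture_output=True,
        # Decode output as UTF-16 LE, PowerShell's default encoding noted above.
        encoding="utf-16-le",
    )
```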

* Run hooks in powershell on Windows

* Fix hook test

* Fallback to unittest.mock

* In fact, shell_cmd as a list of str could not work. Declare only str as acceptable input for shell_cmd.

* Added changelog
2020-04-27 09:38:30 -07:00
なつき
db6de76b11 Allow existing but empty archive and live dir 2020-04-25 01:47:30 -07:00
Brad Warren
01dc981a09 Merge pull request #7948 from certbot/snap-build-squashed
Despite this PR (only) being ~200 lines containing mostly code copied from another repo, there is a lot going on here. For the sake of making it both easier to review and to remember some of these things in the future by referring back to this PR, I've documented a lot of noteworthy things with section headers below. With that said, it's probably not necessary to read each section unless you're interested in that topic.

The most noteworthy thing for the reviewer is **this PR should be merged and not squashed** to preserve authorship. To merge this code, once we're happy with this PR, I'll probably open a new PR squashing any commits I make in response in review comments back into a single commit to try to keep history somewhat clean. To help prevent this PR from being accidentally squashed, I'm making this a draft PR for now.

### Git history of https://github.com/basak/certbot-snap-build

I think it is worth preserving the git history of https://github.com/basak/certbot-snap-build that this PR is based on in this repo to help us track why things were done a certain way. To do this while keeping our git history somewhat clean, I took the approach described at https://stackoverflow.com/questions/1425892/how-do-you-merge-two-git-repositories/21495718#21495718 to move all history of https://github.com/basak/certbot-snap-build into a `snap` directory. I then squashed all commits so that sequential commits from the same author are one commit. I probably could have reordered commits to try and squash things a little more, but I personally don't think it's worth the trouble. Finally, I merged this rewritten history into this branch of the Certbot repo.

The contents of the `snap` directory are identical to the current contents of https://github.com/basak/certbot-snap-build before my final commit in this PR which makes the changes to make things work in this repo.

### Travis stages

This is described in general at https://docs.travis-ci.com/user/build-stages/, but I don't think we should deploy the snap if any of our tests are failing. To accomplish this, I created a "Snap" stage that builds, tests, and deploys the snap which is only executed after a "Test" stage that contains all of our other tests. The "Snap" stage will not run until the "Test" stage completes successfully.

### snap/local

This directory is ignored by `snapcraft` which I think makes it a good place to store `snap` specific scripts like `build_and_install.sh`.

See https://bugs.launchpad.net/snapcraft/+bug/1792203 for more info.

### Why remove certbot-compatibility-test from apacheconftest toxenvs?

Because it's not used. In theory, it could go in its own PR, but it'll create merge conflicts with this one so I'd personally prefer to include this simple change in this PR as well.

### Checklist for landing this PR

- [x] Squash all of my commits into one commit
- [x] Update the release instructions to include moving the snap to the beta channel
- [x] Shut down Robie's nightly builds probably by updating his repo to say that the code has moved here and deleting everything
2020-04-24 14:13:09 -07:00
Brad Warren
335894ab3b Merge snap code into the Certbot repo
* merge .gitignore

* Move snapcraft.yml up one level.

* update source

* move test.sh to tox.ini

* use new tox.ini in .travis.yml

* move snap build code

* make script executable

* remove unused python3-dev

* don't use deprecated classic flag

* go back to stable channel

* add nginx in snap addons

* add deploy steps

* Add comments explaining external tox envs.

* error if not in CI

* don't use --depth

* remove old .travis.yml

* Add big comment about SNAP_TOKEN.

* Set all_branches: true.

* Add repo setting.

* run travis on tags

* Add more documenting comments to .travis.yml.
2020-04-24 13:47:36 -07:00
Brad Warren
4cce3458f3 Avoid deleting the workspace twice. (#7923) 2020-04-23 23:29:16 +02:00
Brad Warren
d71d3a1144 dont use venv.py (#7936) 2020-04-23 23:27:05 +02:00
Brad Warren
b06dacdfb5 Add --chain-path to --help install output. (#7937) 2020-04-23 23:26:34 +02:00
taixx046
e5cde2c598 updated changelog.md 2020-04-23 15:29:38 -05:00
taixx046
84b6c3cebb reorganized error message when a user entered an invalid email address 2020-04-23 15:20:32 -05:00
ohemorange
a06d5ac7a1 Deprecate certbot-auto on Gentoo, macOS, and FreeBSD (#7926)
* Deprecate certbot-auto on Gentoo

* Deprecate certbot-auto on macOS

* Deprecate certbot-auto on FreeBSD

* build le-auto

* Update Changelog

* Deprecate certbot-auto on Gentoo, FreeBSD, and macOS
2020-04-23 12:49:35 -07:00
Brad Warren
9d94c6c5ef Remove useless before_install lines (#7920) 2020-04-22 13:10:13 -07:00
Brad Warren
ed9648b4a3 Add snap instructions (#7889)
Part of #7671.

* Add snap instructions.

* Mention Linux+systemd
2020-04-22 11:30:33 -07:00
Adrien Ferrand
2bcabe6626 Fix certbot part build in snap
* Declare properly source-subdir to build the certbot part

* Use snapcraft 3.10+, remove the custom python plugin
2020-04-21 16:36:49 -07:00
Brad Warren
c12baf7d8c Fix path to apache-conf-test 2020-04-21 16:36:49 -07:00
Sergio Schvezov
193b44a0fa Snap plugin
* snap: move snapcraft.yaml to snap directory

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>

* snap: use a local plugin to get around the delivered plugin

Add a plugin to the project which behaves as expected until a version
of snapcraft satisfies the project needs.

Additional snapcraft.yaml changes were made to accommodate for the snap
to build.

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>

* snap: compile pycache in the last step for the last part

Signed-off-by: Sergio Schvezov <sergio.schvezov@canonical.com>
2020-04-21 16:36:49 -07:00
Robie Basak
b69f5588f4 Continued improvements
* Remove legacy Store upload credentials

These have not been needed since 5d7969a.

* Work around dev part dependency failure case

Get pip to install certbot from its VCS repository directly during the
build of the nginx and apache plugin parts.

This works around issue #12 when it affects the interaction between the
apache or nginx plugin and certbot itself.

It does not work around the case where the same problem occurs in the
interaction between certbot and acme. This looks harder to work around
because pip's VCS URL handling doesn't appear to include a facility to
install from a subdirectory of a git repository and this is where the
acme source is located.

* Switch to using lxd for the snapcraft run

The Docker image is 16.04 only. Before we can switch to 18.04, we need
to remove Docker, which means using the snapcraft snap using Travis' new
snap support, and then using the lxd functionality in snapcraft so that
it is enabled to build in the appropriate environment.

* Switch build to use core 18

Fixes: #14

* Move constraints into a list

This seems to be a requirement of either newer snapcraft, snapcraft
in lxd, or the move to core18 (it isn't clear to me which). This fixes
the error:

Failed to load plugin: properties failed to load for certbot: The 'constraints' property does not match the required schema: '$SNAPCRAFT_PART_SRC/constraints.txt' is not of type 'array'

* version-script -> snapcraftctl set-version

The use of version-script seems to break with either newer snapcraft,
snapcraft with lxd or core18 (it's not clear to me which). The breakage
is related to "parts/certbot/src" not being found. This can be fixed
with $SNAPCRAFT_PART_SRC, but this doesn't seem to be defined at
"version-script time".

However https://snapcraft.io/docs/deprecation-notices/dn10 deprecates
the use of version-script, and if we convert to the recommended new way,
then we use override-pull instead and $SNAPCRAFT_PART_SRC is defined
there, so this conveniently fixes both problems at once.

* Do not explicitly install snapd

Since this is now handled by the Travis addon, we do not need to do it
explicitly.
2020-04-21 16:36:49 -07:00
Adrien Ferrand
2c622944dd Various optimizations part 2
* Revert logic of building against a tag
* Fix schema, add nginx
* Update snapcraft.yaml
* Update snapcraft.yaml
* Update snapcraft.yaml
* Update test.sh
* Update test.sh
* Update test.sh
* Update test.sh
* Config tests
* Add an apache test
* Relaunch CI
* Clean config
* Install venv
* Decompose steps
* Update test.sh
* Use virtual environment
* Update python-augeas
* Add fork python-augeas
* Update .travis.yml
* Exclusion rule
* Try with after
2020-04-21 16:36:49 -07:00
Robie Basak
e0b72d9a62 Travis improvements
* Add Travis notifications

* Adjust automatic snap deployment configuration

Travis now has a documented[1] "snap" provider and the previous
experimental mechanism seems to have stopped working, presumably because
it was deprecated in favour of this new mechanism.

[1] https://docs.travis-ci.com/user/deployment/snaps/
2020-04-21 16:36:49 -07:00
Adrien Ferrand
d63be466a8 Various optimizations part 1
* Configure for python3
* Update tests
* Use appropriate virtualenv
* Install nginx for the integration tests
* Try use LD_LIBRARY_PATH to find augeas shared library in snap when python-augeas is invoked
* Update travis to use build-in setup capabilities
* Update .travis.yml
* Add acme build
* Update tests
* Try more recent dist
* Update command
* Clean tests
* Add back augeas
* Add env
* Revert to last working snapcraft config
* Add a gitignore
* Reintegrate acme. Declare augeas in certbot parts
* Use release version of certbot
* Try new approach
* Fix config
* Directly install version of python-augeas from pypi
* Restart from basic
* Clone only once certbot repository. Use pinned versions of dependencies from certbot-auto.
* Try relatively to source
* Use snapcraft env variables
* Strip hashes
* Fix path
* Redefine path
* Continue to prepare the runtime
* Fix command line
* Update .travis.yml
* Add back certbot-apache
* Update snapcraft.yaml
* Build snap against the latest release of certbot
2020-04-21 16:36:49 -07:00
Robie Basak
0f6486ec7f Initial commit
* Add renewal timer

* Install libaugeas0 in python-augeas part build

This part needs libaugeas0 to build.

* Bump to 0.26.1

* Always act directly on upstream master

I want to keep this always working, so move to master. We can
reintroduce upstream stable releases when we are ready for general use.

Closes: #5

That particular issue seems to no longer happen. Presumably something
changed in upstream git or in PyPI. If it happens again, hopefully I'll
have CI against upstream master up by then and I'll be able to pin it
down.

* Add empty Travis build

* Add Travis automatic snap edge publication

* Add integration test

This uses upstream's test suite from their source tree to check the
built snap to make sure it behaves as expected, before attempting upload
to the store.

* Point Augeas to its lens library

Augeas defaults to looking in /usr/share/augeas/lenses, which in a snap
isn't found at this path, but inside $SNAP. So set AUGEAS_LENS_LIB to
where the lenses can be found within the snap.

This fixes the Apache plugin that uses Augeas.
2020-04-21 16:36:49 -07:00
Brad Warren
06e68cce44 flip condition (#7927)
In #7925 we accidentally changed the logic here. Before it was:

type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))

now it's

type = cron OR (type = push AND branch = master)

We want to be able to run our full test suite on things like test-* branches. The reason we had been excluding master is that it has the full test suite run on it nightly (through what Travis calls cron), and not running on every commit helps prevent us from waiting on CI, since our nightly tests spin up so many jobs.

This PR changes things back to the intended behavior.

(We could talk about changing the condition to just type = cron OR type = push if we want, but I'd rather do that in a separate PR once things like test- branches are fixed.)
2020-04-21 16:34:50 -07:00
ohemorange
eed45827ad Remove references to the apache-parser-v2 branch (#7925)
Fixes #7786.
2020-04-21 13:06:30 -07:00
ohemorange
751d836746 Update using.rst to accurately describe acmev2 behavior (#7924)
Fixes #7268

I removed the reference to automatically selecting which ACME protocol we use, since at some point we'll want to rip out the non-spec-compliant ACMEv1 code.
2020-04-21 12:47:59 -07:00
Brad Warren
6693e87500 Update dev docs to reflect Python 2 EOL. (#7914)
Python 2 is going to get harder and harder to install locally so I don't think we should assume/require devs to have it installed.

This PR builds on #7905 so our developer guide only has people use Python 3.
2020-04-21 10:27:48 -07:00
schoen
08cea381c8 Merge pull request #7917 from certbot/fedora31
Test on fedora 31
2020-04-20 19:36:24 -07:00
schoen
84b5c571c0 Merge pull request #7916 from certbot/ubuntu-1910
Test on Ubuntu 19.10
2020-04-20 19:35:41 -07:00
schoen
0f35836deb Merge pull request #7919 from certbot/rate-limits-url
Update rate limits url
2020-04-20 19:29:18 -07:00
Brad Warren
5b749ff8f7 Use Python 3 in the release script. (#7918)
Fixes #7902.
2020-04-20 14:44:53 -07:00
Brad Warren
41306e1e37 update rate limits url 2020-04-20 09:20:31 -07:00
Brad Warren
864ea08341 test on fedora 31 2020-04-16 15:00:28 -07:00
Brad Warren
74eea40905 test on ubuntu 19.10 2020-04-16 14:57:36 -07:00
Brad Warren
859dc38cb9 Consolidate cover envs and default to py3-cover (#7905)
* Consolidate cover envs and default to py3-cover

* use py38 for code coverage in Travis

* Disable coverage on Python < 3.6 line.
2020-04-16 08:59:40 -07:00
April King
f66314926a Update URL for Mozilla SSL Configuration Generator (#7912) 2020-04-15 13:54:17 -07:00
ohemorange
9c345ac301 Do not require mock in Python 3 in certbot-dns modules (#7900)
Part of #7886.

This PR conditionally installs `mock` in `certbot-dns-*/setup.py` based on setuptools version and python version, when possible. It then updates the tests to use `unittest.mock` when `mock` isn't available.
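The test-side fallback this describes, sketched as the usual pattern (the `type: ignore` and `pragma: no cover` markers correspond to the bullets below): prefer the third-party mock package when installed, otherwise use the standard library's `unittest.mock`.

```
# Prefer third-party mock; fall back to the standard library on Python 3.
try:
    import mock
except ImportError:  # pragma: no cover
    from unittest import mock  # type: ignore
```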

* Do not require mock in Python 3 in certbot-dns modules

* update changelog

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:54:44 -07:00
ohemorange
af21d1d56e Do not require mock in Python 3 in certbot-compatibility-test module (#7899)
* Do not require mock in Python 3 in certbot-compatibility-test module

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:40:03 -07:00
ohemorange
49912732ac Do not require mock in Python 3 in nginx module (#7898)
* Do not require mock in Python 3 in nginx module

* error when trying to build wheels with old setuptools

* add type: ignores
2020-04-15 11:39:44 -07:00
ohemorange
8fb9a395ab Do not require mock in Python 3 in apache module (#7896)
Part of #7886.

This PR conditionally installs mock in `apache/setup.py` based on setuptools version and python version, when possible. It then updates `apache` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in apache

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools

* use unittest.mock when third-party mock isn't available in apache, with no cover and type ignore
2020-04-15 11:30:08 -07:00
ohemorange
35fb99b86f Do not require mock in Python 3 in certbot module (#7911)
This PR is exactly the same as #7895, but now we know a little bit more about what was going on with `mypy`.

Part of #7886.

This PR conditionally installs mock in `certbot/setup.py` based on setuptools version and python version, when possible. It then updates `certbot` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in certbot

* use unittest.mock when third-party mock isn't available in certbot

* Add type:ignores because of https://github.com/python/mypy/issues/1153

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools
2020-04-15 11:28:47 -07:00
ohemorange
127d2dc307 Do not require mock in Python 3 in acme module (#7910)
Part of #7886.

This PR conditionally installs mock in `acme/setup.py` based on setuptools version and python version, when possible. It then updates `acme` tests to use `unittest.mock` when `mock` isn't available.

Now with `type: ignore` as appropriate. Once the "future steps" of #7886 are finished, and mypy is on Python 3, the `pragma no cover`s and `type ignore`s will be gone.

* Conditionally install mock in acme

* error out on newer python and older setuptools

* error when trying to build wheels with old setuptools

* use unittest.mock when third-party mock isn't available in acme, with no cover and type ignore
2020-04-15 11:27:55 -07:00
Brad Warren
569df2d37a Remove Ubuntu 19.04 tests. (#7906)
This PR fixes the Travis failures that can be seen https://travis-ci.com/certbot/certbot/builds/160258644. Running the tests locally, it looks like Ubuntu has started shutting down the 19.04 repos which makes sense as this release has been EOL'd. See https://wiki.ubuntu.com/Releases.

I have the full suite including the test farm tests running at https://travis-ci.com/github/certbot/certbot/builds/160269969 with this change.

The issue of adding 19.10 to our test farm tests is tracked by #7851. I think that issue is important and it's in our current milestone, but I'd personally rather get our tests passing for now and try to expand them to run on other systems later.
2020-04-14 17:01:59 -07:00
ohemorange
ff732bf975 Revert the last two mock PRs (#7903)
* Revert "Do not require mock in Python 3 in certbot module (#7895)"

This reverts commit 77871ba71c.

* Revert "Do not require mock in Python 3 in acme module (#7894)"

This reverts commit cd0acf5dcc.
2020-04-13 17:09:24 -07:00
ohemorange
77871ba71c Do not require mock in Python 3 in certbot module (#7895)
Part of #7886.

This PR conditionally installs mock in `certbot/setup.py` based on setuptools version and python version, when possible. It then updates `certbot` tests to use `unittest.mock` when `mock` isn't available.

* Conditionally install mock in certbot

* use unittest.mock when third-party mock isn't available in certbot

* Add type:ignores because of https://github.com/python/mypy/issues/1153

* error when trying to build wheels with old setuptools
2020-04-13 14:33:23 -07:00
ohemorange
cd0acf5dcc Do not require mock in Python 3 in acme module (#7894)
Part of #7886.

This PR conditionally installs mock in acme/setup.py based on setuptools version and python version, when possible. It then updates acme tests to use unittest.mock when mock isn't available.

* Conditionally install mock in acme

* use unittest.mock when third-party mock isn't available in acme

* error when trying to build wheels with old setuptools
2020-04-13 14:32:22 -07:00
Karan Suthar
8e4dc0a48c Minor bugfixes (#7891)
* Fix dangerous default argument

* Remove unused imports

* Remove unnecessary comprehension

* Use literal syntax to create data structure

* Use literal syntax instead of function calls to create data structure

Co-authored-by: deepsource-autofix[bot] <62050782+deepsource-autofix[bot]@users.noreply.github.com>
2020-04-13 10:41:39 -07:00
ohemorange
316e4640f8 Upgrade the test farm tests to use Python 3 (#7876)
Fixes #7857.

* stop using urllib2 in test farm tests

* use six for urllib instead

* remove fabric lcd usage

* correct lcd removal

* remove fabric cd

* convert some remote calls to v2

* move more cxns to v2

* get run working with prefix

* get sudo commands working

* remove final fabric v1 references including local

* update requirements and README

* add new venv to gitignore

* update version used in travis

* remove deploy_script unused kwargs

* fix killboulder implementation so I can test creating a new boulder server

* hardcode the gopath due to broken env management in fabric2

* Update letstest readme

* move the comment about hardcoding the gopath

* catch BaseException instead of Exception

* work around fabric #2007

* use connections as context managers to ensure they're closed

* remove reference to virtualenv
2020-04-09 14:35:47 -07:00
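For the "use connections as context managers" bullet, a hedged Fabric 2 sketch of the shape of that change (host, user, and key path are made up):

```python
from fabric import Connection

# Using the connection as a context manager ensures the SSH session is
# closed even if a test step raises.
with Connection("ubuntu@203.0.113.10",
                connect_kwargs={"key_filename": "letstest.pem"}) as conn:
    conn.run("certbot --version")
    conn.sudo("certbot renew --dry-run")
```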
inejge
537bee0994 Add minimal proxy support for OCSP verification (#7892)
Translate a proxy specified by an environment variable ("http_proxy"
or "HTTP_PROXY") into options recognized by "openssl ocsp". Support
is limited to HTTP proxies which don't require authentication.

Fixes #6150
2020-04-09 11:25:39 -07:00
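A rough sketch of the idea (not Certbot's actual code): translate the environment variable into the common `openssl ocsp` proxy trick of pointing `-host` at the proxy and `-path` at the full responder URL. Flag support varies across OpenSSL versions, so treat this as illustrative only.

```python
import os
from urllib.parse import urlparse

def ocsp_args(responder_url: str) -> list:
    # Assumes the proxy is given as a full URL, e.g. "http://proxy.example:3128".
    proxy = os.environ.get("http_proxy") or os.environ.get("HTTP_PROXY")
    if not proxy:
        return ["-url", responder_url]
    parsed = urlparse(proxy)
    # Unauthenticated HTTP proxies only, matching the limitation above.
    return ["-host", "%s:%d" % (parsed.hostname, parsed.port or 80),
            "-path", responder_url]
```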
alexzorin
e9895d2ec6 Fix fullchain parsing for CRLF chains (#7877)
Fixes #7875 .

After [this comment](https://github.com/certbot/certbot/issues/7875#issuecomment-608145208) and evaluating the options, I opted to go with `stricttextualmsg`, as required by RFC 8555. Reasoning is that the ACME v1 code path (via OpenSSL) produces a `fullchain_pem` which satisfies `stricttextualmsg`, so we don't need to be more generous than that.

One downside of the `re` approach is that it doesn't seem capable of capturing repeating group matches. As a result, it matches each certificate individually, silently passing over any data in between the encapsulation boundaries, such as explanatory text, which is prohibited by RFC 8555.

It would be ideal to raise an error when encountering such a non-conformant chain, but we'd need to create a mini-parser to do it, I think.

* Fix fullchain parsing for CRLF chains.

fullchain parsing now works in two passes:

1. A first pass which is generous with what it accepts - basically
   preeb(CERTIFICATE)+anything+posteb(CERTIFICATE). This determines
   the boundaries for each certificate.
2. A second pass which normalizes (by parsing and re-encoding) each
   certificate found in the first pass.

* typo in docstring

* remove redundant group in regex

* can't use assertRaisesRegex until py27 is gone
2020-04-07 16:19:13 -07:00
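An illustrative two-pass sketch of the approach described above (not the exact regex or helpers used in Certbot):

```python
import re
from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.serialization import Encoding

# Pass 1: be generous about what sits between the encapsulation boundaries,
# and accept either CRLF or LF line endings.
CERT_RE = re.compile(
    b"-----BEGIN CERTIFICATE-----\r?\n.+?\r?\n-----END CERTIFICATE-----\r?\n?",
    re.DOTALL,
)

def split_fullchain(fullchain_pem: bytes):
    normalized = []
    for match in CERT_RE.finditer(fullchain_pem):
        # Pass 2: parse and re-encode each match, normalizing its PEM form.
        cert = x509.load_pem_x509_certificate(match.group(0), default_backend())
        normalized.append(cert.public_bytes(Encoding.PEM))
    return normalized
```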
Brad Warren
5992d521e2 Print boulder logs when boulder setup fails (#7885)
This is part of https://github.com/certbot/certbot/issues/7303.

* Print boulder logs if boulder fails to start

* Print description and fix command.

* Change output to stderr.
2020-04-07 15:26:19 -07:00
alexzorin
4ca86d9482 acme: socket timeout for HTTP standalone servers (#7388)
* acme: socket timeout for HTTP standalone servers

Adds a default 30 second timeout to the StreamRequestHandler for clients
connecting to standalone HTTP-01 servers. This should prevent most cases
of an idle client connection from preventing the standalone server from
shutting down.

Fixes #7386

* use idiomatic kwargs default value

* move HTTP01Server lower to fix mypy forward ref.

* fix test crash on macOS due to socket double-close

* maybe it's not an OSError?

* disable coverage check on useless branch
2020-04-01 23:53:58 +02:00
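The mechanism itself is tiny: socketserver applies a class-level `timeout` attribute to every accepted connection, so a minimal sketch (values and handler are illustrative) looks like:

```python
import http.server
import socketserver

class TimeoutHandler(http.server.BaseHTTPRequestHandler):
    timeout = 30  # seconds; socketserver calls settimeout() on each client socket

    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    with socketserver.TCPServer(("", 8080), TimeoutHandler) as httpd:
        httpd.handle_request()  # an idle client now times out instead of hanging
```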
Adrien Ferrand
bc3088121b Add a step to check powershell version in vs2017-win2016 (#7870)
Following discussion at https://github.com/certbot/certbot/pull/7539#issuecomment-572318805, this PR adds a check for Powershell version: we expect that the `vs2017-win2016` node that will test the installer has Powershell 5.x, and nothing else.

This ensures that at least one node of the pipeline is testing the installer with the lowest Powershell version supported by Certbot.

One full pipeline success can be seen here: https://dev.azure.com/adferrand/certbot/_build/results?buildId=713

I also created a failing pipeline on purpose, one that checks that Powershell 6.x is installed. Its result can be seen here: https://dev.azure.com/adferrand/certbot/_build/results?buildId=714
2020-03-31 13:12:42 -07:00
ohemorange
dbda499b08 Remove interactive redirect ask (#7861)
Fixes #7594.

Removes the code asking interactively if the user would like to add a redirect.

* Remove interactive redirect ask

* display.enhancements is no longer used, so remove it.

* update changelog

* remove references to removed display.enhancements

* add redirect_default flag to enhance_config to conditionally set default for redirect value

* Update default in help text.
2020-03-31 12:12:14 -07:00
Brad Warren
6df90d17ae Wait 5 minutes for boulder to start. (#7864) 2020-03-25 14:50:13 -07:00
alexzorin
e4a0edc7af Add a 10-second timeout to OCSP queries. (#7860)
* Add a 10-second timeout to OCSP queries.

Closes #7859

* Update CHANGELOG

* Fix test
2020-03-24 15:02:53 +01:00
m0namon
1285297b23 [Apache v2] Load apacheconfig tree and gate related tests (#7710)
* Load apacheconfig dependency, gate behind flag

* Bump apacheconfig dependency to latest version and install dev version of apache for coverage tests

* Move augeasnode_test tests to more generic parsernode_test

* Revert "Move augeasnode_test tests to more generic parsernode_test"

This reverts commit 6bb986ef786b9d68bb72776bde66e6572cf505a9.

* Mock AugeasNode into DualNode's place, and run augeasnode tests exclusively on AugeasNode

* Don't calculate coverage for skeleton functions

* clean up helper function in augeasnode_test
2020-03-23 17:05:22 -07:00
ohemorange
9e3c348dff Disable TLS session tickets in Apache (#7771)
Fixes #7350.

This PR changes the parsed modules from a `set` to a `dict`, with the filepath argument as the value. Accordingly, after calling `enable_mod` to enable `ssl_module`, modules now need to be re-parsed, so call `reset_modules`.

* Add mechanism for selecting apache config file, based on work done in #7191.

* Check OpenSSL version

* Remove os imports

* debian override still needs os

* Reformat remaining apache tests with modules dict syntax

* Clean up more apache tests

* Switch from property to method for openssl and add tests for coverage.

* Sometimes the dict location will be None in which case we should in fact return None

* warn thoroughly and consistently in openssl_version function

* update tests for new warnings

* read file as bytes, and factor out the open for testing

* normalize ssl_module_location path to account for being relative to server root

* Use byte literals in a python 2 and 3 compatible way

* string does need to be a literal

* patch builtins open

* add debug, remove space

* Add test to check if OpenSSL detection is working on different systems

* fix relative test location for cwd

* put </IfModule> on its own line in test case

* Revert test file to status in master.

* Call augeas load before reparsing modules to pick up the changes

* fix grep, tail, and mod_ssl location on centos

* strip the trailing whitespace from fedora

* just use LooseVersion in test

* call apache2ctl on debian systems

* Use sudo for apache2ctl command

* add check to make sure we're getting a version

* Add boolean so we don't warn on debian/ubuntu before trying to enable mod_ssl

* Reduce warnings while testing by setting mock _openssl_version.

* Make sure we're not throwing away any unwritten changes to the config

* test last warning case for coverage

* text changes for clarity
2020-03-23 16:49:52 -07:00
schoen
3bfaf41d3d Merge pull request #7849 from TechplexEngineer/patch-1
Fix plugin links
2020-03-16 11:51:45 -07:00
Brad Warren
06599a1e18 Cleanup more pylint issues (#7848)
This PR builds on #7657 and cleans up additional unnecessary pylint comments, as well as some stray comments referring to pylint: disable comments that had already been deleted, which I didn't notice in my review of that PR.

* Remove stray pylint link.

* Cleanup more pylint comments

* Cleanup magic_typing imports

* Remove unneeded pylint: enable comments
2020-03-16 09:43:48 -07:00
schoen
30ec4cafe1 Merge pull request #7797 from g6123/nginx-utf8
Use UTF-8 encoding for nginx plugin
2020-03-13 17:07:40 -07:00
schoen
c6d35549d6 Merge branch 'master' into nginx-utf8 2020-03-13 16:28:24 -07:00
Blake Bourque
9a256ca4fe Fix plugin links 2020-03-13 15:26:15 -04:00
Adrien Ferrand
809cb516c9 Fix acme compliance to RFC 8555 (#7176)
This PR is an alternative to #7125.

Instead of disabling the strict mode on Pebble, this PR fixes the JWS payloads to be compliant with RFC 8555, allowing certbot to work with Pebble v2.1.0+.

* Fix acme compliance to RFC 8555.

* Working mixin

* Activate back pebble strict mode

* Use mixin for type

* Update dependencies

* Fix also in fields_to_partial_json

* Update pebble

* Add changelog
2020-03-13 09:56:35 -07:00
Adrien Ferrand
07abe7a8d6 Reimplement tls-alpn-01 in acme (#6886)
This PR is the first part of work described in #6724.

It reintroduces the tls-alpn-01 challenge in the `acme` module, which was introduced by #5894 and reverted by #6100. The reason it was removed in the past is that some tests showed that with the `1.0.2` branch of OpenSSL, the self-signed certificate containing the authorization key is sent to the requester even if the ALPN protocol `acme-tls/1` was not declared as supported by the requester during the TLS handshake.

However, recent discussions led to the conclusion that this behavior is not a security issue, because first it is consistent with the behavior of servers that do not support ALPN at all, and second it cannot cause a tls-alpn-01 challenge to be validated in this kind of corner case.

On top of the original modifications given by #5894, I merged the code to be up-to-date with our `master`, and fixed the tests to match the recent change of no longer displaying the `keyAuthorization` in the deserialized JSON form of an ACME challenge.

I also moved the logic that verifies whether ALPN is available on the current system, and so whether the tls-alpn-01 challenge can be used, into a dedicated static function `is_available` in `acme.challenge.TLSALPN01`. This function is used in the related tests to skip them when needed, and will be used in the future by Certbot plugins to enable or disable the tls-alpn-01 logic depending on the OpenSSL version available to Python.

* Reimplement TLS-ALPN-01 challenge and standalone TLS-ALPN server from #5894.

* Setup a class method to check if tls-alpn-01 is supported.

* Add potential missing parameter in validation for tls-alpn

* Improve comments

* Make a class private

* Handle old versions of openssl that do not terminate the handshake when they should do.

* Add changelog

* Explicitly close the TLS connection by the book.

* Remove unused exception

* Fix lint
2020-03-12 13:53:19 -07:00
osirisinferi
2fd85a4f36 Add serial number to certificates output (#7842)
Fixes #7835

I had to mock out `get_serial_from_cert` to keep a test from failing, because `cert_path` was itself mocked in `test_report_human_readable`.

Also, I kept the same style for the serial number as the recent Let's Encrypt e-mail: lowercase hexadecimal without a `0x` prefix and without colons every 2 chars. Shouldn't be a problem to change the format if required.
2020-03-12 09:37:49 -07:00
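Illustrative only (not the exact helper in Certbot): that style corresponds to formatting the integer serial number as plain lowercase hex.

```python
from cryptography import x509
from cryptography.hazmat.backends import default_backend

def serial_hex(cert_pem: bytes) -> str:
    cert = x509.load_pem_x509_certificate(cert_pem, default_backend())
    # Lowercase hexadecimal, no "0x" prefix, no colon separators.
    return format(cert.serial_number, "x")
```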
Adrien Ferrand
44b97df4e9 Exposes environment variable to let hooks scripts know when the last challenge is handled (#7837)
Fixes #5484 

This PR makes Certbot expose two new environment variables in the auth and cleanup hooks of the `manual` plugin:
* `CERTBOT_REMAINING_CHALLENGES` contains the number of challenges that remain after the current one (so it equals 0 when the script is called for the last challenge)
* `CERTBOT_ALL_DOMAINS` contains a comma-separated list of all domains covered by a challenge for the current certificate

With these variables, a hook script can know when it is being run for the last time, and then trigger the appropriate finalizers for all challenges that have been executed. This will be particularly useful for certificates with a lot of domains validated with DNS-01 challenges: instead of waiting during each hook execution to check that the relevant DNS TXT entry has been inserted, these waits can be avoided by having the last hook verify all domains in one run.

* Inject environment variables in manual scripts about remaining challenges

* Adapt tests

* Less variables and less lines

* Update manual.py

* Update manual_test.py

* Add documentation

* Add changelog
2020-03-12 09:29:03 -07:00
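A hedged example of an auth hook using the new variables; `CERTBOT_DOMAIN` and `CERTBOT_VALIDATION` are the pre-existing manual-hook variables, and the two helpers are stand-in stubs for a real DNS API:

```python
#!/usr/bin/env python3
import os

def publish_txt_record(domain: str, value: str) -> None:
    print(f"would publish _acme-challenge.{domain} TXT {value}")  # stub

def wait_for_propagation(domains: list) -> None:
    print(f"would wait for TXT propagation on: {', '.join(domains)}")  # stub

domain = os.environ["CERTBOT_DOMAIN"]
validation = os.environ["CERTBOT_VALIDATION"]
remaining = int(os.environ["CERTBOT_REMAINING_CHALLENGES"])
all_domains = os.environ["CERTBOT_ALL_DOMAINS"].split(",")

publish_txt_record(domain, validation)

if remaining == 0:
    # Last invocation for this certificate: verify propagation once,
    # for every domain, instead of waiting during each run.
    wait_for_propagation(all_domains)
```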
radek-sprta
78168a5248 Add CloudDNS to third-party plugins (#7840) 2020-03-11 13:27:19 -07:00
Brad Warren
69aec55ead Remove --no-site-packages outside of certbot-auto. (#7832) 2020-03-09 13:05:35 -07:00
Brad Warren
7f63141e41 Add changes to the correct changelog entry (#7833)
https://github.com/certbot/certbot/pull/7742 and https://github.com/certbot/certbot/pull/7738 landed after our 1.2.0 release, but the 1.2.0 changelog entry was modified instead of the one for master/1.3.0.

This PR moves the changelog entries to the 1.3.0 section.
2020-03-06 09:46:30 -08:00
Brad Warren
d72a1a71d2 Fix issues with Azure Pipelines (#7838)
This PR fixes two issues.

First, it fixes #7814 by removing our tests on Windows Server 2012. I also added the sentence "Certbot supports Windows Server 2016 and Windows Server 2019." to https://community.letsencrypt.org/t/beta-phase-of-certbot-for-windows/105822.

Second, it fixes the test failures which can be seen at https://dev.azure.com/certbot/certbot/_build/results?buildId=1309&view=results by no longer manually installing our own version of Python and instead using the one provided by Azure.

These small changes are in the same PR because I wanted to fix test failures ASAP and `UsePythonVersion` is not available on Windows 2012. See https://github.com/certbot/certbot/pull/7641#discussion_r358510854.

You can see tests passing with this change at https://dev.azure.com/certbot/certbot/_build/results?buildId=1311&view=results.

* stop testing on win2012

* switch to UsePythonVersion
2020-03-05 11:50:52 -08:00
ohemorange
68f4ae12be Merge pull request #7831 from certbot/candidate-1.3.0
Update files from 1.3.0 release
2020-03-03 17:34:31 -08:00
Brad Warren
144d4f2b44 Bump version to 1.4.0 2020-03-03 12:43:04 -08:00
Brad Warren
e362948d45 Add contents to certbot/CHANGELOG.md for next version 2020-03-03 12:43:03 -08:00
Brad Warren
6edb4e1a39 Release 1.3.0 2020-03-03 12:43:02 -08:00
Brad Warren
b1fb3296e9 Update changelog for 1.3.0 release 2020-03-03 12:36:36 -08:00
Brad Warren
3147026211 Check OCSP as part of determining if the certificate is due for renewal (#7829)
Fixes #1028.

Doing this now because of https://community.letsencrypt.org/t/revoking-certain-certificates-on-march-4/.

The new `ocsp_revoked_by_paths` function is taken from https://github.com/certbot/certbot/pull/7649 with the optional argument removed for now because it is unused.

This function was added in this PR because `storage.py` uses `self.latest_common_version()` to determine which certificate should be looked at when determining renewal status at 9f8e4507ad/certbot/certbot/_internal/storage.py (L939-L947)

I think this is unnecessary and you could just look at the currently linked certificate, but I don't think we should change the logic that code has always had as part of this PR.

* Check OCSP status as part of determining to renew

* add integration tests

* add ocsp_revoked_by_paths
2020-03-03 11:07:15 -08:00
Michael Brown
9f8e4507ad Document safe and simple usage by services without root privileges (#7821)
Certificates are public information by design: they are provided by
web servers without any prior authentication required.  In a public
key cryptographic system, only the private key is secret information.

The private key file is already created as accessible only to the root
user with mode 0600, and these file permissions are set before any key
content is written to the file.  There is no window within which an
attacker with access to the containing directory would be able to read
the private key content.

Older versions of Certbot (prior to 0.29.0) would create private key
files with mode 0644 and rely solely on the containing directory
permissions to restrict access.  We therefore cannot (yet) set the
relevant default directory permissions to 0755, since it is possible
that a user could install Certbot, obtain a certificate, then
downgrade to a pre-0.29.0 version of Certbot, then obtain another
certificate.  This chain of events would leave the second
certificate's private key file exposed.

As a compromise solution, document the fact that it is safe for the
common case of non-downgrading users to change the permissions of
/etc/letsencrypt/{live,archive} to 0755, and explain how to use chgrp
and chmod to make the private key file readable by a non-root service
user.

This provides guidance on the simplest way to solve the common problem
of making keys and certificates usable by services that run without
root privileges, with no requirement to create a custom (and hence
error-prone) executable hook.

Remove the existing custom executable hook example, so that the
documentation contains only the simplest and safest way to solve this
very common problem.

Signed-off-by: Michael Brown <mbrown@fensystems.co.uk>
2020-02-27 16:44:23 -08:00
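A hedged stdlib equivalent of the documented chgrp/chmod steps; the lineage name and the service group below are placeholders, and this assumes you accept the pre-0.29.0 downgrade caveat explained above:

```python
import os
import shutil

# Let non-root services traverse the containing directories.
for directory in ("/etc/letsencrypt/live", "/etc/letsencrypt/archive"):
    os.chmod(directory, 0o755)

# Grant a single service group read access to one private key.
key_path = "/etc/letsencrypt/live/example.com/privkey.pem"  # placeholder lineage
shutil.chown(key_path, group="ssl-cert")  # placeholder group name
os.chmod(key_path, 0o640)
```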
Brad Warren
50ea608553 Don't run advanced tests on PRs. (#7820)
When I wrote https://github.com/certbot/certbot/pull/7813, I didn't understand the default behavior for pull requests if you don't specify `pr` in the yaml file. According to https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?view=azure-devops&tabs=yaml#pr-triggers:

> If no pr triggers appear in your YAML file, pull request builds are automatically enabled for all branches...

This is not the behavior we want. This PR fixes the problem by disabling builds on PRs.

You should be able to see this working because the advanced tests should not run on this PR but they did run on https://github.com/certbot/certbot/pull/7811.
2020-02-27 15:07:33 -08:00
Brad Warren
fa67b7ba0f Remove codecov (#7811)
After getting a +1 from everyone on the team, this PR removes the use of `codecov` from the Certbot repo because we keep having problems with it.

Two noteworthy things about this PR are:

1. I left the text at 4ea98d830b/.azure-pipelines/INSTALL.md (add-a-secret-variable-to-a-pipeline-like-codecov_token) because I think it's useful to document how to set up a secret variable in general.
2. I'm not sure what the text "Option -e makes sure we fail fast and don't submit to codecov." in `tox.cover.py` refers to but it seems incorrect since `-e` isn't accepted or used by the script so I just deleted the line.

As part of this, I said I'd open an issue to track setting up coveralls (which seems to be the only real alternative to codecov) which is at https://github.com/certbot/certbot/issues/7810.

With my change, failure output looks something like:
```
$ tox -e py27-cover
...
Name                                                         Stmts   Miss  Cover   Missing
------------------------------------------------------------------------------------------
certbot/certbot/__init__.py                                      1      0   100%
certbot/certbot/_internal/__init__.py                            0      0   100%
certbot/certbot/_internal/account.py                           191      4    98%   62-63, 206, 337
...
certbot/tests/storage_test.py                                  530      0   100%
certbot/tests/util_test.py                                     374     29    92%   211-213, 480-484, 489-499, 504-511, 545-547, 552-554
------------------------------------------------------------------------------------------
TOTAL                                                        14451    647    96%
Command '['/path/to/certbot/dir/.tox/py27-cover/bin/python', '-m', 'coverage', 'report', '--fail-under', '100', '--include', 'certbot/*', '--show-missing']' returned non-zero exit status 2
Test coverage on certbot did not meet threshold of 100%.
ERROR: InvocationError for command /Users/bmw/Development/certbot/certbot/.tox/py27-cover/bin/python tox.cover.py (exited with code 1)
_________________________________________________________________________________________________________________________________________________________ summary _________________________________________________________________________________________________________________________________________________________
ERROR:   py27-cover: commands failed
```
I printed the exception just so we're not throwing away information.

I think it's also possible we could fail for a reason other than not meeting the coverage threshold, but I've personally never seen this. The `coverage report` output is not being captured, so hopefully that would inform devs if something else is going on, and saying something like "Test coverage probably did not..." seems like overkill to me personally.

* remove codecov

* remove unused variable group

* remove codecov.yml

* Improve tox.cover.py failure output.
2020-02-27 14:44:39 -08:00
Brad Warren
6309ded92f Remove references to deprecated flags in Certbot. (#7509)
Related to https://github.com/certbot/certbot/pull/7482, this removes some references to deprecated options in Certbot.

The only references I didn't remove were:

* In `certbot/tests/testdata/sample-renewal*` which contains a lot of old values and I think there's even some value in keeping them so we know if we make a change that suddenly causes old renewal configuration files to error.
* In the Apache and Nginx plugins and I created https://github.com/certbot/certbot/issues/7508 to resolve that issue.
2020-02-27 14:43:28 -08:00
m0namon
5a4f158c55 Merge pull request #7541 from certbot/no-client-plugins
Fix docstring
2020-02-27 14:36:59 -08:00
Brad Warren
a2be8e1956 Fix tests on macOS Catalina (#7794)
This PR fixes the failures that can be seen at https://dev.azure.com/certbot/certbot/_build/results?buildId=1184&view=results.

You can see this code running on macOS Catalina at https://dev.azure.com/certbot/certbot/_build/results?buildId=1192&view=results.
2020-02-27 10:50:20 -08:00
Brad Warren
2f737ee292 Change how _USE_DISTRO is set for mypy (#7804)
If you run `mypy --platform darwin certbot/certbot/util.py` you'll get:
```
certbot/certbot/util.py:303: error: Name 'distro' is not defined
certbot/certbot/util.py:319: error: Name 'distro' is not defined
certbot/certbot/util.py:369: error: Name 'distro' is not defined
```
This is because mypy's logic for handling platform specific code is pretty simple and can't figure out what we're doing with `_USE_DISTRO` here. See https://mypy.readthedocs.io/en/stable/common_issues.html#python-version-and-system-platform-checks for more info.

Setting `_USE_DISTRO` to the result of `sys.platform.startswith('linux')` solves the problem without changing the overall behavior of our code here though.

This fixes part of https://github.com/certbot/certbot/issues/7803, but there's more work to be done on Windows.
2020-02-27 10:49:50 -08:00
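The pattern from the description above, in isolation (a sketch, not the full util.py):

```python
import sys

# Assigning the platform check directly (per the description above) lets mypy
# treat the distro import below as Linux-only.
_USE_DISTRO = sys.platform.startswith('linux')

if _USE_DISTRO:
    import distro
```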
Brad Warren
8c75a9de9f Remove unused notify code. (#7805)
This code is unused and hasn't been modified since 2015 except for various times our files have been renamed. Let's remove it.
2020-02-27 10:47:56 -08:00
Brad Warren
24aa1e9127 update letstest reqs (#7809)
I don't fully understand why, but since I updated my macbook to macOS Catalina, the test script currently fails to run for me with the versions of our dependencies we have pinned. Updating the dependencies solves the problem though and you can see Travis also successfully running tests with these new dependencies at https://travis-ci.com/certbot/certbot/builds/150573696.
2020-02-27 10:47:43 -08:00
Brad Warren
f4c0a9fd63 Split advanced pipeline (#7813)
I want to do what I did in https://github.com/certbot/certbot/pull/7733 to our Azure Pipelines setup, but unfortunately this isn't currently possible. The only filters available for service hooks for the "build completed" trigger are the pipeline and build status. See 
![Screen Shot 2020-02-26 at 3 04 56 PM](https://user-images.githubusercontent.com/6504915/75396464-64ad0780-58a9-11ea-97a1-3454a9754675.png)

To accomplish this, I propose splitting the "advanced" pipeline into two cases. One is for builds on protected branches where we want to be notified if they fail, while the other is just used to manually run tests on certain branches.
2020-02-27 10:43:41 -08:00
m0namon
f169c37153 Merge pull request #7742 from osirisinferi/force-non-restrictive-umask
Force non restrictive umask when creating challenge directory in Apache plugin
2020-02-26 17:09:20 -08:00
cumul0529
a489079208 Update parser test to better assert logging output 2020-02-25 13:26:36 +09:00
cumul0529
ddf68aea80 Update comment in testdata file 2020-02-25 13:21:10 +09:00
cumul0529
2ae090529e Fixed typo & some trivial documentation change 2020-02-25 13:18:03 +09:00
Brad Warren
4ea98d830b remove _internal docs (#7801) 2020-02-24 21:31:16 +01:00
martin-c
4fd04366aa Fix issue #7165 in _create_challenge_dirs(), attempt to fix pylint errors (#7568)
* fix issue #7165 by checking if directory exists before trying to create it, fix possible pylint issues in webroot.py

* fix get_chall_pref definition

* Update CHANGELOG.md

* Update CHANGELOG.md

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-23 22:14:51 +01:00
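The core of the first bullet, as a minimal sketch (path and mode are illustrative):

```python
import os

challenge_dir = "/var/www/html/.well-known/acme-challenge"

# Only create the challenge directory when it does not already exist,
# instead of unconditionally recreating it.
if not os.path.isdir(challenge_dir):
    os.makedirs(challenge_dir, 0o755)
```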
alexzorin
2633c3ffb6 acme: ignore params in content-type check (#7342)
* acme: ignore params in content-type check

Fixes the warning in #7339

* Suppress coverage complaint in test

* Update CHANGELOG

* Repair symlink

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-23 21:49:42 +01:00
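Illustratively, ignoring parameters means comparing only the media type portion of the header (a sketch, not the acme module's code):

```python
def media_type(content_type: str) -> str:
    # "application/json; charset=utf-8" -> "application/json"
    return content_type.split(";", 1)[0].strip().lower()

assert media_type("application/problem+json; charset=utf-8") == "application/problem+json"
```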
cumul0529
5b29e4616c Add simple comments 2020-02-24 05:30:54 +09:00
cumul0529
32904d8c9e Add TestCase.assertLogs() backport for Python 2.7 2020-02-24 05:13:34 +09:00
cumul0529
d68f37ae88 Add this change to CHANGELOG.md 2020-02-24 02:56:43 +09:00
cumul0529
b3071aab29 Add my name to AUTHORS.md
:)
2020-02-24 02:52:41 +09:00
cumul0529
2aac24c982 Trivial code clean-up 2020-02-24 02:49:08 +09:00
cumul0529
20df5507ae Add logging test for _parse_files() 2020-02-24 02:48:55 +09:00
cumul0529
36311a276b Add test case for _parse_ssl_options() 2020-02-24 02:46:27 +09:00
cumul0529
22685ef86f Remove unicode_support/ path in test case 2020-02-24 02:23:46 +09:00
cumul0529
c3cfd412c9 Replace deprecated logger.warn() with logger.warning() 2020-02-24 01:47:12 +09:00
Seth Schoen
0b21e716ca Fix lint problems with long lines 2020-02-24 01:45:23 +09:00
cumul
8b90b55518 Added test for valid/invalid unicode characters 2020-02-24 01:35:00 +09:00
cumul
247d9cd887 Use io module instead of codecs
See https://mail.python.org/pipermail/python-list/2015-March/687124.html
2020-02-24 01:29:37 +09:00
cumul
d6ef34a03e Use UTF-8 encoding for nginx plugin 2020-02-24 01:25:16 +09:00
osirisinferi
9819443440 Add test 2020-02-22 15:22:27 +01:00
Raklyon
84b57fac93 Refactor cli.py, splitting it into smaller submodules (#6803)
* Refactor cli.py into a package with submodules

* Added unit tests for helpful module in cli.

* Fixed linter errors

* Fixed pylint issues

* Updated changelog.md

* Fixed failing test and mypy error. A new pylint error appeared (it seems to be in conflict with mypy)

mypy requires zope.interface to be imported, but once imported it is not used, so pylint throws an error.

* Fixed pylint errors

* Apply changes to cli since last merge from master (efc8d49806)

* Fix lint

* Remaining lint errors

Co-authored-by: Adrien Ferrand <adferrand@users.noreply.github.com>
2020-02-21 21:30:58 +01:00
Brad Warren
7d79c91e9b Move our macOS tests to Azure Pipelines (#7793)
[Our macOS tests are failing](https://travis-ci.com/certbot/certbot/builds/149965318) again this time due to the problem described at https://travis-ci.community/t/macos-build-fails-because-of-homebrew-bundle-unknown-command/7296/14.

I tried adding `update: true` to the Homebrew config as described in that thread, but [it didn't work](https://travis-ci.com/certbot/certbot/builds/150070374). I also tried updating the macOS image we use which [didn't work](https://travis-ci.com/certbot/certbot/builds/150072389).

Since we continue to have problems with macOS on Travis, let's try moving the tests to Azure Pipelines.

* test macos

* Remove Travis macOS setup

* add displayName
2020-02-21 11:18:53 -08:00
Brad Warren
c883efde0f add pgp key docs (#7765)
Fixes #7613.
2020-02-20 14:35:47 -08:00
Brad Warren
42dda355c5 Correct AutoHSTS docs (#7767)
domains is a list of strings, not a single string.

* Correct AutoHSTS docs.

* Fix Apache enable_autohsts docs.
2020-02-18 14:54:07 -08:00
Brad Warren
99b1538d0a Fix spurious pylint errors. (#7780)
This fixes (part of) the problem identified in https://github.com/certbot/certbot/pull/7657#issuecomment-586506340.

When I tested our pylint setup on Python 3.5.9, 3.6.9, or 3.6.10, tests failed with:
```
************* Module acme.challenges
acme/acme/challenges.py:57:15: E1101: Instance of 'UnrecognizedChallenge' has no 'jobj' member (no-member)
************* Module acme.jws
acme/acme/jws.py:28:16: E1101: Class 'Signature' has no '_orig_slots' member (no-member)
```
These errors did not occur for me on Python 3.6.7 or Python 3.7+.

You also cannot run our lint setup on Python 2.7 because our pinned version of pylint's dependency `astroid` does not support Python 2. Because of this, `pylint` is not installed in the virtual environment created by `tools/venv.py` and our [`lint` environment in tox specifies that Python 3 should be used](fd64c8c33b/tox.ini (L132)).

I tried updating pylint and its dependencies to fix the problem, but the errors still occur, so I think adding these disable comments back on these lines is the best fix for now.
2020-02-18 11:55:48 -08:00
Brad Warren
fd64c8c33b Remove letshelp-certbot (#7761)
* remove references to letshelp

* remove letshelp files

* Remove line continuation

Co-authored-by: ohemorange <ebportnoy@gmail.com>
2020-02-14 17:19:19 -08:00
Brad Warren
3f52695ec2 more robustly stop patches (#7763) 2020-02-14 17:18:53 -08:00
Adrien Ferrand
fc7e5e8e60 Remove useless pylint error suppression directives (#7657)
As pylint evolves, its accuracy improves, and several pylint error suppressions (`# pylint: disable=ERROR`) added to the certbot codebase months or years ago are no longer needed to make it happy.

There is a pylint check (disabled by default) to detect useless suppressions (pylint-ception: `useless-suppression`). It is not working perfectly (it also has false positives...), but it is a good start for cleaning the codebase.

This PR removes several of these useless suppressions as detected by the current pylint version we use.

* Remove useless suppress

* Remove useless lines
2020-02-13 13:56:16 -08:00
m0namon
bcaee66b0a Merge pull request #7766 from certbot/min-pyparsing-version
Clarify the minimum pyparsing version
2020-02-12 13:26:10 -08:00
Brad Warren
df584a3b90 Remove _internal from docstring. 2020-02-12 13:12:03 -08:00
Brad Warren
7d540fc33a update pyparsing comment 2020-02-11 14:44:37 -08:00
Brad Warren
605ef40656 Remove duplicate pyparsing pin 2020-02-11 14:20:29 -08:00
Brad Warren
b8856ac810 Fix unpinned tests (#7760)
Our nightly tests failed last night due to a new release of `virtualenv` and `pip`'s lack of dependency resolution: https://travis-ci.com/certbot/certbot/jobs/285797857#L280. It looks like we were not the only ones affected by this problem: https://github.com/pypa/virtualenv/issues/1551

This fixes the problem by using `-I` to skip the logic where `pip` decides a dependency is already satisfied and has it reinstall/update the packages passed to `pip` and all of their dependencies.

You can see our nightly tests passing with this change at https://github.com/certbot/certbot/runs/439231061.
2020-02-11 12:05:29 -08:00
Brad Warren
02bf7d7dfc Print script output in case of a failure. (#7759)
These tests failed at https://travis-ci.com/certbot/certbot/jobs/285202481 but do not include any output from the script about what went wrong, because the string created from `subprocess.CalledProcessError` does not include the value of its output.

This PR fixes that by printing these values which `pytest` will include in the output if the test fails.
2020-02-10 11:01:17 -08:00
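A hedged sketch of the approach: print the captured output before re-raising, so pytest shows it in the failure report.

```python
import subprocess

def run_script(command):
    try:
        return subprocess.check_output(command, stderr=subprocess.STDOUT)
    except subprocess.CalledProcessError as error:
        # str(CalledProcessError) omits the captured output, so print it here.
        print(error.output.decode())
        raise
```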
Joona Hoikkala
e6f050dbe9 Move ocsp.py to public api (#7744)
We should move ocsp.py to the public API, as an upcoming OCSP prefetching functionality in the Apache plugin relies on it, and as the plugins are not released in lockstep with the Certbot core, we need to be careful when changing those APIs.

* Move ocsp.py to public api

* Fix type annotations, move to pointing to an interface and fix linting

* Add certbot.ocsp to documentation table of contents

* Modify tests to reflect the changes in ocsp.py

* Add changelog entry

* Fix notAfter mock for tests
2020-02-10 09:52:42 -08:00
Brad Warren
5607025e9b Really remove old docs link from README (#7758) 2020-02-07 12:58:15 -08:00
Brad Warren
7cc6cf2604 Remove link to letsencrypt readthedocs (#7757)
After a brief discussion in Mattermost, I shut down letsencrypt.readthedocs.io. Turns out we were linking to it in our README here so let's remove the broken link.

I didn't update the link to point to one of the readthedocs projects we still have because our main Certbot docs are self-hosted rather than being on readthedocs.
2020-02-07 11:04:07 -08:00
Brad Warren
86a6cc53cf Remove text that certbot.tests.utils isn't public (#7754) 2020-02-07 09:08:41 +01:00
Brad Warren
1859fb059d Don't display todo comments in docs (#7753)
Currently if you go to https://certbot.eff.org/docs/api/certbot.crypto_util.html, there is a todo comment displayed at the top of the page. These todos were written for developers, not users, so I do not think they should be shown in our documentation.

This PR makes the quick and easy fix of configuring Sphinx not to show these todo items. I created #7752 to track removing all of these todos from our docstrings and disabling the Sphinx todo extension.

* Set todo_include_todos=False in sphinx-quickstart

* Remove todos from existing docs.
2020-02-06 15:39:47 -08:00
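The Sphinx setting named in the first bullet, as a minimal conf.py excerpt (the extensions line is just for context):

```python
# conf.py
extensions = ["sphinx.ext.todo"]

# Keep the todo directives in the sources but do not render them in the docs.
todo_include_todos = False
```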
ohemorange
c5a2ba03da Merge pull request #7735 from certbot/apache-parser-v2
[Apache v2] Merge apache-parser-v2 feature branch back to master
2020-02-06 15:29:28 -08:00
schoen
995e70542a Merge pull request #7738 from osirisinferi/nginx-hostname
[nginx] Parse $hostname in `server_name`
2020-02-06 14:44:03 -08:00
OsirisInferi
4f80f8b910 Fixing existing tests 2020-02-06 21:24:25 +01:00
OsirisInferi
0e03f82733 Remove todo:: 2020-02-06 21:12:17 +01:00
OsirisInferi
5035a510a2 Add test for $hostname parsing 2020-02-06 21:10:41 +01:00
Adrien Ferrand
ef388a309f Merge pull request #7751 from Pilifer/master
Don't verify certificate in HTTP01Response.simple_verify (certbot#6614)
2020-02-06 16:58:39 +01:00
Filip Lajszczak
c98183c998 restore CHANGELOG in root directory 2020-02-06 15:27:20 +00:00
Filip Lajszczak
2b051dd197 Merge branch 'master' of https://github.com/certbot/certbot 2020-02-06 15:14:17 +00:00
Brad Warren
7da5196206 Add triggers for only a single CI system (#7748)
* Configure travis-test to only run on Travis.

* Configure azure-test to only run on Azure.

* Add docs and comments to keep it up-to-date.
2020-02-05 23:49:01 +01:00
Brad Warren
cc764b65c1 Set recreate = true in tox.ini. (#7746)
Fixes #7745.
2020-02-05 14:37:39 -08:00
Adrien Ferrand
7b35abbcb4 Windows installer integration tests (#7724)
As discussed in #7539, we need proper tests of the Windows installer itself in order to verify that all the logic contained in a production-grade runtime of Certbot on Windows is correctly set up by each version of the installer, across a variety of Windows OSes.

This PR handles this requirement. The new `windows_installer_integration_tests` module in `certbot-ci` will:
* run the given Windows installer
* check that Certbot is properly installed and working
* check that the scheduled renew task is set up
* check that the scheduled task actually launches the Certbot renew logic

The Windows nightly tests are updated accordingly, in order to have the tests run on Windows Server 2012R2, 2016 and 2019.

These tests will evolve as we add more logic to the installer.

* Configure an integration test testing the windows installer

* Write the test module

* Configurable installer path, prepare azure pipelines

* Fix option

* Update test_main.py

* Add confirmation for this destructive test

* Use regex to validate certbot --version output

* Explicit dependency on a log output

* Use an exception to ask confirmation

* Use --allow-persistent-changes
2020-02-05 14:12:29 -08:00
Brad Warren
6601d03ce8 Merge pull request #7743 from certbot/candidate-1.2.0
Candidate 1.2.0
2020-02-05 13:48:51 -08:00
OsirisInferi
d3a4b8fd8c Missing import 2020-02-05 22:27:12 +01:00
OsirisInferi
f3ed133744 Wrap makedirs() within exception handlers 2020-02-05 22:17:29 +01:00
m0namon
1a2189f4df Merge pull request #7729 from certbot/fix-nginx-typo
Fix a typo in Nginx
2020-02-04 17:00:13 -08:00
Erica Portnoy
6a4b610269 Bump version to 1.3.0 2020-02-04 14:01:04 -08:00
Erica Portnoy
97ae63efa6 Add contents to certbot/CHANGELOG.md for next version 2020-02-04 14:01:03 -08:00
Erica Portnoy
3907b53b4b Release 1.2.0 2020-02-04 14:01:02 -08:00
Erica Portnoy
6c5959d892 Update changelog for 1.2.0 release 2020-02-04 13:46:57 -08:00
OsirisInferi
601a114d1b Update changelog 2020-02-04 19:47:27 +01:00
OsirisInferi
86926dff92 Use unrestrictive umask for challenge directory 2020-02-04 19:27:27 +01:00
OsirisInferi
9b35dbf2be Forgot to remove a breakpoint() statement 2020-02-02 22:31:05 +01:00
OsirisInferi
05e35ff2e0 Add CHANGELOG entry 2020-02-02 21:59:03 +01:00
OsirisInferi
7d0651c315 Parse $hostname in server_name 2020-02-02 21:56:09 +01:00
Brad Warren
174fa0e05c Turn off Travis notifications in test branches. (#7733)
When I want to manually run the full test suite to test something, I've been manually deleting our notification setup from `.travis.yml` to avoid spamming IRC with my personal test failures.

This PR sets this behavior up to happen automatically by turning off IRC notifications in test branches. You can see this working by noticing the IRC notification section in the bottom of the config for this PR at https://travis-ci.com/certbot/certbot/builds/146827907/config and the fact that it is absent from a `test-` branch based on this one at https://travis-ci.com/certbot/certbot/jobs/282059094/config.
2020-01-30 13:26:39 -08:00
Brad Warren
8d9943cb08 Update instructions about how to build docs (#7605) 2020-01-30 11:47:48 -08:00
Brad Warren
715899d5a8 Merge pull request #7731 from certbot/master_to_ap2
[Apache v2] Merge master to apache-parser-v2
2020-01-30 10:21:16 -08:00
Joona Hoikkala
882335c7ec Merge remote-tracking branch 'origin/master' into ap2_to_master 2020-01-30 17:08:16 +02:00
Brad Warren
35fa4c0457 Add space between words. 2020-01-29 15:30:51 -08:00
ohemorange
11e402893f Remove SSLCompression off line from all config options (#7726)
Based on discussion at https://github.com/certbot/certbot/pull/7712#discussion_r371451761.

* Remove SSLCompression off line from all config options

* Update changelog
2020-01-29 15:21:17 -08:00
Brad Warren
2338ab36fd Add backwards compatibility docs (#7611)
Fixes #7463.

* Add backwards compatibility docs.

* Exclude certbot-auto
2020-01-27 13:13:38 -08:00
Cameron Steel
e3c996de10 dns-cloudflare: Implement limited-scope API Tokens (#7583)
A while ago Cloudflare added support for limited-scope API Tokens in place of using a global API key, but support for them in cloudflare/python-cloudflare took a while to get through.

In summary, this PR:
- Implements token functionality through the INI file parameter `dns_cloudflare_api_token` (in addition to the traditional `dns_cloudflare_email` and `dns_cloudflare_api_key`). This needed a more advanced parameter validator than the built-in `required_variables` mechanism.
- Updates the docs to reflect the new option, needed token permissions, and version details of the `cloudflare` module

* Update python-cloudflare version

* Add Cloudflare API Token support to certbot-dns-cloudflare

* Add token-specific errors to certbot-dns-cloudflare

* Tidy up certbot-dns-cloudflare

* Implement Cloudflare API Tokens in testing for certbot-dns-cloudflare (needs work)

* Further tidying of certbot-dns-cloudflare

* Update CHANGELOG with Cloudflare API Tokens implementation

* Improve testing of certbot-dns-cloudflare

* Improve certbot-dns-cloudflare test formatting

* Further improve testing for certbot-dns-cloudflare

* Change needed permissions for token

* Add documentation regarding python-cloudflare version

* Fix changelog, references to python-cloudflare and docs

* Fix behaviour when domain does not match cloudflare root domain. Improve error handling.

* Improve testing

* Improve hints and error handling
2020-01-24 15:25:03 -08:00
Brad Warren
b8a9dd75eb Update dns-lexicon version. (#7723) 2020-01-25 00:02:57 +01:00
Brad Warren
2072599bd7 Unpin Python 3.4 dependencies (#7709)
* Unpin dependencies pinned back for py3.4 support.

* update pinned packages

* run build.py

* Update boto3 and deps to work with requests
2020-01-24 23:02:54 +01:00
ohemorange
b1a8e7175b Disable old SSL versions and ciphersuites to follow Mozilla recommendations in Apache (#7712)
Part of #7204.

Makes the smaller changes described at https://github.com/certbot/certbot/issues/7204#issuecomment-571838185 to disable many old ciphersuites and TLS versions < 1.2. Does not add checks for OpenSSL version or modify session tickets.

Since Apache uses TLS protocol blacklisting instead of whitelisting (as in NGINX), we additionally may not need to determine if the server supports TLS1.3 and turn it on or off based on Apache version.

* Update SSL versions and ciphersuites based on Mozilla intermediate recommendations for apache

* Update constants with hashes of new config files

* Update changelog
2020-01-24 13:37:42 -08:00
Brad Warren
1e2f70b17a Drop Python 3.4 support (#7721)
Fixes #7393.

* Remove Python 3.4 classifiers

* Remove unneeded typing dependency

* Exclude Python 3.4 in python_requires

* Remove Python 3.4 deprecation warning

* update changelog
2020-01-24 12:32:07 -08:00
ohemorange
896c1e0b66 Remove ECDHE-RSA-AES128-SHA from NGINX ciphers list (#7719)
As mentioned in https://github.com/certbot/certbot/pull/7712#discussion_r370419867, it's time to remove this ciphersuite now that Windows 2008 R2 and Windows 7 are EOLed.

* Remove ECDHE-RSA-AES128-SHA from NGINX ciphers list to celebrate Windows 2008 R2 deprecation

* Update changelog
2020-01-24 10:09:28 -08:00
Hugo van Kemenade
2f24726d4c Fix collections.abc imports for Python 3.9 (#7707)
* Fix collections.abc imports for Python 3.9

* Update AUTHORS.md

* No longer ignore collections.abc deprecation warning

* Update changelog

* Remove outdated comment

* Disabling no-name-in-module not needed as linting is on Python 3
2020-01-24 14:13:58 +01:00
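The usual shape of such a fix, sketched here with Mapping as an example (Certbot's exact classes may differ):

```python
try:
    # The aliases in the top-level collections module are removed in newer Python 3.
    from collections.abc import Mapping
except ImportError:  # pragma: no cover
    from collections import Mapping  # Python 2
```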
Amjad Mashaal
5f315b46e9 Update documentation files to remove claiming support for Python 3.4 (#7395) 2020-01-23 16:35:39 -08:00
Josh McCullough
a342eb5546 fixes #1948 -- MD5 on FIPS systems (#7708)
* use MD5 in non-security mode to get around FIPS issue

* update CHANGELOG

* add myself to AUTHORS

* ignore hashlib params
2020-01-23 10:58:36 -08:00
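A hedged sketch of the non-security MD5 call mentioned above; the `usedforsecurity` flag exists on Python 3.9+ and on FIPS-patched interpreters, hence the fallback:

```python
import hashlib

def md5_digest(data: bytes) -> str:
    try:
        hasher = hashlib.md5(data, usedforsecurity=False)
    except TypeError:
        # Interpreter without the usedforsecurity parameter.
        hasher = hashlib.md5(data)
    return hasher.hexdigest()
```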
Brad Warren
90fd1afc38 unpin macos (#7705) 2020-01-22 08:20:52 +01:00
Brad Warren
4473fd25cb Don't run Python 3.5 tests twice. (#7704) 2020-01-22 08:18:21 +01:00
Brad Warren
a6772043d6 Minor release script improvements (#7697)
* Do not use git diff.

* Add a warning on exit.
2020-01-21 15:53:31 -08:00
Amjad Mashaal
7234d8922d Drop Travis tests for Python 3.4 (#7394) 2020-01-21 15:34:34 -08:00
Brad Warren
07dc2400eb Downgrade NSIS and upgrade Python (#7702)
* Add --allow-downgrade to chocolatey command.

* Upgrade tests to use Python 3.8.1.
2020-01-21 23:53:19 +01:00
Ville Skyttä
1702cb90fd Spelling and grammar fixes (#7695) 2020-01-17 18:55:51 +01:00
Ville Skyttä
fcdeaf48f2 Include added/deleted TXT record name in RFC 2136 debug log (#7696) 2020-01-17 16:42:10 +02:00
Brad Warren
702ad99090 Don't run some tests multiple times. (#7685) 2020-01-16 23:08:38 +01:00
Brad Warren
5f0703cbf1 Fix minimum certbot version in plugins (#7684)
Fixes the problem found at https://github.com/certbot/certbot/pull/7682#discussion_r367140415.
2020-01-16 13:54:25 -08:00
Brad Warren
9a3186a67e Cleanup disabled warnings list in pytest.ini. (#7690) 2020-01-16 22:47:23 +01:00
Brad Warren
91ce42ce9c Do not list the name twice. (#7689) 2020-01-16 22:44:08 +01:00
Brad Warren
097c76f512 Merge branch 'master' into no-client-plugins 2020-01-16 11:56:41 -08:00
osirisinferi
6e07e8b5c0 Add missing directory field (#7687)
Fixes #7683.

* Add missing directory field to error message

* Added change to CHANGELOG.md
2020-01-16 11:31:22 -08:00
Brad Warren
fd91643a7f Merge pull request #7682 from certbot/candidate-1.1.0
Update files from 1.1.0 release
2020-01-15 16:14:22 -08:00
Brad Warren
619b17753e Bump version to 1.2.0 2020-01-14 10:52:05 -08:00
Brad Warren
60cd920bcb Add contents to certbot/CHANGELOG.md for next version 2020-01-14 10:52:05 -08:00
Brad Warren
f512b5eaa2 Release 1.1.0 2020-01-14 10:52:03 -08:00
Brad Warren
9800e5d8fc Update changelog for 1.1.0 release 2020-01-14 10:41:32 -08:00
Adrien Ferrand
e84ed49c56 Fix certbot-auto regarding python 3.4 -> python 3.6 migration for CentOS 6 users (#7519)
* Revert "Add back Python 3.4 support (#7510)"

This reverts commit 9b848b1d65.

* Fix certbot-auto

* Use a more consistent way to enable rh-python36

* Avoid to call CompareVersions unecessarily

* Control rh-python36 exit code

* Fix travis config

* Remove vscode config

* Ignore vscode

* Fix merge conflicts regarding #7587 (#70)

* Add changelog entry

* Finish sentence

* Update certbot/CHANGELOG.md

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Joona Hoikkala <joohoi@users.noreply.github.com>

* Update comments

* Improve warning message

* Update changelog

Co-authored-by: Joona Hoikkala <joohoi@users.noreply.github.com>
2020-01-13 09:24:41 +01:00
Brad Warren
ceea41c1e2 Do not document private members (#7675)
It looks like we're currently documenting functions that are marked private (prefixed with an underscore) such as https://certbot.eff.org/docs/api/certbot.crypto_util.html#certbot.crypto_util._load_cert_or_req. I do not think we should do this because the functionality is private, should not be used, and including it in our docs just adds visual noise.

This PR stops us from documenting private code and fixes up `tools/sphinx-quickstart.sh` so we don't document it in future modules.

* Do not document private code.

* Don't document private members in the future.
2020-01-10 16:48:01 -08:00
Vladimir Varlamov
456122e342 improve help about supply selecting in delete command (#7673)
for #6625
2020-01-09 11:34:04 -08:00
Adrien Ferrand
84c1b912d9 Implement a sunset mechanism in certbot-auto for systems not supported anymore (#7587)
* Sunset mechanism

* Simplify code

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update template

* Deprecate for all RHEL/CentOS 6 32bits flavors

* Add a wrapper to uname to do tests on fake 32 bits versions

* Replace all occurrences

* Add some tests about sunset mechanism

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Various corrections

* Recreate script

* Update comment position

* Test also install only

* Fix docker

* Update letsencrypt-auto-source/tests/centos6_tests.sh

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* What is the error command doing here?

* Fix permissions

* Rebuild script

* Add changelog

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update changelog

* Trigger CI

* Handle old venv path

* Modify test

* Fix test error detection from subpaths

* Edit echo

* Use set -e

* Update letsencrypt-auto-source/letsencrypt-auto.template

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Corrections

Co-authored-by: Brad Warren <bmw@users.noreply.github.com>
2020-01-08 16:36:34 +01:00
Brad Warren
2bc64183a8 fix docstring 2019-11-11 17:11:47 -08:00
Giles Thomas
b27e5804b9 Added change description to CHANGELOG.md 2019-02-05 18:43:03 +00:00
Giles Thomas
4ca03aec8d Don't verify existing certificate in HTTP01Response.simple_verify (certbot#6614) 2019-02-05 18:37:09 +00:00
423 changed files with 10269 additions and 7202 deletions

View File

@@ -69,12 +69,12 @@ Access can be defined for all or only selected repositories, which is nice.
```
- Redirected to Azure DevOps, select the account created in _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same than the targetted github repo)
- Select the organization, and click "Create a new project" (let's name it the same than the targeted github repo)
- The Visibility is public, to profit from 10 parallel jobs
```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in term of beeing able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
Azure Pipelines needs access to the GitHub account (in term of being able to check it is valid), and the Resources shared between the GitHub account and Azure Pipelines.
```
_Done. We can move to pipelines configuration._

View File

@@ -0,0 +1,13 @@
# Advanced pipeline for running our full test suite on demand and for release branches.
trigger:
- '*.x'
# When changing these triggers, please ensure the documentation under
# "Running tests in CI" is still correct.
- test-*
pr: none
stages:
- template: templates/stages/test-and-package-stage.yml
# Notify failures only for release branches.
- ${{ if not(startsWith(variables['Build.SourceBranchName'], 'test-')) }}:
- template: templates/stages/notify-failure-stage.yml

View File

@@ -1,20 +0,0 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
- test-*
- '*.x'
pr:
- test-*
# This pipeline is also nightly run on master
schedules:
- cron: "0 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
jobs:
# Any addition here should be reflected in the release pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml

View File

@@ -1,12 +1,7 @@
trigger:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
trigger: none
pr:
- apache-parser-v2
- master
- '*.x'
jobs:
- template: templates/tests-suite.yml
- template: templates/jobs/standard-tests-jobs.yml

View File

@@ -0,0 +1,15 @@
# Nightly pipeline running each day for master.
trigger: none
pr: none
schedules:
- cron: "30 4 * * *"
displayName: Nightly build
branches:
include:
- master
always: true
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml

View File

@@ -5,9 +5,8 @@ trigger:
- v*
pr: none
jobs:
# Any addition here should be reflected in the advanced pipeline.
# It is advised to declare all jobs here as templates to improve maintainability.
- template: templates/tests-suite.yml
- template: templates/installer-tests.yml
- template: templates/changelog.yml
stages:
- template: templates/stages/test-and-package-stage.yml
- template: templates/stages/changelog-stage.yml
- template: templates/stages/deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml

View File

@@ -1,14 +0,0 @@
jobs:
- job: changelog
pool:
vmImage: vs2017-win2016
steps:
- bash: |
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: changelog
displayName: Publish changelog

View File

@@ -1,54 +0,0 @@
jobs:
- job: installer_build
pool:
vmImage: vs2017-win2016
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.7
architecture: x86
addToPath: true
- script: python windows-installer/construct.py
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
win2016:
imageName: vs2017-win2016
win2012r2:
imageName: vs2015-win2012r2
pool:
vmImage: $(imageName)
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe /S
displayName: Install Certbot
- powershell: Invoke-WebRequest https://www.python.org/ftp/python/3.8.0/python-3.8.0-amd64-webinstall.exe -OutFile C:\py3-setup.exe
displayName: Get Python
- script: C:\py3-setup.exe /quiet PrependPath=1 InstallAllUsers=1 Include_launcher=1 InstallLauncherAllUsers=1 Include_test=0 Include_doc=0 Include_dev=1 Include_debug=0 Include_tcltk=0 TargetDir=C:\py3
displayName: Install Python
- script: |
py -3 -m venv venv
venv\Scripts\python tools\pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run integration tests

View File

@@ -0,0 +1,96 @@
jobs:
- job: extended_test
variables:
- name: IMAGE_NAME
value: ubuntu-18.04
- group: certbot-common
strategy:
matrix:
linux-py36:
PYTHON_VERSION: 3.6
TOXENV: py36
linux-py37:
PYTHON_VERSION: 3.7
TOXENV: py37
linux-py37-nopin:
PYTHON_VERSION: 3.7
TOXENV: py37
CERTBOT_NO_PIN: 1
linux-boulder-v1-integration-certbot-oldest:
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-certbot-oldest:
TOXENV: integration-certbot-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-integration-nginx-oldest:
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v1
linux-boulder-v2-integration-nginx-oldest:
TOXENV: integration-nginx-oldest
ACME_SERVER: boulder-v2
linux-boulder-v1-py27-integration:
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py27-integration:
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py35-integration:
PYTHON_VERSION: 3.5
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py35-integration:
PYTHON_VERSION: 3.5
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py36-integration:
PYTHON_VERSION: 3.6
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py37-integration:
PYTHON_VERSION: 3.7
TOXENV: integration
ACME_SERVER: boulder-v2
linux-boulder-v1-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v1
linux-boulder-v2-py38-integration:
PYTHON_VERSION: 3.8
TOXENV: integration
ACME_SERVER: boulder-v2
nginx-compat:
TOXENV: nginx_compat
le-auto-jessie:
TOXENV: le_auto_jessie
le-auto-centos6:
TOXENV: le_auto_centos6
le-auto-oraclelinux6:
TOXENV: le_auto_oraclelinux6
docker-dev:
TOXENV: docker_dev
farmtest-apache2:
PYTHON_VERSION: 3.7
TOXENV: test-farm-apache2
farmtest-leauto-upgrades:
PYTHON_VERSION: 3.7
TOXENV: test-farm-leauto-upgrades
farmtest-certonly-standalone:
PYTHON_VERSION: 3.7
TOXENV: test-farm-certonly-standalone
farmtest-sdists:
PYTHON_VERSION: 3.7
TOXENV: test-farm-sdists
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml

View File

@@ -0,0 +1,102 @@
jobs:
- job: installer_build
pool:
vmImage: vs2017-win2016
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: 3.7
architecture: x86
addToPath: true
- script: python windows-installer/construct.py
displayName: Build Certbot installer
- task: CopyFiles@2
inputs:
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
contents: '*.exe'
targetFolder: $(Build.ArtifactStagingDirectory)
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: windows-installer
displayName: Publish Windows installer
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
win2016:
imageName: vs2017-win2016
pool:
vmImage: $(imageName)
steps:
- powershell: |
if ($PSVersionTable.PSVersion.Major -ne 5) {
throw "Powershell version is not 5.x"
}
condition: eq(variables['imageName'], 'vs2017-win2016')
displayName: Check Powershell 5.x is used in vs2017-win2016
- task: UsePythonVersion@0
inputs:
versionSpec: 3.8
addToPath: true
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: |
py -3 -m venv venv
venv\Scripts\python tools\pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe
displayName: Run windows installer integration tests
- script: |
set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
displayName: Run certbot integration tests
- job: snap_build
strategy:
matrix:
amd64:
ARCH: amd64
arm64:
ARCH: arm64
armhf:
ARCH: armhf
pool:
vmImage: ubuntu-18.04
steps:
- script: |
snap/local/build.sh ${ARCH}
mv *.snap $(Build.ArtifactStagingDirectory)
displayName: Build Certbot snap
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: snap-$(arch)
displayName: Store snap artifact
- job: snap_run
dependsOn: snap_build
pool:
vmImage: ubuntu-18.04
steps:
- script: |
sudo apt-get update
sudo apt-get install -y --no-install-recommends nginx-light snapd
python tools/pip_install.py -U tox
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snap-amd64
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snap
- script: |
sudo snap install --dangerous --classic snap/*.snap
displayName: Install Certbot snap
- script: |
python -m tox -e integration-external,apacheconftest-external-with-pebble
displayName: Run tox

View File

@@ -0,0 +1,73 @@
jobs:
- job: test
strategy:
matrix:
macos-py27:
IMAGE_NAME: macOS-10.14
PYTHON_VERSION: 2.7
TOXENV: py27
macos-py38:
IMAGE_NAME: macOS-10.14
PYTHON_VERSION: 3.8
TOXENV: py38
windows-py35:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.5
TOXENV: py35
windows-py37-cover:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.7
TOXENV: py37-cover
windows-integration-certbot:
IMAGE_NAME: vs2017-win2016
PYTHON_VERSION: 3.7
TOXENV: integration-certbot
linux-oldest-tests-1:
IMAGE_NAME: ubuntu-18.04
TOXENV: py27-{acme,apache,apache-v2,certbot}-oldest
linux-oldest-tests-2:
IMAGE_NAME: ubuntu-18.04
TOXENV: py27-{dns,nginx}-oldest
linux-py27:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: py27
linux-py35:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.5
TOXENV: py35
linux-py38-cover:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.8
TOXENV: py38-cover
linux-py37-lint:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.7
TOXENV: lint
linux-py35-mypy:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 3.5
TOXENV: mypy
linux-integration:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: integration
ACME_SERVER: pebble
apache-compat:
IMAGE_NAME: ubuntu-18.04
TOXENV: apache_compat
le-auto-xenial:
IMAGE_NAME: ubuntu-18.04
TOXENV: le_auto_xenial
apacheconftest:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: apacheconftest-with-pebble
nginxroundtrip:
IMAGE_NAME: ubuntu-18.04
PYTHON_VERSION: 2.7
TOXENV: nginxroundtrip
pool:
vmImage: $(IMAGE_NAME)
steps:
- template: ../steps/tox-steps.yml

View File

@@ -0,0 +1,16 @@
stages:
- stage: Changelog
jobs:
- job: prepare
pool:
vmImage: vs2017-win2016
steps:
- bash: |
CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
"${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
displayName: Prepare changelog
- task: PublishPipelineArtifact@1
inputs:
path: $(Build.ArtifactStagingDirectory)
artifact: changelog
displayName: Publish changelog

View File

@@ -0,0 +1,43 @@
stages:
- stage: Deploy
jobs:
# This job relies on a snapcraft.cfg preconfigured with credential,
# stored as a secure file in Azure Pipeline.
# This credential has a maximum lifetime of 1 year and the current
# credential will expire on 6/25/2021. The content of snapcraft.cfg
# will need to be updated to use a new credential before then to
# prevent automated deploys from breaking. Remembering to do this is
# also tracked by https://github.com/certbot/certbot/issues/7931.
- job: publish_snap
strategy:
matrix:
amd64:
ARCH: amd64
arm64:
ARCH: arm64
armhf:
ARCH: armhf
pool:
vmImage: ubuntu-18.04
variables:
- group: certbot-common
steps:
- bash: |
sudo apt-get update
sudo apt-get install -y --no-install-recommends snapd
sudo snap install --classic snapcraft
displayName: Install dependencies
- task: DownloadPipelineArtifact@2
inputs:
artifact: snap-$(arch)
path: $(Build.SourcesDirectory)/snap
displayName: Retrieve Certbot snap
- task: DownloadSecureFile@1
name: snapcraftCfg
inputs:
secureFile: snapcraft.cfg
- bash: |
mkdir -p .snapcraft
ln -s $(snapcraftCfg.secureFilePath) .snapcraft/snapcraft.cfg
snapcraft push --release=edge snap/*.snap
displayName: Publish to Snap store

View File

@@ -0,0 +1,18 @@
stages:
- stage: On_Failure
jobs:
- job: notify_mattermost
variables:
- group: certbot-common
pool:
vmImage: ubuntu-latest
steps:
- bash: |
MESSAGE="\
---\n\
##### Azure Pipeline
*Repo* $(Build.Repository.ID) - *Pipeline* $(Build.DefinitionName) #$(Build.BuildNumber) - *Branch/PR* $(Build.SourceBranchName)\n\
:warning: __Pipeline has failed__: [Link to the build](https://dev.azure.com/$(Build.Repository.ID)/_build/results?buildId=$(Build.BuildId)&view=results)\n\n\
---"
curl -i -X POST --data-urlencode "payload={\"text\":\"${MESSAGE}\"}" "$(MATTERMOST_URL)"
condition: failed()
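
For local testing of the notification step above, a rough Python equivalent of the curl call might look like the sketch below; it posts a form-encoded `payload` field to a Mattermost incoming webhook, with `MATTERMOST_URL` read from the environment (mirroring the pipeline variable) and a made-up message text.

```python
# Rough local-testing sketch of the notification step above: post a JSON "text"
# payload to a Mattermost incoming webhook, form-encoded like the curl command.
# MATTERMOST_URL must be set in the environment; the message text is invented.
import json
import os

import requests

message = ":warning: __Pipeline has failed__: see the Azure Pipelines build results"
response = requests.post(os.environ["MATTERMOST_URL"],
                         data={"payload": json.dumps({"text": message})})
response.raise_for_status()
```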

View File

@@ -0,0 +1,6 @@
stages:
- stage: TestAndPackage
jobs:
- template: ../jobs/standard-tests-jobs.yml
- template: ../jobs/extended-tests-jobs.yml
- template: ../jobs/packaging-jobs.yml

View File

@@ -0,0 +1,53 @@
steps:
- bash: |
brew install augeas
condition: startswith(variables['IMAGE_NAME'], 'macOS')
displayName: Install MacOS dependencies
- bash: |
sudo apt-get update
sudo apt-get install -y --no-install-recommends \
python-dev \
gcc \
libaugeas0 \
libssl-dev \
libffi-dev \
ca-certificates \
nginx-light \
openssl
sudo systemctl stop nginx
condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
displayName: Install Linux dependencies
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
condition: ne(variables['PYTHON_VERSION'], '')
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv. The option "-I" is set so when CERTBOT_NO_PIN is also
# set, pip updates dependencies it thinks are already satisfied to avoid some
# problems with its lack of real dependency resolution.
- bash: |
python tools/pip_install.py -I tox virtualenv
displayName: Install runtime dependencies
- task: DownloadSecureFile@1
name: testFarmPem
inputs:
secureFile: azure-test-farm.pem
condition: contains(variables['TOXENV'], 'test-farm')
- bash: |
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
env
if [[ "${TOXENV}" == *"oldest"* ]]; then
tools/run_oldest_tests.sh
else
python -m tox
fi
env:
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
AWS_EC2_PEM_FILE: $(testFarmPem.secureFilePath)
displayName: Run tox

View File

@@ -1,38 +0,0 @@
jobs:
- job: test
pool:
vmImage: vs2017-win2016
strategy:
matrix:
py35:
PYTHON_VERSION: 3.5
TOXENV: py35
py37-cover:
PYTHON_VERSION: 3.7
TOXENV: py37-cover
integration-certbot:
PYTHON_VERSION: 3.7
TOXENV: integration-certbot
PYTEST_ADDOPTS: --numprocesses 4
variables:
- group: certbot-common
steps:
- task: UsePythonVersion@0
inputs:
versionSpec: $(PYTHON_VERSION)
addToPath: true
- script: python tools/pip_install.py -U tox coverage
displayName: Install dependencies
- script: python -m tox
displayName: Run tox
# We do not require codecov report upload to succeed. So to avoid to break the pipeline if
# something goes wrong, each command is suffixed with a command that hides any non zero exit
# codes and echoes an informative message instead.
- bash: |
curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
env:
CODECOV_TOKEN: $(codecov_token)
displayName: Publish coverage

View File

@@ -1,18 +0,0 @@
coverage:
status:
project:
default: off
linux:
flags: linux
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
threshold: 0.1
base: auto
windows:
flags: windows
# Fixed target instead of auto set by #7173, can
# be removed when flags in Codecov are added back.
target: 97.4
threshold: 0.1
base: auto

.gitignore (vendored), 11 lines changed
View File

@@ -26,6 +26,7 @@ tags
\#*#
.idea
.ropeproject
.vscode
# auth --cert-path --chain-path
/*.pem
@@ -34,6 +35,7 @@ tags
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/
tests/letstest/venv3/
.venv
@@ -49,3 +51,12 @@ tests/letstest/venv/
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
# snap files
.snapcraft
parts
prime
stage
*.snap
snap-constraints.txt
qemu-*

View File

@@ -1,309 +0,0 @@
language: python
dist: xenial
cache:
directories:
- $HOME/.cache/pip
before_script:
- 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
# On Travis, the fastest parallelization for integration tests has proved to be 4.
- 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
# Use Travis retry feature for farm tests since they are flaky
- 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
- export TOX_TESTENV_PASSENV=TRAVIS
# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds turnaround time on review since there
# is a cap of on the number of simultaneous runs.
branches:
only:
# apache-parser-v2 is a temporary branch for doing work related to
# rewriting the parser in the Apache plugin.
- apache-parser-v2
- master
- /^\d+\.\d+\.x$/
- /^test-.*$/
# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
not-on-master: &not-on-master
if: NOT (type = push AND branch = master)
# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
matrix:
include:
# Main test suite
- python: "2.7"
env: ACME_SERVER=pebble TOXENV=integration
<<: *not-on-master
# This job is always executed, including on master
- python: "2.7"
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
- python: "3.7"
env: TOXENV=lint
<<: *not-on-master
- python: "3.5"
env: TOXENV=mypy
<<: *not-on-master
- python: "2.7"
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
env: TOXENV='py27-{acme,apache,apache-v2,certbot,dns,nginx}-oldest'
<<: *not-on-master
- python: "3.4"
env: TOXENV=py34
<<: *not-on-master
- python: "3.7"
env: TOXENV=py37
<<: *not-on-master
- python: "3.8"
env: TOXENV=py38
<<: *not-on-master
- sudo: required
env: TOXENV=apache_compat
services: docker
before_install:
addons:
<<: *not-on-master
- sudo: required
env: TOXENV=le_auto_xenial
services: docker
<<: *not-on-master
- python: "2.7"
env: TOXENV=apacheconftest-with-pebble
<<: *not-on-master
- python: "2.7"
env: TOXENV=nginxroundtrip
<<: *not-on-master
# Extended test suite on cron jobs and pushes to tested branches other than master
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-apache2
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-leauto-upgrades
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
git:
depth: false # This is needed to have the history to checkout old versions of certbot-auto.
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-certonly-standalone
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "2.7"
env:
- TOXENV=travis-test-farm-sdists
- secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
<<: *extended-test-suite
- python: "3.7"
env: TOXENV=py37 CERTBOT_NO_PIN=1
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "2.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
# Ubuntu Trusty or older must be used because the oldest version of
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: TOXENV=py34
<<: *extended-test-suite
- python: "3.5"
env: TOXENV=py35
<<: *extended-test-suite
- python: "3.6"
env: TOXENV=py36
<<: *extended-test-suite
- python: "3.7"
env: TOXENV=py37
<<: *extended-test-suite
- python: "3.8"
env: TOXENV=py38
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.4"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.5"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.6"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
env: ACME_SERVER=boulder-v1 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.7"
env: ACME_SERVER=boulder-v2 TOXENV=integration
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.8"
env: ACME_SERVER=boulder-v1 TOXENV=integration
<<: *extended-test-suite
- python: "3.8"
env: ACME_SERVER=boulder-v2 TOXENV=integration
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_jessie
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=le_auto_centos6
services: docker
<<: *extended-test-suite
- sudo: required
env: TOXENV=docker_dev
services: docker
addons:
apt:
packages: # don't install nginx and apache
- libaugeas0
<<: *extended-test-suite
- language: generic
env: TOXENV=py27
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
- augeas
- python2
<<: *extended-test-suite
- language: generic
env: TOXENV=py3
os: osx
# Using this osx_image is a workaround for
# https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
osx_image: xcode10.2
addons:
homebrew:
packages:
- augeas
- python3
<<: *extended-test-suite
# container-based infrastructure
sudo: false
addons:
apt:
packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
- python-dev
- gcc
- libaugeas0
- libssl-dev
- libffi-dev
- ca-certificates
# For certbot-nginx integration testing
- nginx-light
- openssl
# tools/pip_install.py is used to pin packages to a known working version
# except in tests where the environment variable CERTBOT_NO_PIN is set.
# virtualenv is listed here explicitly to make sure it is upgraded when
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
# version of virtualenv.
install: 'tools/pip_install.py -U codecov tox virtualenv'
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
# script command. It is set only to `travis_retry` during farm tests, in
# order to trigger the Travis retry feature, and compensate the inherent
# flakiness of these specific tests.
script: '$TRAVIS_RETRY tox'
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
notifications:
email: false
irc:
channels:
# This is set to a secure variable to prevent forks from sending
# notifications. This value was created by installing
# https://github.com/travis-ci/travis.rb and running
# `travis encrypt "chat.freenode.net#certbot-devel"`.
- secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
on_cancel: never
on_success: never
on_failure: always

View File

@@ -21,6 +21,7 @@ Authors
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [April King](https://github.com/april)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
@@ -35,7 +36,9 @@ Authors
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/BKreisel)
* [Brian Heim](https://github.com/brianlheim)
* [Cameron Steel](https://github.com/Tugzrida)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
@@ -82,6 +85,7 @@ Authors
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Florian Klink](https://github.com/flokli)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
@@ -100,7 +104,9 @@ Authors
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Hugo van Kemenade](https://github.com/hugovk)
* [Ingolf Becker](https://github.com/watercrossing)
* [Ivan Nejgebauer](https://github.com/inejge)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
@@ -124,6 +130,7 @@ Authors
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh McCullough](https://github.com/JoshMcCullough)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
@@ -197,6 +204,7 @@ Authors
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Rasesh Patel](https://github.com/raspat1)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
@@ -232,6 +240,7 @@ Authors
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [taixx046](https://github.com/taixx046)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
@@ -263,5 +272,6 @@ Authors
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Yuseong Cho](https://github.com/g6123)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)

View File

@@ -1,15 +1,22 @@
"""ACME Identifier Validation Challenges."""
import abc
import codecs
import functools
import hashlib
import logging
import socket
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
import requests
import six
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from OpenSSL import crypto
from acme import crypto_util
from acme import errors
from acme import fields
from acme.mixins import ResourceMixin, TypeMixin
logger = logging.getLogger(__name__)
@@ -28,7 +35,7 @@ class Challenge(jose.TypedJSONObjectWithFields):
return UnrecognizedChallenge.from_json(jobj)
class ChallengeResponse(jose.TypedJSONObjectWithFields):
class ChallengeResponse(ResourceMixin, TypeMixin, jose.TypedJSONObjectWithFields):
# _fields_to_partial_json
"""ACME challenge response."""
TYPES = {} # type: dict
@@ -303,7 +310,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
uri = chall.uri(domain)
logger.debug("Verifying %s at %s...", chall.typ, uri)
try:
http_response = requests.get(uri)
http_response = requests.get(uri, verify=False)
except requests.exceptions.RequestException as error:
logger.error("Unable to reach %s: %s", uri, error)
return False
@@ -362,29 +369,163 @@ class HTTP01(KeyAuthorizationChallenge):
@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
"""ACME TLS-ALPN-01 challenge response.
This class only allows initiating a TLS-ALPN-01 challenge returned from the
CA. Full support for responding to TLS-ALPN-01 challenges by generating and
serving the expected response certificate is not currently provided.
"""
"""ACME tls-alpn-01 challenge response."""
typ = "tls-alpn-01"
PORT = 443
"""Verification port as defined by the protocol.
@Challenge.register
You can override it (e.g. for testing) by passing ``port`` to
`simple_verify`.
"""
ID_PE_ACME_IDENTIFIER_V1 = b"1.3.6.1.5.5.7.1.30.1"
ACME_TLS_1_PROTOCOL = "acme-tls/1"
@property
def h(self):
"""Hash value stored in challenge certificate"""
return hashlib.sha256(self.key_authorization.encode('utf-8')).digest()
def gen_cert(self, domain, key=None, bits=2048):
"""Generate tls-alpn-01 certificate.
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey key: Optional private key used in
certificate generation. If not provided (``None``), then
fresh key will be generated.
:param int bits: Number of bits for newly generated key.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
if key is None:
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, bits)
der_value = b"DER:" + codecs.encode(self.h, 'hex')
acme_extension = crypto.X509Extension(self.ID_PE_ACME_IDENTIFIER_V1,
critical=True, value=der_value)
return crypto_util.gen_ss_cert(key, [domain], force_san=True,
extensions=[acme_extension]), key
def probe_cert(self, domain, host=None, port=None):
"""Probe tls-alpn-01 challenge certificate.
:param unicode domain: domain being validated, required.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
"""
if host is None:
host = socket.gethostbyname(domain)
logger.debug('%s resolved to %s', domain, host)
if port is None:
port = self.PORT
return crypto_util.probe_sni(host=host, port=port, name=domain,
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
def verify_cert(self, domain, cert):
"""Verify tls-alpn-01 challenge certificate.
:param unicode domain: Domain name being validated.
:param OpensSSL.crypto.X509 cert: Challenge certificate.
:returns: Whether the certificate was successfully verified.
:rtype: bool
"""
# pylint: disable=protected-access
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
logger.debug('Certificate %s. SANs: %s', cert.digest('sha256'), names)
if len(names) != 1 or names[0].lower() != domain.lower():
return False
for i in range(cert.get_extension_count()):
ext = cert.get_extension(i)
# FIXME: assume this is the ACME extension. Currently there is no
# way to get full OID of an unknown extension from pyopenssl.
if ext.get_short_name() == b'UNDEF':
data = ext.get_data()
return data == self.h
return False
# pylint: disable=too-many-arguments
def simple_verify(self, chall, domain, account_public_key,
cert=None, host=None, port=None):
"""Simple verify.
Verify ``validation`` using ``account_public_key``, optionally
probe tls-alpn-01 certificate and check using `verify_cert`.
:param .challenges.TLSALPN01 chall: Corresponding challenge.
:param str domain: Domain name being validated.
:param JWK account_public_key:
:param OpenSSL.crypto.X509 cert: Optional certificate. If not
provided (``None``) certificate will be retrieved using
`probe_cert`.
:param string host: IP address used to probe the certificate.
:param int port: Port used to probe the certificate.
:returns: ``True`` if and only if client's control of the domain has been verified.
:rtype: bool
"""
if not self.verify(chall, account_public_key):
logger.debug("Verification of key authorization in response failed")
return False
if cert is None:
try:
cert = self.probe_cert(domain=domain, host=host, port=port)
except errors.Error as error:
logger.debug(str(error), exc_info=True)
return False
return self.verify_cert(domain, cert)
@Challenge.register # pylint: disable=too-many-ancestors
class TLSALPN01(KeyAuthorizationChallenge):
"""ACME tls-alpn-01 challenge.
This class simply allows parsing the TLS-ALPN-01 challenge returned from
the CA. Full TLS-ALPN-01 support is not currently provided.
"""
typ = "tls-alpn-01"
"""ACME tls-alpn-01 challenge."""
response_cls = TLSALPN01Response
typ = response_cls.typ
def validation(self, account_key, **kwargs):
"""Generate validation for the challenge."""
raise NotImplementedError()
"""Generate validation.
:param JWK account_key:
:param unicode domain: Domain verified by the challenge.
:param OpenSSL.crypto.PKey cert_key: Optional private key used
in certificate generation. If not provided (``None``), then
fresh key will be generated.
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
"""
return self.response(account_key).gen_cert(
key=kwargs.get('cert_key'),
domain=kwargs.get('domain'))
@staticmethod
def is_supported():
"""
Check if TLS-ALPN-01 challenge is supported on this machine.
This implies that a recent version of OpenSSL is installed (>= 1.0.2),
or a recent cryptography version shipped with the OpenSSL library is installed.
:returns: ``True`` if TLS-ALPN-01 is supported on this machine, ``False`` otherwise.
:rtype: bool
"""
return (hasattr(SSL.Connection, "set_alpn_protos")
and hasattr(SSL.Context, "set_alpn_select_callback"))
@Challenge.register
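
Taken together, the additions above give `acme` client-side support for building and checking tls-alpn-01 challenge certificates. A rough, self-contained sketch of that flow follows; the key authorization string is a placeholder, whereas in a real validation it would come from the challenge token and the account key thumbprint.

```python
# Rough sketch of the new tls-alpn-01 helpers: build a response object from a key
# authorization, generate the challenge certificate, and verify it locally.
# The key authorization value is a placeholder, not a real token/thumbprint pair.
from acme import challenges

response = challenges.TLSALPN01Response(key_authorization=u'token.account-key-thumbprint')
cert, key = response.gen_cert(domain=u'example.com')  # self-signed cert carrying the acmeIdentifier extension
print(response.verify_cert(u'example.com', cert))     # True: single SAN matches and extension holds sha256(key_authorization)
```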

View File

@@ -13,18 +13,20 @@ import josepy as jose
import OpenSSL
import requests
from requests.adapters import HTTPAdapter
from requests.utils import parse_header_links
from requests_toolbelt.adapters.source import SourceAddressAdapter
import six
from six.moves import http_client # pylint: disable=import-error
from six.moves import http_client
from acme import crypto_util
from acme import errors
from acme import jws
from acme import messages
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Text # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict
from acme.magic_typing import List
from acme.magic_typing import Set
from acme.magic_typing import Text
from acme.mixins import VersionedLEACMEMixin
logger = logging.getLogger(__name__)
@@ -36,7 +38,7 @@ if sys.version_info < (2, 7, 9): # pragma: no cover
try:
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
except AttributeError:
import urllib3.contrib.pyopenssl # pylint: disable=import-error
import urllib3.contrib.pyopenssl
urllib3.contrib.pyopenssl.inject_into_urllib3()
DEFAULT_NETWORK_TIMEOUT = 45
@@ -666,7 +668,7 @@ class ClientV2(ClientBase):
response = self._post(self.directory['newOrder'], order)
body = messages.Order.from_json(response.json())
authorizations = []
for url in body.authorizations: # pylint: disable=not-an-iterable
for url in body.authorizations:
authorizations.append(self._authzr_from_response(self._post_as_get(url), uri=url))
return messages.OrderResource(
body=body,
@@ -732,11 +734,13 @@ class ClientV2(ClientBase):
raise errors.ValidationError(failed)
return orderr.update(authorizations=responses)
def finalize_order(self, orderr, deadline):
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:param bool fetch_alternative_chains: whether to also fetch alternative
certificate chains
:returns: finalized order
:rtype: messages.OrderResource
@@ -753,8 +757,13 @@ class ClientV2(ClientBase):
if body.error is not None:
raise errors.IssuanceError(body.error)
if body.certificate is not None:
certificate_response = self._post_as_get(body.certificate).text
return orderr.update(body=body, fullchain_pem=certificate_response)
certificate_response = self._post_as_get(body.certificate)
orderr = orderr.update(body=body, fullchain_pem=certificate_response.text)
if fetch_alternative_chains:
alt_chains_urls = self._get_links(certificate_response, 'alternate')
alt_chains = [self._post_as_get(url).text for url in alt_chains_urls]
orderr = orderr.update(alternative_fullchains_pem=alt_chains)
return orderr
raise errors.TimeoutError()
def revoke(self, cert, rsn):
@@ -784,6 +793,20 @@ class ClientV2(ClientBase):
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)
def _get_links(self, response, relation_type):
"""
Retrieves all Link URIs of relation_type from the response.
:param requests.Response response: The requests HTTP response.
:param str relation_type: The relation type to filter by.
"""
# Can't use response.links directly because it drops multiple links
# of the same relation type, which is possible in RFC8555 responses.
if not 'Link' in response.headers:
return []
links = parse_header_links(response.headers['Link'])
return [l['url'] for l in links
if 'rel' in l and 'url' in l and l['rel'] == relation_type]
class BackwardsCompatibleClientV2(object):
"""ACME client wrapper that tends towards V2-style calls, but
@@ -862,11 +885,13 @@ class BackwardsCompatibleClientV2(object):
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
return self.client.new_order(csr_pem)
def finalize_order(self, orderr, deadline):
def finalize_order(self, orderr, deadline, fetch_alternative_chains=False):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:param bool fetch_alternative_chains: whether to also fetch alternative
certificate chains
:returns: finalized order
:rtype: messages.OrderResource
@@ -897,7 +922,7 @@ class BackwardsCompatibleClientV2(object):
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
return orderr.update(fullchain_pem=(cert + chain))
return self.client.finalize_order(orderr, deadline)
return self.client.finalize_order(orderr, deadline, fetch_alternative_chains)
def revoke(self, cert, rsn):
"""Revoke certificate.
@@ -942,7 +967,7 @@ class ClientNetwork(object):
:param messages.RegistrationResource account: Account object. Required if you are
planning to use .post() with acme_version=2 for anything other than
creating a new account; may be set later after registering.
:param josepy.JWASignature alg: Algoritm to use in signing JWS.
:param josepy.JWASignature alg: Algorithm to use in signing JWS.
:param bool verify_ssl: Whether to verify certificates on SSL connections.
:param str user_agent: String to send as User-Agent header.
:param float timeout: Timeout for requests.
@@ -987,6 +1012,8 @@ class ClientNetwork(object):
:rtype: `josepy.JWS`
"""
if isinstance(obj, VersionedLEACMEMixin):
obj.le_acme_version = acme_version
jobj = obj.json_dumps(indent=2).encode() if obj else b''
logger.debug('JWS payload:\n%s', jobj)
kwargs = {
@@ -1022,6 +1049,9 @@ class ClientNetwork(object):
"""
response_ct = response.headers.get('Content-Type')
# Strip parameters from the media-type (rfc2616#section-3.7)
if response_ct:
response_ct = response_ct.split(';')[0].strip()
try:
# TODO: response.json() is called twice, once here, and
# once in _get and _post clients
@@ -1117,8 +1147,8 @@ class ClientNetwork(object):
debug_content = response.content.decode("utf-8")
logger.debug('Received response:\nHTTP %d\n%s\n\n%s',
response.status_code,
"\n".join(["{0}: {1}".format(k, v)
for k, v in response.headers.items()]),
"\n".join("{0}: {1}".format(k, v)
for k, v in response.headers.items()),
debug_content)
return response
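
The new `_get_links` helper exists because `requests.Response.links` is a dict keyed by relation type, so a response advertising several `rel="alternate"` certificate chains would keep only one of them. The self-contained sketch below shows the parsing it relies on (the Link header value is invented for illustration); `finalize_order(orderr, deadline, fetch_alternative_chains=True)` then performs a POST-as-GET on each of these URLs and stores the bodies in `orderr.alternative_fullchains_pem`.

```python
# Self-contained sketch of the Link-header handling behind _get_links(): requests'
# parse_header_links keeps every rel="alternate" entry, unlike response.links.
# The header value below is invented for illustration.
from requests.utils import parse_header_links

link_header = ('<https://ca.example/chain/1>; rel="alternate", '
               '<https://ca.example/chain/2>; rel="alternate"')
links = parse_header_links(link_header)
alternates = [link['url'] for link in links if link.get('rel') == 'alternate']
print(alternates)   # ['https://ca.example/chain/1', 'https://ca.example/chain/2']
```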

View File

@@ -11,10 +11,9 @@ from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from acme import errors
from acme.magic_typing import Callable # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Optional # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Tuple # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Callable
from acme.magic_typing import Tuple
from acme.magic_typing import Union
logger = logging.getLogger(__name__)
@@ -28,19 +27,41 @@ logger = logging.getLogger(__name__)
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD # type: ignore
class SSLSocket(object):
class _DefaultCertSelection(object):
def __init__(self, certs):
self.certs = certs
def __call__(self, connection):
server_name = connection.get_servername()
return self.certs.get(server_name, None)
class SSLSocket(object): # pylint: disable=too-few-public-methods
"""SSL wrapper for sockets.
:ivar socket sock: Original wrapped socket.
:ivar dict certs: Mapping from domain names (`bytes`) to
`OpenSSL.crypto.X509`.
:ivar method: See `OpenSSL.SSL.Context` for allowed values.
:ivar alpn_selection: Hook to select negotiated ALPN protocol for
connection.
:ivar cert_selection: Hook to select certificate for connection. If given,
`certs` parameter would be ignored, and therefore must be empty.
"""
def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
def __init__(self, sock, certs=None,
method=_DEFAULT_SSL_METHOD, alpn_selection=None,
cert_selection=None):
self.sock = sock
self.certs = certs
self.alpn_selection = alpn_selection
self.method = method
if not cert_selection and not certs:
raise ValueError("Neither cert_selection or certs specified.")
if cert_selection and certs:
raise ValueError("Both cert_selection and certs specified.")
if cert_selection is None:
cert_selection = _DefaultCertSelection(certs)
self.cert_selection = cert_selection
def __getattr__(self, name):
return getattr(self.sock, name)
@@ -57,24 +78,25 @@ class SSLSocket(object):
:type connection: :class:`OpenSSL.Connection`
"""
server_name = connection.get_servername()
try:
key, cert = self.certs[server_name]
except KeyError:
logger.debug("Server name (%s) not recognized, dropping SSL",
server_name)
pair = self.cert_selection(connection)
if pair is None:
logger.debug("Certificate selection for server name %s failed, dropping SSL",
connection.get_servername())
return
key, cert = pair
new_context = SSL.Context(self.method)
new_context.set_options(SSL.OP_NO_SSLv2)
new_context.set_options(SSL.OP_NO_SSLv3)
new_context.use_privatekey(key)
new_context.use_certificate(cert)
if self.alpn_selection is not None:
new_context.set_alpn_select_callback(self.alpn_selection)
connection.set_context(new_context)
class FakeConnection(object):
"""Fake OpenSSL.SSL.Connection."""
# pylint: disable=missing-docstring
# pylint: disable=missing-function-docstring
def __init__(self, connection):
self._wrapped = connection
@@ -86,13 +108,15 @@ class SSLSocket(object):
# OpenSSL.SSL.Connection.shutdown doesn't accept any args
return self._wrapped.shutdown()
def accept(self): # pylint: disable=missing-docstring
def accept(self): # pylint: disable=missing-function-docstring
sock, addr = self.sock.accept()
context = SSL.Context(self.method)
context.set_options(SSL.OP_NO_SSLv2)
context.set_options(SSL.OP_NO_SSLv3)
context.set_tlsext_servername_callback(self._pick_certificate_cb)
if self.alpn_selection is not None:
context.set_alpn_select_callback(self.alpn_selection)
ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
ssl_sock.set_accept_state()
@@ -108,8 +132,9 @@ class SSLSocket(object):
return ssl_sock, addr
def probe_sni(name, host, port=443, timeout=300,
method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
def probe_sni(name, host, port=443, timeout=300, # pylint: disable=too-many-arguments
method=_DEFAULT_SSL_METHOD, source_address=('', 0),
alpn_protocols=None):
"""Probe SNI server for SSL certificate.
:param bytes name: Byte string to send as the server name in the
@@ -121,6 +146,8 @@ def probe_sni(name, host, port=443, timeout=300,
:param tuple source_address: Enables multi-path probing (selection
of source interface). See `socket.creation_connection` for more
info. Available only in Python 2.7+.
:param alpn_protocols: Protocols to request using ALPN.
:type alpn_protocols: `list` of `bytes`
:raises acme.errors.Error: In case of any problems.
@@ -150,6 +177,8 @@ def probe_sni(name, host, port=443, timeout=300,
client_ssl = SSL.Connection(context, client)
client_ssl.set_connect_state()
client_ssl.set_tlsext_host_name(name) # pyOpenSSL>=0.13
if alpn_protocols is not None:
client_ssl.set_alpn_protos(alpn_protocols)
try:
client_ssl.do_handshake()
client_ssl.shutdown()
@@ -240,12 +269,14 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
def gen_ss_cert(key, domains, not_before=None,
validity=(7 * 24 * 60 * 60), force_san=True):
validity=(7 * 24 * 60 * 60), force_san=True, extensions=None):
"""Generate new self-signed certificate.
:type domains: `list` of `unicode`
:param OpenSSL.crypto.PKey key:
:param bool force_san:
:param extensions: List of additional extensions to include in the cert.
:type extensions: `list` of `OpenSSL.crypto.X509Extension`
If more than one domain is provided, all of the domains are put into
``subjectAltName`` X.509 extension and first domain is set as the
@@ -258,10 +289,13 @@ def gen_ss_cert(key, domains, not_before=None,
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
extensions = [
if extensions is None:
extensions = []
extensions.append(
crypto.X509Extension(
b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
]
)
cert.get_subject().CN = domains[0]
# TODO: what to put into cert.get_subject()?
@@ -298,7 +332,6 @@ def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
def _dump_cert(cert):
if isinstance(cert, jose.ComparableX509):
# pylint: disable=protected-access
cert = cert.wrapped
return crypto.dump_certificate(filetype, cert)
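
A rough sketch of how the reworked `SSLSocket` hooks fit together is shown below; the key and certificate are generated on the fly purely for illustration, and the socket is never bound or served.

```python
# Rough sketch of the new cert_selection/alpn_selection hooks on SSLSocket.
# The key/cert pair is generated ad hoc; a real server would use prepared
# challenge certificates and would bind and serve the wrapped socket.
import socket

from OpenSSL import crypto

from acme import crypto_util

key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
cert = crypto_util.gen_ss_cert(key, [u'example.com'])

def cert_selection(connection):
    # Return the (key, cert) pair to serve, or None to drop the handshake.
    if connection.get_servername() == b'example.com':
        return key, cert
    return None

def alpn_selection(connection, alpn_protos):
    # Accept only acme-tls/1; returning b"" terminates the handshake.
    return b'acme-tls/1' if b'acme-tls/1' in alpn_protos else b''

ssl_sock = crypto_util.SSLSocket(socket.socket(),
                                 cert_selection=cert_selection,
                                 alpn_selection=alpn_selection)
```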

View File

@@ -15,7 +15,7 @@ class Header(jose.Header):
url = jose.Field('url', omitempty=True)
@nonce.decoder
def nonce(value): # pylint: disable=missing-docstring,no-self-argument
def nonce(value): # pylint: disable=no-self-argument,missing-function-docstring
try:
return jose.decode_b64jose(value)
except jose.DeserializationError as error:

View File

@@ -10,8 +10,6 @@ class TypingClass(object):
try:
# mypy doesn't respect modifying sys.modules
from typing import * # pylint: disable=wildcard-import, unused-wildcard-import
# pylint: disable=unused-import
from typing import Collection, IO # type: ignore
# pylint: enable=unused-import
except ImportError:
sys.modules[__name__] = TypingClass()

View File

@@ -9,9 +9,10 @@ from acme import errors
from acme import fields
from acme import jws
from acme import util
from acme.mixins import ResourceMixin
try:
from collections.abc import Hashable # pylint: disable=no-name-in-module
from collections.abc import Hashable
except ImportError: # pragma: no cover
from collections import Hashable
@@ -36,7 +37,7 @@ ERROR_CODES = {
' domain'),
'dns': 'There was a problem with a DNS query during identifier validation',
'dnssec': 'The server could not validate a DNSSEC signed domain',
'incorrectResponse': 'Response recieved didn\'t match the challenge\'s requirements',
'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
# deprecate invalidEmail
'invalidEmail': 'The provided email for a registration was invalid',
'invalidContact': 'The provided contact URI was invalid',
@@ -245,13 +246,13 @@ class Directory(jose.JSONDeSerializable):
try:
return self[name.replace('_', '-')]
except KeyError as error:
raise AttributeError(str(error) + ': ' + name)
raise AttributeError(str(error))
def __getitem__(self, name):
try:
return self._jobj[self._canon_key(name)]
except KeyError:
raise KeyError('Directory field not found')
raise KeyError('Directory field "' + self._canon_key(name) + '" not found')
def to_partial_json(self):
return self._jobj
@@ -356,13 +357,13 @@ class Registration(ResourceBody):
@Directory.register
class NewRegistration(Registration):
class NewRegistration(ResourceMixin, Registration):
"""New registration."""
resource_type = 'new-reg'
resource = fields.Resource(resource_type)
class UpdateRegistration(Registration):
class UpdateRegistration(ResourceMixin, Registration):
"""Update registration."""
resource_type = 'reg'
resource = fields.Resource(resource_type)
@@ -460,7 +461,6 @@ class ChallengeResource(Resource):
@property
def uri(self):
"""The URL of the challenge body."""
# pylint: disable=function-redefined,no-member
return self.body.uri
@@ -488,7 +488,7 @@ class Authorization(ResourceBody):
wildcard = jose.Field('wildcard', omitempty=True)
@challenges.decoder
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
def challenges(value): # pylint: disable=no-self-argument,missing-function-docstring
return tuple(ChallengeBody.from_json(chall) for chall in value)
@property
@@ -499,13 +499,13 @@ class Authorization(ResourceBody):
@Directory.register
class NewAuthorization(Authorization):
class NewAuthorization(ResourceMixin, Authorization):
"""New authorization."""
resource_type = 'new-authz'
resource = fields.Resource(resource_type)
class UpdateAuthorization(Authorization):
class UpdateAuthorization(ResourceMixin, Authorization):
"""Update authorization."""
resource_type = 'authz'
resource = fields.Resource(resource_type)
@@ -523,7 +523,7 @@ class AuthorizationResource(ResourceWithURI):
@Directory.register
class CertificateRequest(jose.JSONObjectWithFields):
class CertificateRequest(ResourceMixin, jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar josepy.util.ComparableX509 csr:
@@ -549,7 +549,7 @@ class CertificateResource(ResourceWithURI):
@Directory.register
class Revocation(jose.JSONObjectWithFields):
class Revocation(ResourceMixin, jose.JSONObjectWithFields):
"""Revocation message.
:ivar .ComparableX509 certificate: `OpenSSL.crypto.X509` wrapped in
@@ -566,9 +566,11 @@ class Revocation(jose.JSONObjectWithFields):
class Order(ResourceBody):
"""Order Resource Body.
:ivar list of .Identifier: List of identifiers for the certificate.
:ivar identifiers: List of identifiers for the certificate.
:vartype identifiers: `list` of `.Identifier`
:ivar acme.messages.Status status:
:ivar list of str authorizations: URLs of authorizations.
:ivar authorizations: URLs of authorizations.
:vartype authorizations: `list` of `str`
:ivar str certificate: URL to download certificate as a fullchain PEM.
:ivar str finalize: URL to POST to to request issuance once all
authorizations have "valid" status.
@@ -585,7 +587,7 @@ class Order(ResourceBody):
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
@identifiers.decoder
def identifiers(value): # pylint: disable=missing-docstring,no-self-argument
def identifiers(value): # pylint: disable=no-self-argument,missing-function-docstring
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
@@ -593,15 +595,20 @@ class OrderResource(ResourceWithURI):
:ivar acme.messages.Order body:
:ivar str csr_pem: The CSR this Order will be finalized with.
:ivar list of acme.messages.AuthorizationResource authorizations:
Fully-fetched AuthorizationResource objects.
:ivar authorizations: Fully-fetched AuthorizationResource objects.
:vartype authorizations: `list` of `acme.messages.AuthorizationResource`
:ivar str fullchain_pem: The fetched contents of the certificate URL
produced once the order was finalized, if it's present.
:ivar alternative_fullchains_pem: The fetched contents of alternative certificate
chain URLs produced once the order was finalized, if present and requested during
finalization.
:vartype alternative_fullchains_pem: `list` of `str`
"""
body = jose.Field('body', decoder=Order.from_json)
csr_pem = jose.Field('csr_pem', omitempty=True)
authorizations = jose.Field('authorizations')
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
alternative_fullchains_pem = jose.Field('alternative_fullchains_pem', omitempty=True)
@Directory.register
class NewOrder(Order):

acme/acme/mixins.py (new file, 65 lines)
View File

@@ -0,0 +1,65 @@
"""Useful mixins for Challenge and Resource objects"""
class VersionedLEACMEMixin(object):
"""This mixin stores the version of Let's Encrypt's endpoint being used."""
@property
def le_acme_version(self):
"""Define the version of ACME protocol to use"""
return getattr(self, '_le_acme_version', 1)
@le_acme_version.setter
def le_acme_version(self, version):
# We need to use object.__setattr__ to not depend on the specific implementation of
# __setattr__ in current class (eg. jose.TypedJSONObjectWithFields raises AttributeError
# for any attempt to set an attribute to make objects immutable).
object.__setattr__(self, '_le_acme_version', version)
def __setattr__(self, key, value):
if key == 'le_acme_version':
# Required for @property to operate properly. See comment above.
object.__setattr__(self, key, value)
else:
super(VersionedLEACMEMixin, self).__setattr__(key, value) # pragma: no cover
class ResourceMixin(VersionedLEACMEMixin):
"""
This mixin generates a RFC8555 compliant JWS payload
by removing the `resource` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'to_partial_json', 'resource')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(ResourceMixin, self),
'fields_to_partial_json', 'resource')
class TypeMixin(VersionedLEACMEMixin):
"""
This mixin allows generation of a RFC8555 compliant JWS payload
by removing the `type` field if needed (eg. ACME v2 protocol).
"""
def to_partial_json(self):
"""See josepy.JSONDeserializable.to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'to_partial_json', 'type')
def fields_to_partial_json(self):
"""See josepy.JSONObjectWithFields.fields_to_partial_json()"""
return _safe_jobj_compliance(super(TypeMixin, self),
'fields_to_partial_json', 'type')
def _safe_jobj_compliance(instance, jobj_method, uncompliant_field):
if hasattr(instance, jobj_method):
jobj = getattr(instance, jobj_method)()
if instance.le_acme_version == 2:
jobj.pop(uncompliant_field, None)
return jobj
raise AttributeError('Method {0}() is not implemented.'.format(jobj_method)) # pragma: no cover
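
A short sketch of what these mixins change in practice, using `NewRegistration` (which now mixes in `ResourceMixin` per the messages.py hunk above): the client sets `le_acme_version` on the payload object, and the serialized payload keeps or drops the legacy `resource` field accordingly.

```python
# Sketch of ResourceMixin/VersionedLEACMEMixin in action: to_partial_json() drops
# the legacy 'resource' field only when the object is marked as an ACME v2
# (RFC 8555) payload. The contact address is a placeholder.
from acme import messages

reg = messages.NewRegistration(contact=('mailto:admin@example.com',))

reg.le_acme_version = 1
print(reg.to_partial_json())   # contains 'resource': 'new-reg' for ACME v1

reg.le_acme_version = 2
print(reg.to_partial_json())   # 'resource' removed for the RFC 8555 payload
```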

View File

@@ -5,19 +5,16 @@ import logging
import socket
import threading
from six.moves import BaseHTTPServer # type: ignore # pylint: disable=import-error
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from six.moves import BaseHTTPServer # type: ignore
from six.moves import http_client
from six.moves import socketserver # type: ignore
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List
logger = logging.getLogger(__name__)
# six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
# pylint: disable=no-init
class TLSServer(socketserver.TCPServer):
"""Generic TLS Server."""
@@ -30,16 +27,22 @@ class TLSServer(socketserver.TCPServer):
self.address_family = socket.AF_INET
self.certs = kwargs.pop("certs", {})
self.method = kwargs.pop(
# pylint: disable=protected-access
"method", crypto_util._DEFAULT_SSL_METHOD)
self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
socketserver.TCPServer.__init__(self, *args, **kwargs)
def _wrap_sock(self):
self.socket = crypto_util.SSLSocket(
self.socket, certs=self.certs, method=self.method)
self.socket, cert_selection=self._cert_selection,
alpn_selection=getattr(self, '_alpn_selection', None),
method=self.method)
def server_bind(self): # pylint: disable=missing-docstring
def _cert_selection(self, connection): # pragma: no cover
"""Callback selecting certificate for connection."""
server_name = connection.get_servername()
return self.certs.get(server_name, None)
def server_bind(self):
self._wrap_sock()
return socketserver.TCPServer.server_bind(self)
@@ -124,6 +127,40 @@ class BaseDualNetworkedServers(object):
self.threads = []
class TLSALPN01Server(TLSServer, ACMEServerMixin):
"""TLSALPN01 Server."""
ACME_TLS_1_PROTOCOL = b"acme-tls/1"
def __init__(self, server_address, certs, challenge_certs, ipv6=False):
TLSServer.__init__(
self, server_address, _BaseRequestHandlerWithLogging, certs=certs,
ipv6=ipv6)
self.challenge_certs = challenge_certs
def _cert_selection(self, connection):
# TODO: We would like to serve challenge cert only if asked for it via
# ALPN. To do this, we need to retrieve the list of protos from client
# hello, but this is currently impossible with openssl [0], and ALPN
# negotiation is done after cert selection.
# Therefore, currently we always return challenge cert, and terminate
# handshake in alpn_selection() if ALPN protos are not what we expect.
# [0] https://github.com/openssl/openssl/issues/4952
server_name = connection.get_servername()
logger.debug("Serving challenge cert for server name %s", server_name)
return self.challenge_certs.get(server_name, None)
def _alpn_selection(self, _connection, alpn_protos):
"""Callback to select alpn protocol."""
if len(alpn_protos) == 1 and alpn_protos[0] == self.ACME_TLS_1_PROTOCOL:
logger.debug("Agreed on %s ALPN", self.ACME_TLS_1_PROTOCOL)
return self.ACME_TLS_1_PROTOCOL
logger.debug("Cannot agree on ALPN proto. Got: %s", str(alpn_protos))
# Explicitly close the connection now, by returning an empty string.
# See https://www.pyopenssl.org/en/stable/api/ssl.html#OpenSSL.SSL.Context.set_alpn_select_callback # pylint: disable=line-too-long
return b""
class HTTPServer(BaseHTTPServer.HTTPServer):
"""Generic HTTP Server."""
@@ -139,10 +176,10 @@ class HTTPServer(BaseHTTPServer.HTTPServer):
class HTTP01Server(HTTPServer, ACMEServerMixin):
"""HTTP01 Server."""
def __init__(self, server_address, resources, ipv6=False):
def __init__(self, server_address, resources, ipv6=False, timeout=30):
HTTPServer.__init__(
self, server_address, HTTP01RequestHandler.partial_init(
simple_http_resources=resources), ipv6=ipv6)
simple_http_resources=resources, timeout=timeout), ipv6=ipv6)
class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
@@ -167,6 +204,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
def __init__(self, *args, **kwargs):
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
self.timeout = kwargs.pop('timeout', 30)
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
def log_message(self, format, *args): # pylint: disable=redefined-builtin
@@ -178,7 +216,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.log_message("Incoming request")
BaseHTTPServer.BaseHTTPRequestHandler.handle(self)
def do_GET(self): # pylint: disable=invalid-name,missing-docstring
def do_GET(self): # pylint: disable=invalid-name,missing-function-docstring
if self.path == "/":
self.handle_index()
elif self.path.startswith("/" + challenges.HTTP01.URI_ROOT_PATH):
@@ -216,7 +254,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
self.path)
@classmethod
def partial_init(cls, simple_http_resources):
def partial_init(cls, simple_http_resources, timeout):
"""Partially initialize this handler.
This is useful because `socketserver.BaseServer` takes
@@ -225,4 +263,18 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
"""
return functools.partial(
cls, simple_http_resources=simple_http_resources)
cls, simple_http_resources=simple_http_resources,
timeout=timeout)
class _BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
"""BaseRequestHandler with logging."""
def log_message(self, format, *args): # pylint: disable=redefined-builtin
"""Log arbitrary message."""
logger.debug("%s - - %s", self.client_address[0], format % args)
def handle(self):
"""Handle request."""
self.log_message("Incoming request")
socketserver.BaseRequestHandler.handle(self)
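A rough usage sketch of the new TLS-ALPN-01 server, mirroring the tests added later in this changeset. `acme.crypto_util.gen_ss_cert` is used here only as a convenient stand-in for real certificate loading, and sharing one key/cert pair between `certs` and `challenge_certs` is purely for brevity:

```
import threading

import OpenSSL

from acme import crypto_util
from acme.standalone import TLSALPN01Server

# Throwaway key/cert pair for the sketch; real callers map SNI names to their own
# (key, cert) tuples and use a distinct pair for challenge_certs.
key = OpenSSL.crypto.PKey()
key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
cert = crypto_util.gen_ss_cert(key, [u"localhost"])
certs = {b"localhost": (key, cert)}

server = TLSALPN01Server(("localhost", 0), certs=certs, challenge_certs=certs)
threading.Thread(target=server.serve_forever).start()

host, port = server.socket.getsockname()[:2]
# Offering acme-tls/1 returns the challenge certificate; any other ALPN protocol
# makes _alpn_selection() return b"" and the handshake is terminated.
probed = crypto_util.probe_sni(b"localhost", host=host, port=port, timeout=5,
                               alpn_protocols=[b"acme-tls/1"])
server.shutdown()
```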

View File

@@ -13,7 +13,6 @@
# serve to show the default.
import os
import shlex
import sys
here = os.path.abspath(os.path.dirname(__file__))
@@ -41,7 +40,7 @@ extensions = [
]
autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance', 'private-members']
autodoc_default_flags = ['show-inheritance']
# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']
@@ -113,7 +112,7 @@ pygments_style = 'sphinx'
#keep_warnings = False
# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = True
todo_include_todos = False
# -- Options for HTML output ----------------------------------------------

View File

@@ -1,10 +1,12 @@
from distutils.version import LooseVersion
import sys
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.1.0.dev0'
version = '1.6.0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
@@ -15,9 +17,8 @@ install_requires = [
# 1.1.0+ is required to avoid the warnings described at
# https://github.com/certbot/josepy/issues/13.
'josepy>=1.1.0',
'mock',
# Connection.set_tlsext_host_name (>=0.13)
'PyOpenSSL>=0.13.1',
# Connection.set_tlsext_host_name (>=0.13) + matching Xenial requirements (>=0.15.1)
'PyOpenSSL>=0.15.1',
'pyrfc3339',
'pytz',
'requests[security]>=2.6.0', # security extras added in 2.4.1
@@ -26,6 +27,15 @@ install_requires = [
'six>=1.9.0', # needed for python_2_unicode_compatible
]
setuptools_known_environment_markers = (LooseVersion(setuptools_version) >= LooseVersion('36.2'))
if setuptools_known_environment_markers:
install_requires.append('mock ; python_version < "3.3"')
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Error, you are trying to build certbot wheels using an old version '
'of setuptools. Version 36.2+ of setuptools is required.')
elif sys.version_info < (3,3):
install_requires.append('mock')
dev_extras = [
'pytest',
'pytest-xdist',
@@ -61,7 +71,7 @@ setup(
author="Certbot Project",
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Intended Audience :: Developers',
@@ -70,7 +80,6 @@ setup(
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',
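Side note on the `mock` hunk above: with setuptools >= 36.2 the dependency is declared through a PEP 508 environment marker and evaluated on the machine doing the install, while the legacy branch decides once at build time. A hypothetical helper summarizing the effective behaviour:

```
import sys

def mock_requirements(markers_supported):
    """Return the `mock` requirement(s) contributed by the logic above (sketch)."""
    if markers_supported:
        # Deferred: pip evaluates the marker wherever the package is installed.
        return ['mock ; python_version < "3.3"']
    if sys.version_info < (3, 3):
        # Immediate: decided by the interpreter running setup.py.
        return ['mock']
    return []
```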

View File

@@ -2,10 +2,16 @@
import unittest
import josepy as jose
import mock
import OpenSSL
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves.urllib import parse as urllib_parse
from acme import errors
import test_util
CERT = test_util.load_comparable_cert('cert.pem')
@@ -181,7 +187,7 @@ class HTTP01ResponseTest(unittest.TestCase):
mock_get.return_value = mock.MagicMock(text=validation)
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_bad_validation(self, mock_get):
@@ -197,7 +203,7 @@ class HTTP01ResponseTest(unittest.TestCase):
HTTP01Response.WHITESPACE_CUTSET))
self.assertTrue(self.response.simple_verify(
self.chall, "local", KEY.public_key()))
mock_get.assert_called_once_with(self.chall.uri("local"))
mock_get.assert_called_once_with(self.chall.uri("local"), verify=False)
@mock.patch("acme.challenges.requests.get")
def test_simple_verify_connection_error(self, mock_get):
@@ -256,30 +262,87 @@ class HTTP01Test(unittest.TestCase):
class TLSALPN01ResponseTest(unittest.TestCase):
def setUp(self):
from acme.challenges import TLSALPN01Response
self.msg = TLSALPN01Response(key_authorization=u'foo')
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(
token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))
self.domain = u'example.com'
self.domain2 = u'example2.com'
self.response = self.chall.response(KEY)
self.jmsg = {
'resource': 'challenge',
'type': 'tls-alpn-01',
'keyAuthorization': u'foo',
'keyAuthorization': self.response.key_authorization,
}
from acme.challenges import TLSALPN01
self.chall = TLSALPN01(token=(b'x' * 16))
self.response = self.chall.response(KEY)
def test_to_partial_json(self):
self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
self.msg.to_partial_json())
self.response.to_partial_json())
def test_from_json(self):
from acme.challenges import TLSALPN01Response
self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))
self.assertEqual(self.response, TLSALPN01Response.from_json(self.jmsg))
def test_from_json_hashable(self):
from acme.challenges import TLSALPN01Response
hash(TLSALPN01Response.from_json(self.jmsg))
def test_gen_verify_cert(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_gen_verify_cert_gen_key(self):
cert, key = self.response.gen_cert(self.domain)
self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
self.assertTrue(self.response.verify_cert(self.domain, cert))
def test_verify_bad_cert(self):
self.assertFalse(self.response.verify_cert(self.domain,
test_util.load_cert('cert.pem')))
def test_verify_bad_domain(self):
key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
cert, key2 = self.response.gen_cert(self.domain, key1)
self.assertEqual(key1, key2)
self.assertFalse(self.response.verify_cert(self.domain2, cert))
def test_simple_verify_bad_key_authorization(self):
key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
self.response.simple_verify(self.chall, "local", key2.public_key())
@mock.patch('acme.challenges.TLSALPN01Response.verify_cert', autospec=True)
def test_simple_verify(self, mock_verify_cert):
mock_verify_cert.return_value = mock.sentinel.verification
self.assertEqual(
mock.sentinel.verification, self.response.simple_verify(
self.chall, self.domain, KEY.public_key(),
cert=mock.sentinel.cert))
mock_verify_cert.assert_called_once_with(
self.response, self.domain, mock.sentinel.cert)
@mock.patch('acme.challenges.socket.gethostbyname')
@mock.patch('acme.challenges.crypto_util.probe_sni')
def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
mock_gethostbyname.return_value = '127.0.0.1'
self.response.probe_cert('foo.com')
mock_gethostbyname.assert_called_once_with('foo.com')
mock_probe_sni.assert_called_once_with(
host='127.0.0.1', port=self.response.PORT, name='foo.com',
alpn_protocols=['acme-tls/1'])
self.response.probe_cert('foo.com', host='8.8.8.8')
mock_probe_sni.assert_called_with(
host='8.8.8.8', port=mock.ANY, name='foo.com',
alpn_protocols=['acme-tls/1'])
@mock.patch('acme.challenges.TLSALPN01Response.probe_cert')
def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
mock_probe_cert.side_effect = errors.Error
self.assertFalse(self.response.simple_verify(
self.chall, self.domain, KEY.public_key()))
class TLSALPN01Test(unittest.TestCase):
@@ -309,8 +372,13 @@ class TLSALPN01Test(unittest.TestCase):
self.assertRaises(
jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
def test_validation(self):
self.assertRaises(NotImplementedError, self.msg.validation, KEY)
@mock.patch('acme.challenges.TLSALPN01Response.gen_cert')
def test_validation(self, mock_gen_cert):
mock_gen_cert.return_value = ('cert', 'key')
self.assertEqual(('cert', 'key'), self.msg.validation(
KEY, cert_key=mock.sentinel.cert_key, domain=mock.sentinel.domain))
mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key,
domain=mock.sentinel.domain)
class DNSTest(unittest.TestCase):
@@ -413,5 +481,18 @@ class DNSResponseTest(unittest.TestCase):
self.msg.check_validation(self.chall, KEY.public_key()))
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_challenge_payload(self):
from acme.challenges import HTTP01Response
challenge_body = HTTP01Response()
challenge_body.le_acme_version = 2
jobj = challenge_body.json_dumps(indent=2).encode()
# RFC8555 states that challenge responses must have an empty payload.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -6,7 +6,10 @@ import json
import unittest
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import OpenSSL
import requests
from six.moves import http_client # pylint: disable=import-error
@@ -15,7 +18,7 @@ from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.mixins import VersionedLEACMEMixin
import messages_test
import test_util
@@ -260,7 +263,7 @@ class BackwardsCompatibleClientV2Test(ClientTestBase):
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.finalize_order(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline, False)
def test_revoke(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
@@ -839,6 +842,32 @@ class ClientV2Test(ClientTestBase):
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_alt_chains(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/',
)
updated_orderr = self.orderr.update(body=updated_order,
fullchain_pem=CERT_SAN_PEM,
alternative_fullchains_pem=[CERT_SAN_PEM,
CERT_SAN_PEM])
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
self.response.headers['Link'] ='<https://example.com/acme/cert/1>;rel="alternate", ' + \
'<https://exaple.com/dir>;rel="index", ' + \
'<https://example.com/acme/cert/2>;title="foo";rel="alternate"'
deadline = datetime.datetime(9999, 9, 9)
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.net.post.assert_any_call('https://example.com/acme/cert/1',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.net.post.assert_any_call('https://example.com/acme/cert/2',
mock.ANY, acme_version=2, new_nonce_url=mock.ANY)
self.assertEqual(resp, updated_orderr)
del self.response.headers['Link']
resp = self.client.finalize_order(self.orderr, deadline, fetch_alternative_chains=True)
self.assertEqual(resp, updated_orderr.update(alternative_fullchains_pem=[]))
def test_revoke(self):
self.client.revoke(messages_test.CERT, self.rsn)
self.net.post.assert_called_once_with(
@@ -886,7 +915,7 @@ class ClientV2Test(ClientTestBase):
self.client.net.get.assert_not_called()
class MockJSONDeSerializable(jose.JSONDeSerializable):
class MockJSONDeSerializable(VersionedLEACMEMixin, jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
@@ -980,6 +1009,35 @@ class ClientNetworkTest(unittest.TestCase):
self.assertEqual(
self.response, self.net._check_response(self.response))
@mock.patch('acme.client.logger')
def test_check_response_ok_ct_with_charset(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'application/json; charset=utf-8'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
try:
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'application/json; charset=utf-8'
)
except AssertionError:
return
raise AssertionError('Expected Content-Type warning ' #pragma: no cover
'to not have been logged')
@mock.patch('acme.client.logger')
def test_check_response_ok_bad_ct(self, mock_logger):
self.response.json.return_value = {}
self.response.headers['Content-Type'] = 'text/plain'
# pylint: disable=protected-access
self.assertEqual(self.response, self.net._check_response(
self.response, content_type='application/json'))
mock_logger.debug.assert_called_with(
'Ignoring wrong Content-Type (%r) for JSON decodable response',
'text/plain'
)
def test_check_response_conflict(self):
self.response.ok = False
self.response.status_code = 409
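Outside the test above, the new keyword argument is intended to be used roughly as follows (hypothetical helper; `client` is assumed to be an `acme.client.ClientV2` and `orderr` a pending order resource):

```
import datetime

def fetch_all_chains(client, orderr):
    """Finalize an order and return the default chain plus any alternates (sketch)."""
    deadline = datetime.datetime.now() + datetime.timedelta(seconds=90)
    finalized = client.finalize_order(orderr, deadline, fetch_alternative_chains=True)
    # alternative_fullchains_pem is empty when the server sends no rel="alternate" links.
    return [finalized.fullchain_pem] + list(finalized.alternative_fullchains_pem)
```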

View File

@@ -11,14 +11,12 @@ import six
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import errors
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
import test_util
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
def setUp(self):
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
key = test_util.load_pyopenssl_private_key('rsa2048_key.pem')
@@ -32,7 +30,8 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
# six.moves.* | pylint: disable=attribute-defined-outside-init,no-init
def server_bind(self): # pylint: disable=missing-docstring
self.socket = SSLSocket(socket.socket(), certs=certs)
self.socket = SSLSocket(socket.socket(),
certs)
socketserver.TCPServer.server_bind(self)
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
@@ -73,6 +72,18 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
socket.setdefaulttimeout(original_timeout)
class SSLSocketTest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket."""
def test_ssl_socket_invalid_arguments(self):
from acme.crypto_util import SSLSocket
with self.assertRaises(ValueError):
_ = SSLSocket(None, {'sni': ('key', 'cert')},
cert_selection=lambda _: None)
with self.assertRaises(ValueError):
_ = SSLSocket(None)
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""

View File

@@ -1,7 +1,10 @@
"""Tests for acme.errors."""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
class BadNonceTest(unittest.TestCase):
@@ -35,7 +38,7 @@ class PollErrorTest(unittest.TestCase):
def setUp(self):
from acme.errors import PollError
self.timeout = PollError(
exhausted=set([mock.sentinel.AR]),
exhausted={mock.sentinel.AR},
updated={})
self.invalid = PollError(exhausted=set(), updated={
mock.sentinel.AR: mock.sentinel.AR2})

View File

@@ -2,7 +2,10 @@
import sys
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
class MagicTypingTest(unittest.TestCase):
@@ -18,7 +21,7 @@ class MagicTypingTest(unittest.TestCase):
sys.modules['typing'] = typing_class_mock
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
from acme.magic_typing import Text
self.assertEqual(Text, text_mock)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing
@@ -31,7 +34,7 @@ class MagicTypingTest(unittest.TestCase):
sys.modules['typing'] = None
if 'acme.magic_typing' in sys.modules:
del sys.modules['acme.magic_typing'] # pragma: no cover
from acme.magic_typing import Text # pylint: disable=no-name-in-module
from acme.magic_typing import Text
self.assertTrue(Text is None)
del sys.modules['acme.magic_typing']
sys.modules['typing'] = temp_typing

View File

@@ -2,10 +2,12 @@
import unittest
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from acme import challenges
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import test_util
CERT = test_util.load_comparable_cert('cert.der')
@@ -453,6 +455,7 @@ class OrderResourceTest(unittest.TestCase):
'authorizations': None,
})
class NewOrderTest(unittest.TestCase):
"""Tests for acme.messages.NewOrder."""
@@ -467,5 +470,18 @@ class NewOrderTest(unittest.TestCase):
})
class JWSPayloadRFC8555Compliant(unittest.TestCase):
"""Test for RFC8555 compliance of JWS generated from resources/challenges"""
def test_message_payload(self):
from acme.messages import NewAuthorization
new_order = NewAuthorization()
new_order.le_acme_version = 2
jobj = new_order.json_dumps(indent=2).encode()
# RFC8555 states that JWS bodies must not have a resource field.
self.assertEqual(jobj, b'{}')
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -4,13 +4,18 @@ import threading
import unittest
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import requests
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import challenges
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme import crypto_util
from acme import errors
import test_util
@@ -83,6 +88,81 @@ class HTTP01ServerTest(unittest.TestCase):
def test_http01_not_found(self):
self.assertFalse(self._test_http01(add=False))
def test_timely_shutdown(self):
from acme.standalone import HTTP01Server
server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
server_thread = threading.Thread(target=server.serve_forever)
server_thread.start()
client = socket.socket()
client.connect(('localhost', server.socket.getsockname()[1]))
stop_thread = threading.Thread(target=server.shutdown)
stop_thread.start()
server_thread.join(5.)
is_hung = server_thread.is_alive()
try:
client.shutdown(socket.SHUT_RDWR)
except: # pragma: no cover, pylint: disable=bare-except
# may raise error because socket could already be closed
pass
self.assertFalse(is_hung, msg='Server shutdown should not be hung')
@unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
class TLSALPN01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSALPN01Server."""
def setUp(self):
self.certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa2048_key.pem'),
test_util.load_cert('rsa2048_cert.pem'),
)}
# Use different certificate for challenge.
self.challenge_certs = {b'localhost': (
test_util.load_pyopenssl_private_key('rsa4096_key.pem'),
test_util.load_cert('rsa4096_cert.pem'),
)}
from acme.standalone import TLSALPN01Server
self.server = TLSALPN01Server(("localhost", 0), certs=self.certs,
challenge_certs=self.challenge_certs)
# pylint: disable=no-member
self.thread = threading.Thread(target=self.server.serve_forever)
self.thread.start()
def tearDown(self):
self.server.shutdown() # pylint: disable=no-member
self.thread.join()
# TODO: This is not implemented yet, see comments in standalone.py
# def test_certs(self):
# host, port = self.server.socket.getsockname()[:2]
# cert = crypto_util.probe_sni(
# b'localhost', host=host, port=port, timeout=1)
# # Expect normal cert when connecting without ALPN.
# self.assertEqual(jose.ComparableX509(cert),
# jose.ComparableX509(self.certs[b'localhost'][1]))
def test_challenge_certs(self):
host, port = self.server.socket.getsockname()[:2]
cert = crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"acme-tls/1"])
# Expect challenge cert when connecting with ALPN.
self.assertEqual(
jose.ComparableX509(cert),
jose.ComparableX509(self.challenge_certs[b'localhost'][1])
)
def test_bad_alpn(self):
host, port = self.server.socket.getsockname()[:2]
with self.assertRaises(errors.Error):
crypto_util.probe_sni(
b'localhost', host=host, port=port, timeout=1,
alpn_protocols=[b"bad-alpn"])
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
@@ -138,7 +218,6 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
def setUp(self):
self.account_key = jose.JWK.load(
test_util.load_vector('rsa1024_key.pem'))

View File

@@ -4,12 +4,14 @@ to use appropriate extension for vector filenames: .pem for PEM and
The following command has been used to generate test keys:
for x in 256 512 1024 2048; do openssl genrsa -out rsa${k}_key.pem $k; done
for k in 256 512 1024 2048 4096; do openssl genrsa -out rsa${k}_key.pem $k; done
and for the CSR:
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -outform DER > csr.der
and for the certificate:
and for the certificates:
openssl req -key rsa2047_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 -outform DER > cert.der
openssl req -key rsa2048_key.pem -new -subj '/CN=example.com' -x509 > rsa2048_cert.pem
openssl req -key rsa1024_key.pem -new -subj '/CN=example.com' -x509 > rsa1024_cert.pem


acme/tests/testdata/rsa1024_cert.pem vendored Normal file
View File

@@ -0,0 +1,13 @@
-----BEGIN CERTIFICATE-----
MIIB/TCCAWagAwIBAgIJAOyRIBs3QT8QMA0GCSqGSIb3DQEBCwUAMBYxFDASBgNV
BAMMC2V4YW1wbGUuY29tMB4XDTE4MDQyMzEwMzE0NFoXDTE4MDUyMzEwMzE0NFow
FjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wgZ8wDQYJKoZIhvcNAQEBBQADgY0AMIGJ
AoGBAJqJ87R8aVwByONxgQA9hwgvQd/QqI1r1UInXhEF2VnEtZGtUWLi100IpIqr
Mq4qusDwNZ3g8cUPtSkvJGs89djoajMDIJP7lQUEKUYnYrI0q755Tr/DgLWSk7iW
l5ezym0VzWUD0/xXUz8yRbNMTjTac80rS5SZk2ja2wWkYlRJAgMBAAGjUzBRMB0G
A1UdDgQWBBSsaX0IVZ4XXwdeffVAbG7gnxSYjTAfBgNVHSMEGDAWgBSsaX0IVZ4X
XwdeffVAbG7gnxSYjTAPBgNVHRMBAf8EBTADAQH/MA0GCSqGSIb3DQEBCwUAA4GB
ADe7SVmvGH2nkwVfONk8TauRUDkePN1CJZKFb2zW1uO9ANJ2v5Arm/OQp0BG/xnI
Djw/aLTNVESF89oe15dkrUErtcaF413MC1Ld5lTCaJLHLGqDKY69e02YwRuxW7jY
qarpt7k7aR5FbcfO5r4V/FK/Gvp4Dmoky8uap7SJIW6x
-----END CERTIFICATE-----

acme/tests/testdata/rsa4096_cert.pem vendored Normal file
View File

@@ -0,0 +1,30 @@
-----BEGIN CERTIFICATE-----
MIIFDTCCAvWgAwIBAgIUImqDrP53V69vFROsjP/gL0YtoA4wDQYJKoZIhvcNAQEL
BQAwFjEUMBIGA1UEAwwLZXhhbXBsZS5jb20wHhcNMjAwNTI3MjMyNDE0WhcNMjAw
NjI2MjMyNDE0WjAWMRQwEgYDVQQDDAtleGFtcGxlLmNvbTCCAiIwDQYJKoZIhvcN
AQEBBQADggIPADCCAgoCggIBANY9LKLk9Dxn0MUMQFHwBoTN4ehDSWBws2KcytpF
mc8m9Mfk1wmb4fQSKYtK3wIFMfIyo9HQu0nKqMkkUw52o3ZXyOv+oWwF5qNy2BKu
lh5OMSkaZ0o13zoPpW42e+IUnyxvg70+0urD+sUue4cyTHh/nBIUjrM/05ZJ/ac8
HR0RK3H41YoqBjq69JjMZczZZhbNFit3s6p0R1TbVAgc3ckqbtX5BDyQMQQCP4Ed
m4DgbAFVqdcPUCC5W3F3fmuQiPKHiADzONZnXpy6lUvLDWqcd6loKp+nKHM6OkXX
8hmD7pE1PYMQo4hqOfhBR2IgMjAShwd5qUFjl1m2oo0Qm3PFXOk6i2ZQdS6AA/yd
B5/mX0RnM2oIdFZPb6UZFSmtEgs9sTzn+hMUyNSZQRE54px1ur1xws2R+vbsCyM5
+KoFVxDjVjU9TlZx3GvDvnqz/tbHjji6l8VHZYOBMBUXbKHu2U6pJFZ5Zp7k68/z
a3Fb9Pjtn3iRkXEyC0N5kLgqO4QTlExnxebV8aMvQpWd/qefnMn9qPYIZPEXSQAR
mEBIahkcACb60s+acG0WFFluwBPtBqEr8Q67XlSF0Ibf4iBiRzpPobhlWta1nrFg
4IWHMSoZ0PE75bhIGBEkhrpcXQCAxXmAfxfjKDH7jdJ1fRdnZ/9+OzwYGVX5GH/l
0QDtAgMBAAGjUzBRMB0GA1UdDgQWBBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAfBgNV
HSMEGDAWgBQh3xiz/o1nEU2ySylZ9gxCXvIPGzAPBgNVHRMBAf8EBTADAQH/MA0G
CSqGSIb3DQEBCwUAA4ICAQAELoXz31oR9pdAwidlv9ZBOKiC7KBWy8VMqXNVkfTn
bVRxAUex7zleLFIOkWnqadsMesU9sIwrbLzBcZ8Q/vBY+z2xOPdXcgcAoAmdKWoq
YBQNiqng9r54sqlzB/77QZCf5fdktESe7NTxhCifgx5SAWq7IUQs/lm3tnMUSAfE
5ctuN6M+w8K54y3WDprcfMHpnc3ZHeSPhVQApHM0h/bDvXq0bRS7kmq27Hb153Qm
nH3TwYB5pPSWW38NbUc+s/a7mItO7S8ly8yGbA0j9c/IbN5lM+OCdk06asz3+c8E
uo8nuCBoYO5+6AqC2N7WJ3Tdr/pFA8jTbd6VNVlgCWTIR8ZosL5Fgkfv+4fUBrHt
zdVUqMUzvga5rvZnwnJ5Qfu/drHeAAo9MTNFQNe2QgDlYfWBh5GweolgmFSwrpkY
v/5wLtIyv/ASHKswybbqMIlpttcLTXjx5yuh8swttT6Wh+FQqqQ32KSRB3StiwyK
oH0ZhrwYHiFYNlPxecGX6XUta6rFtTlEdkBGSnXzgiTzL2l+Nc0as0V5B9RninZG
qJ+VOChSQ0OFvg1riSXv7tMvbLdGQnxwTRL3t6BMS8I4LA2m3ZfWUcuXT783ODTH
16f1Q1AgXd2csstTWO9cv+N/0fpX31nqrm6+CrGduSr2u4HjYYnlLIUhmdTvK3fX
Fg==
-----END CERTIFICATE-----

acme/tests/testdata/rsa4096_key.pem vendored Normal file
View File

@@ -0,0 +1,51 @@
-----BEGIN RSA PRIVATE KEY-----
MIIJKgIBAAKCAgEA1j0souT0PGfQxQxAUfAGhM3h6ENJYHCzYpzK2kWZzyb0x+TX
CZvh9BIpi0rfAgUx8jKj0dC7ScqoySRTDnajdlfI6/6hbAXmo3LYEq6WHk4xKRpn
SjXfOg+lbjZ74hSfLG+DvT7S6sP6xS57hzJMeH+cEhSOsz/Tlkn9pzwdHRErcfjV
iioGOrr0mMxlzNlmFs0WK3ezqnRHVNtUCBzdySpu1fkEPJAxBAI/gR2bgOBsAVWp
1w9QILlbcXd+a5CI8oeIAPM41mdenLqVS8sNapx3qWgqn6coczo6RdfyGYPukTU9
gxCjiGo5+EFHYiAyMBKHB3mpQWOXWbaijRCbc8Vc6TqLZlB1LoAD/J0Hn+ZfRGcz
agh0Vk9vpRkVKa0SCz2xPOf6ExTI1JlBETninHW6vXHCzZH69uwLIzn4qgVXEONW
NT1OVnHca8O+erP+1seOOLqXxUdlg4EwFRdsoe7ZTqkkVnlmnuTrz/NrcVv0+O2f
eJGRcTILQ3mQuCo7hBOUTGfF5tXxoy9ClZ3+p5+cyf2o9ghk8RdJABGYQEhqGRwA
JvrSz5pwbRYUWW7AE+0GoSvxDrteVIXQht/iIGJHOk+huGVa1rWesWDghYcxKhnQ
8TvluEgYESSGulxdAIDFeYB/F+MoMfuN0nV9F2dn/347PBgZVfkYf+XRAO0CAwEA
AQKCAgEA0hZdTkQtCYtYm9LexDsXeWYX8VcCfrMmBj7xYcg9A3oVMmzDPuYBVwH0
gWbjd6y2hOaJ5TfGYZ99kvmvBRDsTSHaoyopC7BhssjtAKz6Ay/0X3VH8usPQ3WS
aZi+NT65tK6KRqtz08ppgLGLa1G00bl5x/Um1rpxeACI4FU/y4BJ1VMJvJpnT3KE
Z86Qyagqx5NH+UpCApZSWPFX3zjHePzGgcfXErjniCHYOnpZQrFQ2KIzkfSvQ9fg
x01ByKOM2CB2C1B33TCzBAioXRH6zyAu7A59NeCK9ywTduhDvie1a+oEryFC7IQW
4s7I/H3MGX4hsf/pLXlHMy+5CZJOjRaC2h+pypfbbcuiXu6Sn64kHNpiI7SxI5DI
MIRjyG7MdUcrzq0Rt8ogwwpbCoRqrl/w3bhxtqmeZaEZtyxbjlm7reK2YkIFDgyz
JMqiJK5ZAi+9L/8c0xhjjAQQ0sIzrjmjA8U+6YnWL9jU5qXTVnBB8XQucyeeZGgk
yRHyMur71qOXN8z3UEva7MHkDTUBlj8DgTz6sEjqCipaWl0CXfDNa4IhHIXD5qiF
wplhq7OeS0v6EGG/UFa3Q/lFntxtrayxJX7uvvSccGzjPKXTjpWUELLi/FdnIsum
eXT3RgIEYozj4BibDXaBLfHTCVzxOr7AAEvKM9XWSUgLA0paSWECggEBAO9ZBeE1
GWzd1ejTTkcxBC9AK2rNsYG8PdNqiof/iTbuJWNeRqpG+KB/0CNIpjZ2X5xZd0tM
FDpHTFehlP26Roxuq50iRAFc+SN5KoiO0A3JuJAidreIgRTia1saUUrypHqWrYEA
VZVj2AI8Pyg3s1OkR2frFskY7hXBVb/pJNDP/m9xTXXIYiIXYkHYe+4RIJCnAxRv
q5YHKaX+0Ull9YCZJCxmwvcHat8sgu8qkiwUMEM6QSNEkrEbdnWYBABvC1AR6sws
7MP1h9+j22n4Zc/3D6kpFZEL9Erx8nNyhbOZ6q2Tdnf6YKVVjZdyVa8VyNnR0ROl
3BjkFaHb/bg4e4kCggEBAOUk8ZJS3qBeGCOjug384zbHGcnhUBYtYJiOz+RXBtP+
PRksbFtTkgk1sHuSGO8YRddU4Qv7Av1xL8o+DEsLBSD0YQ7pmLrR/LK+iDQ5N63O
Fve9uJH0ybxAOkiua7G24+lTsVUP//KWToL4Wh5zbHBBjL5D2Z9zoeVbcE87xhva
lImMVr4Ex252DqNP9wkZxBjudFyJ/C/TnXrjPcgwhxWTC7sLQMhE5p+490G7c4hX
PywkIKrANbu37KDiAvVS+dC66ZgpL/NUDkeloAmGNO08LGzbV6YKchlvDyWU/AvW
0hYjbL0FUq7K/wp1G9fumolB+fbI25K9c13X93STzUUCggEBAJDsNFUyk5yJjbYW
C/WrRj9d+WwH9Az77+uNPSgvn+O0usq6EMuVgYGdImfa21lqv2Wp/kOHY1AOT7lX
yyD+oyzw7dSNJOQ2aVwDR6+72Vof5DLRy1RBwPbmSd61xrc8yD658YCEtU1pUSe5
VvyBDYH9nIbdn8RP5gkiMUusXXBaIFNWJXLFzDWcNxBrhk6V7EPp/EFphFmpKJyr
+AkbRVWCZJbF+hMdWKadCwLJogwyhS6PnVU/dhrq6AU38GRa2Fy5HJRYN1xH1Oej
DX3Su8L6c28Xw0k6FcczTHx+wVoIPkKvYTIwVkiFzt/+iMckx6KsGo5tBSHFKRwC
WlQrTxECggEBALjUruLnY1oZ7AC7bTUhOimSOfQEgTQSUCtebsRxijlvhtsKYTDd
XRt+qidStjgN7S/+8DRYuZWzOeg5WnMhpXZqiOudcyume922IGl3ibjxVsdoyjs5
J4xohlrgDlBgBMDNWGoTqNGFejjcmNydH+gAh8VlN2INxJYbxqCyx17qVgwJHmLR
uggYxD/pHYvCs9GkbknCp5/wYsOgDtKuihfV741lS1D/esN1UEQ+LrfYIEW7snno
5q7Pcdhn1hkKYCWEzy2Ec4Aj2gzixQ9JqOF/OxpnZvCw1k47rg0TeqcWFYnz8x8Y
7xO8/DH0OoxXk2GJzVXJuItJs4gLzzfCjL0CggEAJFHfC9jisdy7CoWiOpNCSF1B
S0/CWDz77cZdlWkpTdaXGGp1MA/UKUFPIH8sOHfvpKS660+X4G/1ZBHmFb4P5kFF
Qy8UyUMKtSOEdZS6KFlRlfSCAMd5aSTmCvq4OSjYEpMRwUhU/iEJNkn9Z1Soehe0
U3dxJ8KiT1071geO6rRquSHoSJs6Y0WQKriYYQJOhh4Axs3PQihER2eyh+WGk8YJ
02m0mMsjntqnXtdc6IcdKaHp9ko+OpM9QZLsvt19fxBcrXj/i21uUXrzuNtKfO6M
JqGhsOrO2dh8lMhvodENvgKA0DmYDC9N7ogo7bxTNSedcjBF46FhJoqii8m70Q==
-----END RSA PRIVATE KEY-----

View File

@@ -1,8 +1,7 @@
include LICENSE.txt
include README.rst
recursive-include tests *
include certbot_apache/_internal/centos-options-ssl-apache.conf
include certbot_apache/_internal/options-ssl-apache.conf
recursive-include certbot_apache/_internal/augeas_lens *.aug
recursive-include certbot_apache/_internal/tls_configs *.conf
global-exclude __pycache__
global-exclude *.py[cod]

View File

@@ -5,6 +5,8 @@ import logging
import re
import subprocess
import pkg_resources
from certbot import errors
from certbot import util
@@ -128,7 +130,7 @@ def included_in_paths(filepath, paths):
:rtype: bool
"""
return any([fnmatch.fnmatch(filepath, path) for path in paths])
return any(fnmatch.fnmatch(filepath, path) for path in paths)
def parse_defines(apachectl):
@@ -142,7 +144,7 @@ def parse_defines(apachectl):
:rtype: dict
"""
variables = dict()
variables = {}
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
@@ -223,7 +225,8 @@ def _get_runtime_cfg(command):
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True)
universal_newlines=True,
env=util.env_no_snap_for_external_calls())
stdout, stderr = proc.communicate()
except (OSError, ValueError):
@@ -241,3 +244,14 @@ def _get_runtime_cfg(command):
"loaded because Apache is misconfigured.")
return stdout
def find_ssl_apache_conf(prefix):
"""
Find a TLS Apache config file in the dedicated storage.
:param str prefix: prefix of the TLS Apache config file to find
:return: the path the TLS Apache config file
:rtype: str
"""
return pkg_resources.resource_filename(
"certbot_apache",
os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))
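For illustration, the new helper just resolves a prefixed file inside the packaged `tls_configs` directory; per the configurator changes below, the `old` variant keeps TLS session tickets enabled for older Apache/OpenSSL combinations:

```
from certbot_apache._internal import apache_util

# Resolves to .../certbot_apache/_internal/tls_configs/current-options-ssl-apache.conf
print(apache_util.find_ssl_apache_conf("current"))
# Variant used when session tickets cannot be disabled.
print(apache_util.find_ssl_apache_conf("old"))
```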

View File

@@ -73,7 +73,7 @@ class ApacheDirectiveNode(ApacheParserNode):
self.metadata == other.metadata)
return False
def set_parameters(self, _parameters):
def set_parameters(self, _parameters): # pragma: no cover
"""Sets the parameters for DirectiveNode"""
return
@@ -97,7 +97,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
self.metadata == other.metadata)
return False
def add_child_block(self, name, parameters=None, position=None): # pylint: disable=unused-argument
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
new_block = ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -107,7 +108,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
self.children += (new_block,)
return new_block
def add_child_directive(self, name, parameters=None, position=None): # pylint: disable=unused-argument
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
new_dir = ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
@@ -144,7 +146,8 @@ class ApacheBlockNode(ApacheDirectiveNode):
filepath=assertions.PASS,
metadata=self.metadata)]
def find_comments(self, comment, exact=False): # pylint: disable=unused-argument
# pylint: disable=unused-argument
def find_comments(self, comment, exact=False): # pragma: no cover
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheCommentNode(comment=assertions.PASS,
ancestor=self,

View File

@@ -132,9 +132,9 @@ def assertEqualPathsList(first, second): # pragma: no cover
Checks that the two lists of file paths match. This assertion allows for wildcard
paths.
"""
if any([isPass(path) for path in first]):
if any(isPass(path) for path in first):
return
if any([isPass(path) for path in second]):
if any(isPass(path) for path in second):
return
for fpath in first:
assert any([fnmatch.fnmatch(fpath, spath) for spath in second])

View File

@@ -64,7 +64,7 @@ Translates over to:
"/files/etc/apache2/apache2.conf/bLoCk[1]",
]
"""
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set
from certbot import errors
from certbot.compat import os
@@ -339,7 +339,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
def find_blocks(self, name, exclude=True):
"""Recursive search of BlockNodes from the sequence of children"""
nodes = list()
nodes = []
paths = self._aug_find_blocks(name)
if exclude:
paths = self.parser.exclude_dirs(paths)
@@ -351,7 +351,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
def find_directives(self, name, exclude=True):
"""Recursive search of DirectiveNodes from the sequence of children"""
nodes = list()
nodes = []
ownpath = self.metadata.get("augeaspath")
directives = self.parser.find_dir(name, start=ownpath, exclude=exclude)
@@ -374,7 +374,7 @@ class AugeasBlockNode(AugeasDirectiveNode):
:param str comment: Comment content to search for.
"""
nodes = list()
nodes = []
ownpath = self.metadata.get("augeaspath")
comments = self.parser.find_comments(comment, start=ownpath)

View File

@@ -1,25 +0,0 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common
#CustomLog /var/log/apache2/access.log vhost_combined
#LogLevel warn
#ErrorLog /var/log/apache2/error.log
# Always ensure Cookies have "Secure" set (JAH 2012/1)
#Header edit Set-Cookie (?i)^(.*)(;\s*secure)??((\s*;)?(.*)) "$1; Secure$3$4"

View File

@@ -1,6 +1,7 @@
"""Apache Configurator."""
# pylint: disable=too-many-lines
from collections import defaultdict
from distutils.version import LooseVersion
import copy
import fnmatch
import logging
@@ -8,17 +9,21 @@ import re
import socket
import time
import pkg_resources
import six
import zope.component
import zope.interface
try:
import apacheconfig
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
from acme import challenges
from acme.magic_typing import DefaultDict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import DefaultDict
from acme.magic_typing import Dict
from acme.magic_typing import List
from acme.magic_typing import Set
from acme.magic_typing import Union
from certbot import errors
from certbot import interfaces
from certbot import util
@@ -110,14 +115,30 @@ class ApacheConfigurator(common.Installer):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None
)
def option(self, key):
"""Get a value from options"""
return self.options.get(key)
def pick_apache_config(self, warn_on_no_mod_ssl=True):
"""
Pick the appropriate TLS Apache configuration file for current version of Apache and OS.
:param bool warn_on_no_mod_ssl: True if we should warn if mod_ssl is not found.
:return: the path to the TLS Apache configuration file to use
:rtype: str
"""
# Disabling TLS session tickets is supported by Apache 2.4.11+ and OpenSSL 1.0.2l+.
# So for old versions of Apache we pick a configuration without this option.
openssl_version = self.openssl_version(warn_on_no_mod_ssl)
if self.version < (2, 4, 11) or not openssl_version or\
LooseVersion(openssl_version) < LooseVersion('1.0.2l'):
return apache_util.find_ssl_apache_conf("old")
return apache_util.find_ssl_apache_conf("current")
def _prepare_options(self):
"""
Set the values possibly changed by command line parameters to
@@ -125,7 +146,7 @@ class ApacheConfigurator(common.Installer):
"""
opts = ["enmod", "dismod", "le_vhost_ext", "server_root", "vhost_root",
"logs_root", "challenge_location", "handle_modules", "handle_sites",
"ctl"]
"ctl", "bin"]
for o in opts:
# Config options use dashes instead of underscores
if self.conf(o.replace("_", "-")) is not None:
@@ -174,6 +195,8 @@ class ApacheConfigurator(common.Installer):
"(Only Ubuntu/Debian currently)")
add("ctl", default=DEFAULTS["ctl"],
help="Full path to Apache control script")
add("bin", default=DEFAULTS["bin"],
help="Full path to apache2/httpd binary")
def __init__(self, *args, **kwargs):
"""Initialize an Apache Configurator.
@@ -184,15 +207,16 @@ class ApacheConfigurator(common.Installer):
"""
version = kwargs.pop("version", None)
use_parsernode = kwargs.pop("use_parsernode", False)
openssl_version = kwargs.pop("openssl_version", None)
super(ApacheConfigurator, self).__init__(*args, **kwargs)
# Add name_server association dict
self.assoc = dict() # type: Dict[str, obj.VirtualHost]
self.assoc = {} # type: Dict[str, obj.VirtualHost]
# Outstanding challenges
self._chall_out = set() # type: Set[KeyAuthorizationAnnotatedChallenge]
# List of vhosts configured per wildcard domain on this run.
# used by deploy_cert() and enhance()
self._wildcard_vhosts = dict() # type: Dict[str, List[obj.VirtualHost]]
self._wildcard_vhosts = {} # type: Dict[str, List[obj.VirtualHost]]
# Maps enhancements to vhosts we've enabled the enhancement for
self._enhanced_vhosts = defaultdict(set) # type: DefaultDict[str, Set[obj.VirtualHost]]
# Temporary state for AutoHSTS enhancement
@@ -209,6 +233,7 @@ class ApacheConfigurator(common.Installer):
self.parser = None
self.parser_root = None
self.version = version
self._openssl_version = openssl_version
self.vhosts = None
self.options = copy.deepcopy(self.OS_DEFAULTS)
self._enhance_func = {"redirect": self._enable_redirect,
@@ -225,6 +250,59 @@ class ApacheConfigurator(common.Installer):
"""Full absolute path to digest of updated SSL configuration file."""
return os.path.join(self.config.config_dir, constants.UPDATED_MOD_SSL_CONF_DIGEST)
def _open_module_file(self, ssl_module_location):
"""Extract the open lines of openssl_version for testing purposes"""
try:
with open(ssl_module_location, mode="rb") as f:
contents = f.read()
except IOError as error:
logger.debug(str(error), exc_info=True)
return None
return contents
def openssl_version(self, warn_on_no_mod_ssl=True):
"""Lazily retrieve openssl version
:param bool warn_on_no_mod_ssl: `True` if we should warn if mod_ssl is not found. Set to
`False` when we know we'll try to enable mod_ssl later. This is currently debian/ubuntu,
when called from `prepare`.
:return: the OpenSSL version as a string, or None.
:rtype: str or None
"""
if self._openssl_version:
return self._openssl_version
# Step 1. Determine the location of ssl_module
try:
ssl_module_location = self.parser.modules['ssl_module']
except KeyError:
if warn_on_no_mod_ssl:
logger.warning("Could not find ssl_module; not disabling session tickets.")
return None
if ssl_module_location:
# Possibility A: ssl_module is a DSO
ssl_module_location = self.parser.standard_path_from_server_root(ssl_module_location)
else:
# Possibility B: ssl_module is statically linked into Apache
if self.option("bin"):
ssl_module_location = self.option("bin")
else:
logger.warning("ssl_module is statically linked but --apache-bin is "
"missing; not disabling session tickets.")
return None
# Step 2. Grep in the binary for openssl version
contents = self._open_module_file(ssl_module_location)
if not contents:
logger.warning("Unable to read ssl_module file; not disabling session tickets.")
return None
# looks like: OpenSSL 1.0.2s 28 May 2019
matches = re.findall(br"OpenSSL ([0-9]\.[^ ]+) ", contents)
if not matches:
logger.warning("Could not find OpenSSL version; not disabling session tickets.")
return None
self._openssl_version = matches[0].decode('UTF-8')
return self._openssl_version
def prepare(self):
"""Prepare the authenticator/installer.
@@ -271,8 +349,12 @@ class ApacheConfigurator(common.Installer):
# Get all of the available vhosts
self.vhosts = self.get_virtual_hosts()
# We may try to enable mod_ssl later. If so, we shouldn't warn if we can't find it now.
# This is currently only true for debian/ubuntu.
warn_on_no_mod_ssl = not self.option("handle_modules")
self.install_ssl_options_conf(self.mod_ssl_conf,
self.updated_mod_ssl_conf_digest)
self.updated_mod_ssl_conf_digest,
warn_on_no_mod_ssl)
# Prevent two Apache plugins from modifying a config at once
try:
@@ -363,11 +445,17 @@ class ApacheConfigurator(common.Installer):
def get_parsernode_root(self, metadata):
"""Initializes the ParserNode parser root instance."""
apache_vars = dict()
apache_vars["defines"] = apache_util.parse_defines(self.option("ctl"))
apache_vars["includes"] = apache_util.parse_includes(self.option("ctl"))
apache_vars["modules"] = apache_util.parse_modules(self.option("ctl"))
metadata["apache_vars"] = apache_vars
if HAS_APACHECONFIG:
apache_vars = {}
apache_vars["defines"] = apache_util.parse_defines(self.option("ctl"))
apache_vars["includes"] = apache_util.parse_includes(self.option("ctl"))
apache_vars["modules"] = apache_util.parse_modules(self.option("ctl"))
metadata["apache_vars"] = apache_vars
with open(self.parser.loc["root"]) as f:
with apacheconfig.make_loader(writable=True,
**apacheconfig.flavors.NATIVE_APACHE) as loader:
metadata["ac_ast"] = loader.loads(f.read())
return dualparser.DualBlockNode(
name=assertions.PASS,
@@ -470,7 +558,7 @@ class ApacheConfigurator(common.Installer):
# Go through the vhosts, making sure that we cover all the names
# present, but preferring the SSL vhosts
filtered_vhosts = dict()
filtered_vhosts = {}
for vhost in vhosts:
for name in vhost.get_names():
if vhost.ssl:
@@ -496,7 +584,7 @@ class ApacheConfigurator(common.Installer):
# Make sure we create SSL vhosts for the ones that are HTTP only
# if requested.
return_vhosts = list()
return_vhosts = []
for vhost in dialog_output:
if not vhost.ssl:
return_vhosts.append(self.make_vhost_ssl(vhost))
@@ -517,6 +605,11 @@ class ApacheConfigurator(common.Installer):
# cert_key... can all be parsed appropriately
self.prepare_server_https("443")
# If we haven't managed to enable mod_ssl by this point, error out
if "ssl_module" not in self.parser.modules:
raise errors.MisconfigurationError("Could not find ssl_module; "
"not installing certificate.")
# Add directives and remove duplicates
self._add_dummy_ssl_directives(vhost.path)
self._clean_vhost(vhost)
@@ -531,21 +624,6 @@ class ApacheConfigurator(common.Installer):
path["chain_path"] = self.parser.find_dir(
"SSLCertificateChainFile", None, vhost.path)
# Handle errors when certificate/key directives cannot be found
if not path["cert_path"]:
logger.warning(
"Cannot find an SSLCertificateFile directive in %s. "
"VirtualHost was not modified", vhost.path)
raise errors.PluginError(
"Unable to find an SSLCertificateFile directive")
elif not path["cert_key"]:
logger.warning(
"Cannot find an SSLCertificateKeyFile directive for "
"certificate in %s. VirtualHost was not modified", vhost.path)
raise errors.PluginError(
"Unable to find an SSLCertificateKeyFile directive for "
"certificate")
logger.info("Deploying Certificate to VirtualHost %s", vhost.filep)
if self.version < (2, 4, 8) or (chain_path and not fullchain_path):
@@ -792,7 +870,7 @@ class ApacheConfigurator(common.Installer):
return util.get_filtered_names(all_names)
def get_name_from_ip(self, addr): # pylint: disable=no-self-use
def get_name_from_ip(self, addr):
"""Returns a reverse dns name if available.
:param addr: IP Address
@@ -907,7 +985,7 @@ class ApacheConfigurator(common.Installer):
"""
v1_vhosts = self.get_virtual_hosts_v1()
if self.USE_PARSERNODE:
if self.USE_PARSERNODE and HAS_APACHECONFIG:
v2_vhosts = self.get_virtual_hosts_v2()
for v1_vh in v1_vhosts:
@@ -1241,6 +1319,14 @@ class ApacheConfigurator(common.Installer):
self.enable_mod("socache_shmcb", temp=temp)
if "ssl_module" not in self.parser.modules:
self.enable_mod("ssl", temp=temp)
# Make sure we're not throwing away any unwritten changes to the config
self.parser.ensure_augeas_state()
self.parser.aug.load()
self.parser.reset_modules() # Reset to load the new ssl_module path
# Call again because now we can gate on openssl version
self.install_ssl_options_conf(self.mod_ssl_conf,
self.updated_mod_ssl_conf_digest,
warn_on_no_mod_ssl=True)
def make_vhost_ssl(self, nonssl_vhost):
"""Makes an ssl_vhost version of a nonssl_vhost.
@@ -1493,7 +1579,7 @@ class ApacheConfigurator(common.Installer):
result.append(comment)
sift = True
result.append('\n'.join(['# ' + l for l in chunk]))
result.append('\n'.join('# ' + l for l in chunk))
else:
result.append('\n'.join(chunk))
return result, sift
@@ -1633,7 +1719,7 @@ class ApacheConfigurator(common.Installer):
for addr in vhost.addrs:
# In Apache 2.2, when a NameVirtualHost directive is not
# set, "*" and "_default_" will conflict when sharing a port
addrs = set((addr,))
addrs = {addr,}
if addr.get_addr() in ("*", "_default_"):
addrs.update(obj.Addr((a, addr.get_port(),))
for a in ("*", "_default_"))
@@ -1726,7 +1812,7 @@ class ApacheConfigurator(common.Installer):
######################################################################
# Enhancements
######################################################################
def supported_enhancements(self): # pylint: disable=no-self-use
def supported_enhancements(self):
"""Returns currently supported enhancements."""
return ["redirect", "ensure-http-header", "staple-ocsp"]
@@ -1824,7 +1910,7 @@ class ApacheConfigurator(common.Installer):
try:
self._autohsts = self.storage.fetch("autohsts")
except KeyError:
self._autohsts = dict()
self._autohsts = {}
def _autohsts_save_state(self):
"""
@@ -1946,7 +2032,7 @@ class ApacheConfigurator(common.Installer):
ssl_vhost.filep)
def _verify_no_matching_http_header(self, ssl_vhost, header_substring):
"""Checks to see if an there is an existing Header directive that
"""Checks to see if there is an existing Header directive that
contains the string header_substring.
:param ssl_vhost: vhost to check
@@ -2292,7 +2378,7 @@ class ApacheConfigurator(common.Installer):
vhost.enabled = True
return
def enable_mod(self, mod_name, temp=False): # pylint: disable=unused-argument
def enable_mod(self, mod_name, temp=False):
"""Enables module in Apache.
Both enables and reloads Apache so module is active.
@@ -2350,7 +2436,7 @@ class ApacheConfigurator(common.Installer):
error = str(err)
raise errors.MisconfigurationError(error)
def config_test(self): # pylint: disable=no-self-use
def config_test(self):
"""Check the configuration of Apache for errors.
:raises .errors.MisconfigurationError: If config_test fails
@@ -2385,7 +2471,7 @@ class ApacheConfigurator(common.Installer):
if len(matches) != 1:
raise errors.PluginError("Unable to find Apache version")
return tuple([int(i) for i in matches[0].split(".")])
return tuple(int(i) for i in matches[0].split("."))
def more_info(self):
"""Human-readable string to help understand the module"""
@@ -2400,7 +2486,7 @@ class ApacheConfigurator(common.Installer):
###########################################################################
# Challenges Section
###########################################################################
def get_chall_pref(self, unused_domain): # pylint: disable=no-self-use
def get_chall_pref(self, unused_domain):
"""Return list of challenge preferences."""
return [challenges.HTTP01]
@@ -2454,14 +2540,19 @@ class ApacheConfigurator(common.Installer):
self.restart()
self.parser.reset_modules()
def install_ssl_options_conf(self, options_ssl, options_ssl_digest):
"""Copy Certbot's SSL options file into the system's config dir if required."""
def install_ssl_options_conf(self, options_ssl, options_ssl_digest, warn_on_no_mod_ssl=True):
"""Copy Certbot's SSL options file into the system's config dir if required.
:param bool warn_on_no_mod_ssl: True if we should warn if mod_ssl is not found.
"""
# XXX if we ever try to enforce a local privilege boundary (eg, running
# certbot for unprivileged users via setuid), this function will need
# to be modified.
return common.install_version_controlled_file(options_ssl, options_ssl_digest,
self.option("MOD_SSL_CONF_SRC"), constants.ALL_SSL_OPTIONS_HASHES)
apache_config_path = self.pick_apache_config(warn_on_no_mod_ssl)
return common.install_version_controlled_file(
options_ssl, options_ssl_digest, apache_config_path, constants.ALL_SSL_OPTIONS_HASHES)
def enable_autohsts(self, _unused_lineage, domains):
"""
@@ -2471,7 +2562,7 @@ class ApacheConfigurator(common.Installer):
:type _unused_lineage: certbot._internal.storage.RenewableCert
:param domains: List of domains in certificate to enhance
:type domains: str
:type domains: `list` of `str`
"""
self._autohsts_fetch_state()
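To make the gating in `pick_apache_config()` easier to follow, here is the decision rule in isolation (a standalone sketch mirroring the checks above, not configurator code):

```
from distutils.version import LooseVersion

def use_current_tls_config(apache_version, openssl_version):
    """True when session tickets can be disabled (Apache 2.4.11+ and OpenSSL 1.0.2l+)."""
    if apache_version < (2, 4, 11):
        return False
    if not openssl_version or LooseVersion(openssl_version) < LooseVersion('1.0.2l'):
        return False
    return True

assert use_current_tls_config((2, 4, 41), '1.1.1d')
assert not use_current_tls_config((2, 4, 10), '1.1.1g')
assert not use_current_tls_config((2, 4, 41), '1.0.2k')
assert not use_current_tls_config((2, 4, 41), None)  # OpenSSL version undetermined
```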

View File

@@ -24,6 +24,9 @@ ALL_SSL_OPTIONS_HASHES = [
'0fcdc81280cd179a07ec4d29d3595068b9326b455c488de4b09f585d5dafc137',
'86cc09ad5415cd6d5f09a947fe2501a9344328b1e8a8b458107ea903e80baa6c',
'06675349e457eae856120cdebb564efe546f0b87399f2264baeb41e442c724c7',
'5cc003edd93fb9cd03d40c7686495f8f058f485f75b5e764b789245a386e6daf',
'007cd497a56a3bb8b6a2c1aeb4997789e7e38992f74e44cc5d13a625a738ac73',
'34783b9e2210f5c4a23bced2dfd7ec289834716673354ed7c7abf69fe30192a3',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""

View File

@@ -21,7 +21,7 @@ def select_vhost_multiple(vhosts):
:rtype: :class:`list`of type `~obj.Vhost`
"""
if not vhosts:
return list()
return []
tags_list = [vhost.display_repr()+"\n" for vhost in vhosts]
# Remove the extra newline from the last entry
if tags_list:
@@ -37,7 +37,7 @@ def select_vhost_multiple(vhosts):
def _reversemap_vhosts(names, vhosts):
"""Helper function for select_vhost_multiple for mapping string
representations back to actual vhost objects"""
return_vhosts = list()
return_vhosts = []
for selection in names:
for vhost in vhosts:

View File

@@ -49,7 +49,7 @@ class DualNodeBase(object):
pass_primary = assertions.isPassNodeList(primary_res)
pass_secondary = assertions.isPassNodeList(secondary_res)
new_nodes = list()
new_nodes = []
if pass_primary and pass_secondary:
# Both unimplemented
@@ -221,7 +221,7 @@ class DualBlockNode(DualNodeBase):
implementations to a list of tuples.
"""
matched = list()
matched = []
for p in primary_list:
match = None
for s in secondary_list:

View File

@@ -1,7 +1,5 @@
""" Entry point for Apache Plugin """
# Pylint does not like disutils.version when running inside a venv.
# See: https://github.com/PyCQA/pylint/issues/73
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error
from distutils.version import LooseVersion
from certbot import util
from certbot_apache._internal import configurator

View File

@@ -1,8 +1,9 @@
"""A class that performs HTTP-01 challenges for Apache"""
import logging
import errno
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List
from acme.magic_typing import Set
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
@@ -168,7 +169,15 @@ class ApacheHttp01(common.ChallengePerformer):
def _set_up_challenges(self):
if not os.path.isdir(self.challenge_dir):
filesystem.makedirs(self.challenge_dir, 0o755)
old_umask = filesystem.umask(0o022)
try:
filesystem.makedirs(self.challenge_dir, 0o755)
except OSError as exception:
if exception.errno not in (errno.EEXIST, errno.EISDIR):
raise errors.PluginError(
"Couldn't create root for http-01 challenge")
finally:
filesystem.umask(old_umask)
responses = []
for achall in self.achalls:
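The umask handling added above, shown in isolation (the directory path is made up for the sketch): forcing umask 022 around `makedirs` ensures the challenge directory really ends up mode 0755 even when the process runs under a stricter umask.

```
from certbot.compat import filesystem

old_umask = filesystem.umask(0o022)
try:
    # /tmp/http-challenge-example is a hypothetical path for this sketch.
    filesystem.makedirs("/tmp/http-challenge-example", 0o755)
finally:
    filesystem.umask(old_umask)
```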

View File

@@ -102,7 +102,6 @@ For this reason the internal representation of data should not ignore the case.
import abc
import six
from acme.magic_typing import Any, Dict, Optional, Tuple # pylint: disable=unused-import, no-name-in-module
@six.add_metaclass(abc.ABCMeta)

View File

@@ -1,7 +1,7 @@
"""Module contains classes used by the Apache Configurator."""
import re
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set
from certbot.plugins import common

View File

@@ -1,26 +0,0 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLCompression off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common
#CustomLog /var/log/apache2/access.log vhost_combined
#LogLevel warn
#ErrorLog /var/log/apache2/error.log
# Always ensure Cookies have "Secure" set (JAH 2012/1)
#Header edit Set-Cookie (?i)^(.*)(;\s*secure)??((\s*;)?(.*)) "$1; Secure$3$4"

View File

@@ -1,9 +1,7 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@@ -26,6 +24,5 @@ class ArchConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None,
)
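Across all of the distribution overrides in this changeset the pattern is the same: the pkg_resources-computed MOD_SSL_CONF_SRC entry is dropped (the shipped TLS config is now chosen at runtime via pick_apache_config, as the test changes later in this diff show) and a new bin key for the httpd binary is added. A hypothetical sketch of the resulting defaults for this override; keys and values are illustrative, not copied from Certbot:

```python
# Hypothetical shape of the Arch override's OS_DEFAULTS after this change.
OS_DEFAULTS = dict(
    server_root="/etc/httpd",
    challenge_location="/etc/httpd/conf",
    handle_modules=False,
    handle_sites=False,
    bin=None,   # path to the httpd binary; useful when ssl_module is statically linked
)
```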

View File

@@ -1,14 +1,12 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import logging
import pkg_resources
import zope.interface
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot.errors import MisconfigurationError
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
@@ -37,8 +35,7 @@ class CentOSConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
bin=None,
)
def config_test(self):

View File

@@ -1,9 +1,7 @@
""" Distribution specific override class for macOS """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@@ -26,6 +24,5 @@ class DarwinConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/other",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None,
)

View File

@@ -1,7 +1,6 @@
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
import logging
import pkg_resources
import zope.interface
from certbot import errors
@@ -34,8 +33,7 @@ class DebianConfigurator(configurator.ApacheConfigurator):
handle_modules=True,
handle_sites=True,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None,
)
def enable_site(self, vhost):

View File

@@ -1,11 +1,9 @@
""" Distribution specific override class for Fedora 29+ """
import pkg_resources
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
@@ -31,9 +29,7 @@ class FedoraConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
# TODO: eventually newest version of Fedora will need their own config
"certbot_apache", os.path.join("_internal", "centos-options-ssl-apache.conf"))
bin=None,
)
def config_test(self):

View File

@@ -1,9 +1,7 @@
""" Distribution specific override class for Gentoo Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
@@ -29,8 +27,7 @@ class GentooConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None,
)
def _prepare_options(self):

View File

@@ -1,9 +1,7 @@
""" Distribution specific override class for OpenSUSE """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@@ -26,6 +24,5 @@ class OpenSUSEConfigurator(configurator.ApacheConfigurator):
handle_modules=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "options-ssl-apache.conf"))
bin=None,
)

View File

@@ -7,9 +7,8 @@ import sys
import six
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict
from acme.magic_typing import List
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
@@ -31,7 +30,7 @@ class ApacheParser(object):
"""
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
fnmatch_chars = {"*", "?", "\\", "[", "]"}
def __init__(self, root, vhostroot=None, version=(2, 4),
configurator=None):
@@ -52,7 +51,7 @@ class ApacheParser(object):
"version 1.2.0 or higher, please make sure you have you have "
"those installed.")
self.modules = set() # type: Set[str]
self.modules = {} # type: Dict[str, str]
self.parser_paths = {} # type: Dict[str, List[str]]
self.variables = {} # type: Dict[str, str]
@@ -249,14 +248,14 @@ class ApacheParser(object):
def add_mod(self, mod_name):
"""Shortcut for updating parser modules."""
if mod_name + "_module" not in self.modules:
self.modules.add(mod_name + "_module")
self.modules[mod_name + "_module"] = None
if "mod_" + mod_name + ".c" not in self.modules:
self.modules.add("mod_" + mod_name + ".c")
self.modules["mod_" + mod_name + ".c"] = None
def reset_modules(self):
"""Reset the loaded modules list. This is called from cleanup to clear
temporarily loaded modules."""
self.modules = set()
self.modules = {}
self.update_modules()
self.parse_modules()
@@ -267,7 +266,7 @@ class ApacheParser(object):
the iteration issue. Else... parse and enable mods at same time.
"""
mods = set() # type: Set[str]
mods = {} # type: Dict[str, str]
matches = self.find_dir("LoadModule")
iterator = iter(matches)
# Make sure prev_size != cur_size for do: while: iteration
@@ -281,8 +280,8 @@ class ApacheParser(object):
mod_name = self.get_arg(match_name)
mod_filename = self.get_arg(match_filename)
if mod_name and mod_filename:
mods.add(mod_name)
mods.add(os.path.basename(mod_filename)[:-2] + "c")
mods[mod_name] = mod_filename
mods[os.path.basename(mod_filename)[:-2] + "c"] = mod_filename
else:
logger.debug("Could not read LoadModule directive from Augeas path: %s",
match_name[6:])
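The parser changes above replace the modules set with a dict, so each loaded module can also remember the file path it was loaded from (None when the path is unknown). A small illustration of the new layout; the paths are made up:

```python
modules = {}  # was: modules = set()

# Loaded via a LoadModule directive: remember where the shared object lives.
modules["ssl_module"] = "/usr/lib/apache2/modules/mod_ssl.so"
modules["mod_ssl.c"] = "/usr/lib/apache2/modules/mod_ssl.so"

# Registered through add_mod() without a known path.
modules["rewrite_module"] = None

# Membership checks are unchanged, because "in" tests dict keys:
assert "ssl_module" in modules
assert "headers_module" not in modules
```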
@@ -321,7 +320,7 @@ class ApacheParser(object):
for mod in matches:
self.add_mod(mod.strip())
def filter_args_num(self, matches, args): # pylint: disable=no-self-use
def filter_args_num(self, matches, args):
"""Filter out directives with specific number of arguments.
This function makes the assumption that all related arguments are given
@@ -621,7 +620,7 @@ class ApacheParser(object):
def exclude_dirs(self, matches):
"""Exclude directives that are not loaded into the configuration."""
filters = [("ifmodule", self.modules), ("ifdefine", self.variables)]
filters = [("ifmodule", self.modules.keys()), ("ifdefine", self.variables)]
valid_matches = []
@@ -662,6 +661,25 @@ class ApacheParser(object):
return True
def standard_path_from_server_root(self, arg):
"""Ensure paths are consistent and absolute
:param str arg: Argument of directive
:returns: Standardized argument path
:rtype: str
"""
# Remove beginning and ending quotes
arg = arg.strip("'\"")
# Standardize the include argument based on server root
if not arg.startswith("/"):
# Normpath will condense ../
arg = os.path.normpath(os.path.join(self.root, arg))
else:
arg = os.path.normpath(arg)
return arg
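A worked example of the new helper, assuming a parser rooted at /etc/apache2:

```python
import os

root = "/etc/apache2"                       # assumed parser root
arg = "'sites-enabled/../conf.d/ssl.conf'"  # as it might appear in an Include directive

arg = arg.strip("'\"")                      # drop surrounding quotes
if not arg.startswith("/"):
    arg = os.path.normpath(os.path.join(root, arg))   # relative: anchor at server root
else:
    arg = os.path.normpath(arg)

print(arg)  # /etc/apache2/conf.d/ssl.conf
```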
def _get_include_path(self, arg):
"""Converts an Apache Include directive into Augeas path.
@@ -682,16 +700,7 @@ class ApacheParser(object):
# if matchObj.group() != arg:
# logger.error("Error: Invalid regexp characters in %s", arg)
# return []
# Remove beginning and ending quotes
arg = arg.strip("'\"")
# Standardize the include argument based on server root
if not arg.startswith("/"):
# Normpath will condense ../
arg = os.path.normpath(os.path.join(self.root, arg))
else:
arg = os.path.normpath(arg)
arg = self.standard_path_from_server_root(arg)
# Attempts to add a transform to the file if one does not already exist
if os.path.isdir(arg):
@@ -705,7 +714,7 @@ class ApacheParser(object):
split_arg = arg.split("/")
for idx, split in enumerate(split_arg):
if any(char in ApacheParser.fnmatch_chars for char in split):
# Turn it into a augeas regex
# Turn it into an augeas regex
# TODO: Can this be an augeas glob instead of a regex?
split_arg[idx] = ("* [label()=~regexp('%s')]" %
self.fnmatch_to_re(split))
@@ -715,7 +724,7 @@ class ApacheParser(object):
return get_aug_path(arg)
def fnmatch_to_re(self, clean_fn_match): # pylint: disable=no-self-use
def fnmatch_to_re(self, clean_fn_match):
"""Method converts Apache's basic fnmatch to regular expression.
Assumption - Configs are assumed to be well-formed and only writable by
@@ -732,7 +741,7 @@ class ApacheParser(object):
"""
if sys.version_info < (3, 6):
# This strips off final /Z(?ms)
return fnmatch.translate(clean_fn_match)[:-7]
return fnmatch.translate(clean_fn_match)[:-7] # pragma: no cover
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
return fnmatch.translate(clean_fn_match)[4:-3] # pragma: no cover
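The version-dependent slicing exists because fnmatch.translate changed its output format in Python 3.6. A small sketch of the assumed behaviour (worth re-checking on the interpreter in use):

```python
import fnmatch
import sys

pattern = fnmatch.translate("*.conf")
if sys.version_info < (3, 6):
    # e.g. '.*\\.conf\\Z(?ms)' -> drop the trailing '\\Z(?ms)' (7 characters)
    regex = pattern[:-7]
else:
    # e.g. '(?s:.*\\.conf)\\Z' -> drop the leading '(?s:' and trailing ')\\Z'
    regex = pattern[4:-3]

print(regex)  # .*\.conf
```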
@@ -936,8 +945,8 @@ def case_i(string):
:param str string: string to make case i regex
"""
return "".join(["[" + c.upper() + c.lower() + "]"
if c.isalpha() else c for c in re.escape(string)])
return "".join("[" + c.upper() + c.lower() + "]"
if c.isalpha() else c for c in re.escape(string))
def get_aug_path(file_path):
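For reference, the rewritten case_i builds a case-insensitive literal by expanding every alphabetic character into a two-character class:

```python
import re

def case_i(string):
    """Return a regex that matches *string* case-insensitively."""
    return "".join("[" + c.upper() + c.lower() + "]"
                   if c.isalpha() else c for c in re.escape(string))

print(case_i("ServerName"))
# [Ss][Ee][Rr][Vv][Ee][Rr][Nn][Aa][Mm][Ee]
```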

View File

@@ -11,7 +11,7 @@ def validate_kwargs(kwargs, required_names):
:param list required_names: List of required parameter names.
"""
validated_kwargs = dict()
validated_kwargs = {}
for name in required_names:
try:
validated_kwargs[name] = kwargs.pop(name)

View File

@@ -0,0 +1,19 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1
SSLCipherSuite ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
SSLHonorCipherOrder off
SSLSessionTickets off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common

View File

@@ -0,0 +1,18 @@
# This file contains important security parameters. If you modify this file
# manually, Certbot will be unable to automatically provide future security
# updates. Instead, Certbot will print and log an error message with a path to
# the up-to-date file that you will need to refer to when manually updating
# this file.
SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3 -TLSv1 -TLSv1.1
SSLCipherSuite ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
SSLHonorCipherOrder off
SSLOptions +StrictRequire
# Add vhost name to log entries:
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-agent}i\"" vhost_combined
LogFormat "%v %h %l %u %t \"%r\" %>s %b" vhost_common

View File

@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
-e certbot[dev]
certbot[dev]==1.6.0

View File

@@ -1,25 +1,35 @@
from distutils.version import LooseVersion
import sys
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.1.0.dev0'
version = '1.6.0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.29.0',
'certbot>=1.0.0.dev0',
'mock',
'certbot>=1.6.0',
'python-augeas',
'setuptools',
'zope.component',
'zope.interface',
]
setuptools_known_environment_markers = (LooseVersion(setuptools_version) >= LooseVersion('36.2'))
if setuptools_known_environment_markers:
install_requires.append('mock ; python_version < "3.3"')
elif 'bdist_wheel' in sys.argv[1:]:
raise RuntimeError('Error, you are trying to build certbot wheels using an old version '
'of setuptools. Version 36.2+ of setuptools is required.')
elif sys.version_info < (3,3):
install_requires.append('mock')
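The block above declares mock as a conditional dependency: with setuptools 36.2+ the PEP 508 environment marker is recorded in the package metadata and evaluated by pip at install time, while older setuptools either appends mock unconditionally on Python 2 or refuses to build wheels. A sketch of how such a marker evaluates, using the third-party packaging library (an assumption for illustration, not a dependency of this package):

```python
from packaging.markers import Marker  # assumed to be installed separately

marker = Marker('python_version < "3.3"')
print(marker.evaluate())  # False on any Python 3.3+ interpreter, so mock is skipped
```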
dev_extras = [
'apacheconfig>=0.3.1',
'apacheconfig>=0.3.2',
]
class PyTest(TestCommand):
@@ -45,7 +55,7 @@ setup(
author="Certbot Project",
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -56,7 +66,6 @@ setup(
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
'Programming Language :: Python :: 3.7',

View File

@@ -1,14 +1,27 @@
"""Tests for AugeasParserNode classes"""
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import os
import util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser
def _get_augeasnode_mock(filepath):
""" Helper function for mocking out DualNode instance with an AugeasNode """
def augeasnode_mock(metadata):
return augeasparser.AugeasBlockNode(
name=assertions.PASS,
ancestor=None,
filepath=filepath,
metadata=metadata)
return augeasnode_mock
class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-methods
"""Test AugeasParserNode using available test configurations"""
@@ -16,8 +29,11 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
def setUp(self): # pylint: disable=arguments-differ
super(AugeasParserNodeTest, self).setUp()
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir, use_parsernode=True)
with mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.get_parsernode_root") as mock_parsernode:
mock_parsernode.side_effect = _get_augeasnode_mock(
os.path.join(self.config_path, "apache2.conf"))
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir, use_parsernode=True)
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
@@ -111,7 +127,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
name=servernames[0].name,
parameters=["test", "setting", "these"],
ancestor=assertions.PASS,
metadata=servernames[0].primary.metadata
metadata=servernames[0].metadata
)
self.assertTrue(mock_set.called)
self.assertEqual(
@@ -146,26 +162,26 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertTrue(found)
def test_add_child_comment(self):
newc = self.config.parser_root.primary.add_child_comment("The content")
newc = self.config.parser_root.add_child_comment("The content")
comments = self.config.parser_root.find_comments("The content")
self.assertEqual(len(comments), 1)
self.assertEqual(
newc.metadata["augeaspath"],
comments[0].primary.metadata["augeaspath"]
comments[0].metadata["augeaspath"]
)
self.assertEqual(newc.comment, comments[0].comment)
def test_delete_child(self):
listens = self.config.parser_root.primary.find_directives("Listen")
listens = self.config.parser_root.find_directives("Listen")
self.assertEqual(len(listens), 1)
self.config.parser_root.primary.delete_child(listens[0])
self.config.parser_root.delete_child(listens[0])
listens = self.config.parser_root.primary.find_directives("Listen")
listens = self.config.parser_root.find_directives("Listen")
self.assertEqual(len(listens), 0)
def test_delete_child_not_found(self):
listen = self.config.parser_root.find_directives("Listen")[0]
listen.primary.metadata["augeaspath"] = "/files/something/nonexistent"
listen.metadata["augeaspath"] = "/files/something/nonexistent"
self.assertRaises(
errors.PluginError,
@@ -178,10 +194,10 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
"NewBlock",
["first", "second"]
)
rpath, _, directive = nb.primary.metadata["augeaspath"].rpartition("/")
rpath, _, directive = nb.metadata["augeaspath"].rpartition("/")
self.assertEqual(
rpath,
self.config.parser_root.primary.metadata["augeaspath"]
self.config.parser_root.metadata["augeaspath"]
)
self.assertTrue(directive.startswith("NewBlock"))
@@ -190,8 +206,8 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
"Beginning",
position=0
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
parser = self.config.parser_root.parser
root_path = self.config.parser_root.metadata["augeaspath"]
# Get first child
first = parser.aug.match("{}/*[1]".format(root_path))
self.assertTrue(first[0].endswith("Beginning"))
@@ -200,8 +216,8 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.config.parser_root.add_child_block(
"VeryLast",
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
parser = self.config.parser_root.parser
root_path = self.config.parser_root.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLast"))
@@ -211,8 +227,8 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
"VeryLastAlt",
position=99999
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
parser = self.config.parser_root.parser
root_path = self.config.parser_root.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLastAlt"))
@@ -222,15 +238,15 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
"Middle",
position=5
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
parser = self.config.parser_root.parser
root_path = self.config.parser_root.metadata["augeaspath"]
# Augeas indices start at 1 :(
middle = parser.aug.match("{}/*[6]".format(root_path))
self.assertTrue(middle[0].endswith("Middle"))
def test_add_child_block_existing_name(self):
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
parser = self.config.parser_root.parser
root_path = self.config.parser_root.metadata["augeaspath"]
# There already exists a single VirtualHost in the base config
new_block = parser.aug.match("{}/VirtualHost[2]".format(root_path))
self.assertEqual(len(new_block), 0)
@@ -239,7 +255,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
)
new_block = parser.aug.match("{}/VirtualHost[2]".format(root_path))
self.assertEqual(len(new_block), 1)
self.assertTrue(vh.primary.metadata["augeaspath"].endswith("VirtualHost[2]"))
self.assertTrue(vh.metadata["augeaspath"].endswith("VirtualHost[2]"))
def test_node_init_error_bad_augeaspath(self):
from certbot_apache._internal.augeasparser import AugeasBlockNode
@@ -284,7 +300,7 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertEqual(len(dirs), 1)
self.assertEqual(dirs[0].parameters, ("with", "parameters"))
# The new directive was added to the very first line of the config
self.assertTrue(dirs[0].primary.metadata["augeaspath"].endswith("[1]"))
self.assertTrue(dirs[0].metadata["augeaspath"].endswith("[1]"))
def test_add_child_directive_exception(self):
self.assertRaises(
@@ -314,6 +330,6 @@ class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-
self.assertTrue(nonmacro_test)
def test_find_ancestors_bad_path(self):
self.config.parser_root.primary.metadata["augeaspath"] = ""
ancs = self.config.parser_root.primary.find_ancestors("Anything")
self.config.parser_root.metadata["augeaspath"] = ""
ancs = self.config.parser_root.find_ancestors("Anything")
self.assertEqual(len(ancs), 0)

View File

@@ -3,7 +3,10 @@
import re
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import six # pylint: disable=unused-import # six is used in mock.patch()
from certbot import errors
@@ -20,10 +23,10 @@ class AutoHSTSTest(util.ApacheTest):
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir)
self.config.parser.modules.add("headers_module")
self.config.parser.modules.add("mod_headers.c")
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["headers_module"] = None
self.config.parser.modules["mod_headers.c"] = None
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
@@ -42,8 +45,8 @@ class AutoHSTSTest(util.ApacheTest):
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_autohsts_enable_headers_mod(self, mock_enable, _restart):
self.config.parser.modules.discard("headers_module")
self.config.parser.modules.discard("mod_header.c")
self.config.parser.modules.pop("headers_module", None)
self.config.parser.modules.pop("mod_header.c", None)
self.config.enable_autohsts(mock.MagicMock(), ["ocspvhost.com"])
self.assertTrue(mock_enable.called)

View File

@@ -19,12 +19,12 @@ def get_vh_truth(temp_dir, config_name):
obj.VirtualHost(
os.path.join(prefix, "test.example.com.conf"),
os.path.join(aug_pre, "test.example.com.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "test.example.com"),
obj.VirtualHost(
os.path.join(prefix, "ssl.conf"),
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]),
{obj.Addr.fromstring("_default_:443")},
True, True, None)
]
return vh_truth
@@ -104,7 +104,7 @@ class CentOS6Tests(util.ApacheTest):
pre_loadmods = self.config.parser.find_dir(
"LoadModule", "ssl_module", exclude=False)
# LoadModules are not within IfModule blocks
self.assertFalse(any(["ifmodule" in m.lower() for m in pre_loadmods]))
self.assertFalse(any("ifmodule" in m.lower() for m in pre_loadmods))
self.config.assoc["test.example.com"] = self.vh_truth[0]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",

View File

@@ -1,7 +1,10 @@
"""Test for certbot_apache._internal.configurator for Centos overrides"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import filesystem
@@ -21,12 +24,12 @@ def get_vh_truth(temp_dir, config_name):
obj.VirtualHost(
os.path.join(prefix, "centos.example.com.conf"),
os.path.join(aug_pre, "centos.example.com.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "centos.example.com"),
obj.VirtualHost(
os.path.join(prefix, "ssl.conf"),
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]),
{obj.Addr.fromstring("_default_:443")},
True, True, None)
]
return vh_truth
@@ -126,7 +129,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
return mod_val
return ""
mock_get.side_effect = mock_get_cfg
self.config.parser.modules = set()
self.config.parser.modules = {}
self.config.parser.variables = {}
with mock.patch("certbot.util.get_os_info") as mock_osi:

View File

@@ -2,7 +2,10 @@
import shutil
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
import util

View File

@@ -6,7 +6,10 @@ import socket
import tempfile
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import six # pylint: disable=unused-import # six is used in mock.patch()
from acme import challenges
@@ -99,7 +102,7 @@ class MultipleVhostsTest(util.ApacheTest):
parserargs = ["server_root", "enmod", "dismod", "le_vhost_ext",
"vhost_root", "logs_root", "challenge_location",
"handle_modules", "handle_sites", "ctl"]
exp = dict()
exp = {}
for k in ApacheConfigurator.OS_DEFAULTS:
if k in parserargs:
@@ -140,11 +143,9 @@ class MultipleVhostsTest(util.ApacheTest):
mock_utility = mock_getutility()
mock_utility.notification = mock.MagicMock(return_value=True)
names = self.config.get_all_names()
self.assertEqual(names, set(
["certbot.demo", "ocspvhost.com", "encryption-example.demo",
self.assertEqual(names, {"certbot.demo", "ocspvhost.com", "encryption-example.demo",
"nonsym.link", "vhost.in.rootconf", "www.certbot.demo",
"duplicate.example.com"]
))
"duplicate.example.com"})
@certbot_util.patch_get_utility()
@mock.patch("certbot_apache._internal.configurator.socket.gethostbyaddr")
@@ -154,9 +155,9 @@ class MultipleVhostsTest(util.ApacheTest):
mock_utility.notification.return_value = True
vhost = obj.VirtualHost(
"fp", "ap",
set([obj.Addr(("8.8.8.8", "443")),
{obj.Addr(("8.8.8.8", "443")),
obj.Addr(("zombo.com",)),
obj.Addr(("192.168.1.2"))]),
obj.Addr(("192.168.1.2"))},
True, False)
self.config.vhosts.append(vhost)
@@ -185,7 +186,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_bad_servername_alias(self):
ssl_vh1 = obj.VirtualHost(
"fp1", "ap1", set([obj.Addr(("*", "443"))]),
"fp1", "ap1", {obj.Addr(("*", "443"))},
True, False)
# pylint: disable=protected-access
self.config._add_servernames(ssl_vh1)
@@ -198,7 +199,7 @@ class MultipleVhostsTest(util.ApacheTest):
# pylint: disable=protected-access
self.config._add_servernames(self.vh_truth[2])
self.assertEqual(
self.vh_truth[2].get_names(), set(["*.le.co", "ip-172-30-0-17"]))
self.vh_truth[2].get_names(), {"*.le.co", "ip-172-30-0-17"})
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
@@ -269,7 +270,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_choose_vhost_select_vhost_conflicting_non_ssl(self, mock_select):
mock_select.return_value = self.vh_truth[3]
conflicting_vhost = obj.VirtualHost(
"path", "aug_path", set([obj.Addr.fromstring("*:443")]),
"path", "aug_path", {obj.Addr.fromstring("*:443")},
True, True)
self.config.vhosts.append(conflicting_vhost)
@@ -278,14 +279,14 @@ class MultipleVhostsTest(util.ApacheTest):
def test_find_best_http_vhost_default(self):
vh = obj.VirtualHost(
"fp", "ap", set([obj.Addr.fromstring("_default_:80")]), False, True)
"fp", "ap", {obj.Addr.fromstring("_default_:80")}, False, True)
self.config.vhosts = [vh]
self.assertEqual(self.config.find_best_http_vhost("foo.bar", False), vh)
def test_find_best_http_vhost_port(self):
port = "8080"
vh = obj.VirtualHost(
"fp", "ap", set([obj.Addr.fromstring("*:" + port)]),
"fp", "ap", {obj.Addr.fromstring("*:" + port)},
False, True, "encryption-example.demo")
self.config.vhosts.append(vh)
self.assertEqual(self.config.find_best_http_vhost("foo.bar", False, port), vh)
@@ -313,8 +314,8 @@ class MultipleVhostsTest(util.ApacheTest):
def test_find_best_vhost_variety(self):
# pylint: disable=protected-access
ssl_vh = obj.VirtualHost(
"fp", "ap", set([obj.Addr(("*", "443")),
obj.Addr(("zombo.com",))]),
"fp", "ap", {obj.Addr(("*", "443")),
obj.Addr(("zombo.com",))},
True, False)
self.config.vhosts.append(ssl_vh)
self.assertEqual(self.config._find_best_vhost("zombo.com"), ssl_vh)
@@ -341,9 +342,9 @@ class MultipleVhostsTest(util.ApacheTest):
def test_deploy_cert_enable_new_vhost(self):
# Create
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
@@ -377,9 +378,9 @@ class MultipleVhostsTest(util.ApacheTest):
# pragma: no cover
def test_deploy_cert(self):
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
# Patch _add_dummy_ssl_directives to make sure we write them correctly
# pylint: disable=protected-access
orig_add_dummy = self.config._add_dummy_ssl_directives
@@ -454,41 +455,6 @@ class MultipleVhostsTest(util.ApacheTest):
"SSLCertificateChainFile", "two/cert_chain.pem",
self.vh_truth[1].path))
def test_deploy_cert_invalid_vhost(self):
"""For test cases where the `ApacheConfigurator` class' `_deploy_cert`
method is called with an invalid vhost parameter. Currently this tests
that a PluginError is appropriately raised when important directives
are missing in an SSL module."""
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
def side_effect(*args):
"""Mocks case where an SSLCertificateFile directive can be found
but an SSLCertificateKeyFile directive is missing."""
if "SSLCertificateFile" in args:
return ["example/cert.pem"]
return []
mock_find_dir = mock.MagicMock(return_value=[])
mock_find_dir.side_effect = side_effect
self.config.parser.find_dir = mock_find_dir
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
self.assertRaises(
errors.PluginError, self.config.deploy_cert, "random.demo",
"example/cert.pem", "example/key.pem", "example/cert_chain.pem")
# Remove side_effect to mock case where both SSLCertificateFile
# and SSLCertificateKeyFile directives are missing
self.config.parser.find_dir.side_effect = None
self.assertRaises(
errors.PluginError, self.config.deploy_cert, "random.demo",
"example/cert.pem", "example/key.pem", "example/cert_chain.pem")
def test_is_name_vhost(self):
addr = obj.Addr.fromstring("*:80")
self.assertTrue(self.config.is_name_vhost(addr))
@@ -544,7 +510,8 @@ class MultipleVhostsTest(util.ApacheTest):
call_found = True
self.assertTrue(call_found)
def test_prepare_server_https(self):
@mock.patch("certbot_apache._internal.parser.ApacheParser.reset_modules")
def test_prepare_server_https(self, mock_reset):
mock_enable = mock.Mock()
self.config.enable_mod = mock_enable
@@ -570,7 +537,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(mock_add_dir.call_count, 2)
def test_prepare_server_https_named_listen(self):
@mock.patch("certbot_apache._internal.parser.ApacheParser.reset_modules")
def test_prepare_server_https_named_listen(self, mock_reset):
mock_find = mock.Mock()
mock_find.return_value = ["test1", "test2", "test3"]
mock_get = mock.Mock()
@@ -608,7 +576,8 @@ class MultipleVhostsTest(util.ApacheTest):
# self.config.prepare_server_https("8080", temp=True)
# self.assertEqual(self.listens, 0)
def test_prepare_server_https_needed_listen(self):
@mock.patch("certbot_apache._internal.parser.ApacheParser.reset_modules")
def test_prepare_server_https_needed_listen(self, mock_reset):
mock_find = mock.Mock()
mock_find.return_value = ["test1", "test2"]
mock_get = mock.Mock()
@@ -624,8 +593,8 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.prepare_server_https("443")
self.assertEqual(mock_add_dir.call_count, 1)
def test_prepare_server_https_mixed_listen(self):
@mock.patch("certbot_apache._internal.parser.ApacheParser.reset_modules")
def test_prepare_server_https_mixed_listen(self, mock_reset):
mock_find = mock.Mock()
mock_find.return_value = ["test1", "test2"]
mock_get = mock.Mock()
@@ -682,7 +651,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(ssl_vhost.path,
"/files" + ssl_vhost.filep + "/IfModule/Virtualhost")
self.assertEqual(len(ssl_vhost.addrs), 1)
self.assertEqual(set([obj.Addr.fromstring("*:443")]), ssl_vhost.addrs)
self.assertEqual({obj.Addr.fromstring("*:443")}, ssl_vhost.addrs)
self.assertEqual(ssl_vhost.name, "encryption-example.demo")
self.assertTrue(ssl_vhost.ssl)
self.assertFalse(ssl_vhost.enabled)
@@ -904,10 +873,10 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot_apache._internal.display_ops.select_vhost")
@mock.patch("certbot.util.exe_exists")
def test_enhance_unknown_vhost(self, mock_exe, mock_sel_vhost, mock_get):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
mock_exe.return_value = True
ssl_vh1 = obj.VirtualHost(
"fp1", "ap1", set([obj.Addr(("*", "443"))]),
"fp1", "ap1", {obj.Addr(("*", "443"))},
True, False)
ssl_vh1.name = "satoshi.com"
self.config.vhosts.append(ssl_vh1)
@@ -942,8 +911,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_ocsp_stapling(self, mock_exe):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 4, 7))
mock_exe.return_value = True
@@ -969,8 +938,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_ocsp_stapling_twice(self, mock_exe):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 4, 7))
mock_exe.return_value = True
@@ -997,8 +966,8 @@ class MultipleVhostsTest(util.ApacheTest):
def test_ocsp_unsupported_apache_version(self, mock_exe):
mock_exe.return_value = True
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
self.config.choose_vhost("certbot.demo")
@@ -1008,7 +977,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_get_http_vhost_third_filter(self):
ssl_vh = obj.VirtualHost(
"fp", "ap", set([obj.Addr(("*", "443"))]),
"fp", "ap", {obj.Addr(("*", "443"))},
True, False)
ssl_vh.name = "satoshi.com"
self.config.vhosts.append(ssl_vh)
@@ -1021,8 +990,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_http_header_hsts(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("headers_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["headers_module"] = None
mock_exe.return_value = True
# This will create an ssl vhost for certbot.demo
@@ -1042,9 +1011,9 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(len(hsts_header), 4)
def test_http_header_hsts_twice(self):
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["mod_ssl.c"] = None
# skip the enable mod
self.config.parser.modules.add("headers_module")
self.config.parser.modules["headers_module"] = None
# This will create an ssl vhost for encryption-example.demo
self.config.choose_vhost("encryption-example.demo")
@@ -1060,8 +1029,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_http_header_uir(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("headers_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["headers_module"] = None
mock_exe.return_value = True
@@ -1084,9 +1053,9 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(len(uir_header), 4)
def test_http_header_uir_twice(self):
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["mod_ssl.c"] = None
# skip the enable mod
self.config.parser.modules.add("headers_module")
self.config.parser.modules["headers_module"] = None
# This will create an ssl vhost for encryption-example.demo
self.config.choose_vhost("encryption-example.demo")
@@ -1101,7 +1070,7 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_redirect_well_formed_http(self, mock_exe, _):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.parser.update_runtime_variables = mock.Mock()
mock_exe.return_value = True
self.config.get_version = mock.Mock(return_value=(2, 2))
@@ -1127,7 +1096,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_rewrite_rule_exists(self):
# Skip the enable mod
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
self.config.parser.add_dir(
self.vh_truth[3].path, "RewriteRule", ["Unknown"])
@@ -1136,7 +1105,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_rewrite_engine_exists(self):
# Skip the enable mod
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
self.config.parser.add_dir(
self.vh_truth[3].path, "RewriteEngine", "on")
@@ -1146,7 +1115,7 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_redirect_with_existing_rewrite(self, mock_exe, _):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.parser.update_runtime_variables = mock.Mock()
mock_exe.return_value = True
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
@@ -1180,7 +1149,7 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_redirect_with_old_https_redirection(self, mock_exe, _):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.parser.update_runtime_variables = mock.Mock()
mock_exe.return_value = True
self.config.get_version = mock.Mock(return_value=(2, 2, 0))
@@ -1209,10 +1178,10 @@ class MultipleVhostsTest(util.ApacheTest):
def test_redirect_with_conflict(self):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
ssl_vh = obj.VirtualHost(
"fp", "ap", set([obj.Addr(("*", "443")),
obj.Addr(("zombo.com",))]),
"fp", "ap", {obj.Addr(("*", "443")),
obj.Addr(("zombo.com",))},
True, False)
# No names ^ this guy should conflict.
@@ -1222,7 +1191,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_redirect_two_domains_one_vhost(self):
# Skip the enable mod
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
# Creates ssl vhost for the domain
@@ -1237,7 +1206,7 @@ class MultipleVhostsTest(util.ApacheTest):
def test_redirect_from_previous_run(self):
# Skip the enable mod
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
self.config.choose_vhost("red.blue.purple.com")
self.config.enhance("red.blue.purple.com", "redirect")
@@ -1250,22 +1219,22 @@ class MultipleVhostsTest(util.ApacheTest):
self.config.enhance, "green.blue.purple.com", "redirect")
def test_create_own_redirect(self):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 3, 9))
# For full testing... give names...
self.vh_truth[1].name = "default.com"
self.vh_truth[1].aliases = set(["yes.default.com"])
self.vh_truth[1].aliases = {"yes.default.com"}
# pylint: disable=protected-access
self.config._enable_redirect(self.vh_truth[1], "")
self.assertEqual(len(self.config.vhosts), 13)
def test_create_own_redirect_for_old_apache_version(self):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
self.config.get_version = mock.Mock(return_value=(2, 2))
# For full testing... give names...
self.vh_truth[1].name = "default.com"
self.vh_truth[1].aliases = set(["yes.default.com"])
self.vh_truth[1].aliases = {"yes.default.com"}
# pylint: disable=protected-access
self.config._enable_redirect(self.vh_truth[1], "")
@@ -1326,9 +1295,9 @@ class MultipleVhostsTest(util.ApacheTest):
def test_deploy_cert_not_parsed_path(self):
# Make sure that we add include to root config for vhosts when
# handle-sites is false
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("socache_shmcb_module")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["socache_shmcb_module"] = None
tmp_path = filesystem.realpath(tempfile.mkdtemp("vhostroot"))
filesystem.chmod(tmp_path, 0o755)
mock_p = "certbot_apache._internal.configurator.ApacheConfigurator._get_ssl_vhost_path"
@@ -1345,6 +1314,16 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertTrue(mock_add.called)
shutil.rmtree(tmp_path)
def test_deploy_cert_no_mod_ssl(self):
# Create
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
self.config.parser.modules["socache_shmcb_module"] = None
self.config.prepare_server_https = mock.Mock()
self.assertRaises(errors.MisconfigurationError, self.config.deploy_cert,
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
@mock.patch("certbot_apache._internal.parser.ApacheParser.parsed_in_original")
def test_choose_vhost_and_servername_addition_parsed(self, mock_parsed):
ret_vh = self.vh_truth[8]
@@ -1441,8 +1420,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator._choose_vhosts_wildcard")
def test_enhance_wildcard_after_install(self, mock_choose):
# pylint: disable=protected-access
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("headers_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["headers_module"] = None
self.vh_truth[3].ssl = True
self.config._wildcard_vhosts["*.certbot.demo"] = [self.vh_truth[3]]
self.config.enhance("*.certbot.demo", "ensure-http-header",
@@ -1453,8 +1432,8 @@ class MultipleVhostsTest(util.ApacheTest):
def test_enhance_wildcard_no_install(self, mock_choose):
self.vh_truth[3].ssl = True
mock_choose.return_value = [self.vh_truth[3]]
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules.add("headers_module")
self.config.parser.modules["mod_ssl.c"] = None
self.config.parser.modules["headers_module"] = None
self.config.enhance("*.certbot.demo", "ensure-http-header",
"Upgrade-Insecure-Requests")
self.assertTrue(mock_choose.called)
@@ -1606,7 +1585,7 @@ class MultiVhostsTest(util.ApacheTest):
self.assertEqual(ssl_vhost.path,
"/files" + ssl_vhost.filep + "/IfModule/VirtualHost")
self.assertEqual(len(ssl_vhost.addrs), 1)
self.assertEqual(set([obj.Addr.fromstring("*:443")]), ssl_vhost.addrs)
self.assertEqual({obj.Addr.fromstring("*:443")}, ssl_vhost.addrs)
self.assertEqual(ssl_vhost.name, "banana.vomit.com")
self.assertTrue(ssl_vhost.ssl)
self.assertFalse(ssl_vhost.enabled)
@@ -1638,7 +1617,7 @@ class MultiVhostsTest(util.ApacheTest):
@certbot_util.patch_get_utility()
def test_make_vhost_ssl_with_existing_rewrite_rule(self, mock_get_utility):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[4])
@@ -1658,7 +1637,7 @@ class MultiVhostsTest(util.ApacheTest):
@certbot_util.patch_get_utility()
def test_make_vhost_ssl_with_existing_rewrite_conds(self, mock_get_utility):
self.config.parser.modules.add("rewrite_module")
self.config.parser.modules["rewrite_module"] = None
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[3])
@@ -1700,7 +1679,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.config.updated_mod_ssl_conf_digest)
def _current_ssl_options_hash(self):
return crypto_util.sha256sum(self.config.option("MOD_SSL_CONF_SRC"))
return crypto_util.sha256sum(self.config.pick_apache_config())
def _assert_current_file(self):
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
@@ -1736,7 +1715,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
self.assertFalse(mock_logger.warning.called)
self.assertTrue(os.path.isfile(self.config.mod_ssl_conf))
self.assertEqual(crypto_util.sha256sum(
self.config.option("MOD_SSL_CONF_SRC")),
self.config.pick_apache_config()),
self._current_ssl_options_hash())
self.assertNotEqual(crypto_util.sha256sum(self.config.mod_ssl_conf),
self._current_ssl_options_hash())
@@ -1752,19 +1731,118 @@ class InstallSslOptionsConfTest(util.ApacheTest):
"%s has been manually modified; updated file "
"saved to %s. We recommend updating %s for security purposes.")
self.assertEqual(crypto_util.sha256sum(
self.config.option("MOD_SSL_CONF_SRC")),
self.config.pick_apache_config()),
self._current_ssl_options_hash())
# only print warning once
with mock.patch("certbot.plugins.common.logger") as mock_logger:
self._call()
self.assertFalse(mock_logger.warning.called)
def test_current_file_hash_in_all_hashes(self):
def test_ssl_config_files_hash_in_all_hashes(self):
"""
It is really critical that all TLS Apache config files have their SHA256 hash registered in
constants.ALL_SSL_OPTIONS_HASHES. Otherwise Certbot will mistakenly assume that the config
file has been manually edited by the user, and will refuse to update it.
This test ensures that all necessary hashes are present.
"""
from certbot_apache._internal.constants import ALL_SSL_OPTIONS_HASHES
self.assertTrue(self._current_ssl_options_hash() in ALL_SSL_OPTIONS_HASHES,
"Constants.ALL_SSL_OPTIONS_HASHES must be appended"
" with the sha256 hash of self.config.mod_ssl_conf when it is updated.")
import pkg_resources
tls_configs_dir = pkg_resources.resource_filename(
"certbot_apache", os.path.join("_internal", "tls_configs"))
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
if name.endswith('options-ssl-apache.conf')]
self.assertTrue(all_files)
for one_file in all_files:
file_hash = crypto_util.sha256sum(one_file)
self.assertTrue(file_hash in ALL_SSL_OPTIONS_HASHES,
"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 "
"hash of {0} when it is updated.".format(one_file))
def test_openssl_version(self):
self.config._openssl_version = None
some_string_contents = b"""
SSLOpenSSLConfCmd
OpenSSL configuration command
SSLv3 not supported by this version of OpenSSL
'%s': invalid OpenSSL configuration command
OpenSSL 1.0.2g 1 Mar 2016
OpenSSL
AH02407: "SSLOpenSSLConfCmd %s %s" failed for %s
AH02556: "SSLOpenSSLConfCmd %s %s" applied to %s
OpenSSL 1.0.2g 1 Mar 2016
"""
# ssl_module as a DSO
self.config.parser.modules['ssl_module'] = '/fake/path'
with mock.patch("certbot_apache._internal.configurator."
"ApacheConfigurator._open_module_file") as mock_omf:
mock_omf.return_value = some_string_contents
self.assertEqual(self.config.openssl_version(), "1.0.2g")
# ssl_module statically linked
self.config._openssl_version = None
self.config.parser.modules['ssl_module'] = None
self.config.options['bin'] = '/fake/path/to/httpd'
with mock.patch("certbot_apache._internal.configurator."
"ApacheConfigurator._open_module_file") as mock_omf:
mock_omf.return_value = some_string_contents
self.assertEqual(self.config.openssl_version(), "1.0.2g")
def test_current_version(self):
self.config.version = (2, 4, 10)
self.config._openssl_version = '1.0.2m'
self.assertTrue('old' in self.config.pick_apache_config())
self.config.version = (2, 4, 11)
self.config._openssl_version = '1.0.2m'
self.assertTrue('current' in self.config.pick_apache_config())
self.config._openssl_version = '1.0.2a'
self.assertTrue('old' in self.config.pick_apache_config())
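test_current_version pins down the selection behaviour: Apache 2.4.11 with OpenSSL 1.0.2m gets the modern config, while an older Apache or an older OpenSSL falls back to the legacy one. A rough sketch of that selection, with thresholds inferred from the test rather than copied from Certbot:

```python
from distutils.version import LooseVersion

def pick_apache_config(apache_version, openssl_version):
    # Thresholds are assumptions inferred from the test cases above.
    if apache_version >= (2, 4, 11) and LooseVersion(openssl_version) >= LooseVersion("1.0.2l"):
        return "current-options-ssl-apache.conf"
    return "old-options-ssl-apache.conf"

print(pick_apache_config((2, 4, 10), "1.0.2m"))  # old-options-ssl-apache.conf
print(pick_apache_config((2, 4, 11), "1.0.2m"))  # current-options-ssl-apache.conf
print(pick_apache_config((2, 4, 11), "1.0.2a"))  # old-options-ssl-apache.conf
```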
def test_openssl_version_warns(self):
self.config._openssl_version = '1.0.2a'
self.assertEqual(self.config.openssl_version(), '1.0.2a')
self.config._openssl_version = None
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find ssl_module" in mock_log.call_args[0][0])
# When no ssl_module is present at all
self.config._openssl_version = None
self.assertTrue("ssl_module" not in self.config.parser.modules)
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find ssl_module" in mock_log.call_args[0][0])
# When ssl_module is statically linked but --apache-bin not provided
self.config._openssl_version = None
self.config.options['bin'] = None
self.config.parser.modules['ssl_module'] = None
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("ssl_module is statically linked but" in mock_log.call_args[0][0])
self.config.parser.modules['ssl_module'] = "/fake/path"
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
# Check that correct logger.warning was printed
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Unable to read" in mock_log.call_args[0][0])
contents_missing_openssl = b"these contents won't match the regex"
with mock.patch("certbot_apache._internal.configurator."
"ApacheConfigurator._open_module_file") as mock_omf:
mock_omf.return_value = contents_missing_openssl
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
# Check that correct logger.warning was printed
self.assertEqual(self.config.openssl_version(), None)
self.assertTrue("Could not find OpenSSL" in mock_log.call_args[0][0])
def test_open_module_file(self):
mock_open = mock.mock_open(read_data="testing 12 3")
with mock.patch("six.moves.builtins.open", mock_open):
self.assertEqual(self.config._open_module_file("/nonsense/"), "testing 12 3")
if __name__ == "__main__":
unittest.main() # pragma: no cover

View File

@@ -2,7 +2,10 @@
import shutil
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import os
@@ -61,8 +64,8 @@ class MultipleVhostsTestDebian(util.ApacheTest):
def test_deploy_cert_enable_new_vhost(self):
# Create
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
@@ -92,8 +95,8 @@ class MultipleVhostsTestDebian(util.ApacheTest):
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 16))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
@@ -128,8 +131,8 @@ class MultipleVhostsTestDebian(util.ApacheTest):
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 16))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
@@ -143,8 +146,8 @@ class MultipleVhostsTestDebian(util.ApacheTest):
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 7))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["ssl_module"] = None
self.config.parser.modules["mod_ssl.c"] = None
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
@@ -157,7 +160,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_ocsp_stapling_enable_mod(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["mod_ssl.c"] = None
self.config.get_version = mock.Mock(return_value=(2, 4, 7))
mock_exe.return_value = True
# This will create an ssl vhost for certbot.demo
@@ -169,7 +172,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
@mock.patch("certbot.util.exe_exists")
def test_ensure_http_header_enable_mod(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.parser.modules["mod_ssl.c"] = None
mock_exe.return_value = True
# This will create an ssl vhost for certbot.demo

View File

@@ -1,7 +1,10 @@
"""Test certbot_apache._internal.display_ops."""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.display import util as display_util
@@ -93,9 +96,9 @@ class SelectVhostTest(unittest.TestCase):
self.vhosts.append(
obj.VirtualHost(
"path", "aug_path", set([obj.Addr.fromstring("*:80")]),
"path", "aug_path", {obj.Addr.fromstring("*:80")},
False, False,
"wildcard.com", set(["*.wildcard.com"])))
"wildcard.com", {"*.wildcard.com"}))
self.assertEqual(self.vhosts[5], self._call(self.vhosts))

View File
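The remaining edits in this file are a purely cosmetic cleanup that also appears in the object, parser, and utility test hunks below: `set([...])` constructor calls become set literals. A tiny equivalence check, with placeholder address strings standing in for the `obj.Addr` values used in the tests:

```python
# set([...]) and the literal form build the same set; the literal just
# skips the intermediate list ("*:80" etc. are placeholder strings here).
old_style = set(["*:80", "[::]:80"])
new_style = {"*:80", "[::]:80"}
assert old_style == new_style
```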

@@ -1,7 +1,10 @@
"""Tests for DualParserNode implementation"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser

View File

@@ -1,7 +1,10 @@
"""Test for certbot_apache._internal.entrypoint for override class resolution"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot_apache._internal import configurator
from certbot_apache._internal import entrypoint

View File

@@ -1,7 +1,10 @@
"""Test for certbot_apache._internal.configurator for Fedora 29+ overrides"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import filesystem
@@ -120,7 +123,7 @@ class MultipleVhostsTestFedora(util.ApacheTest):
return mod_val
return ""
mock_get.side_effect = mock_get_cfg
self.config.parser.modules = set()
self.config.parser.modules = {}
self.config.parser.variables = {}
with mock.patch("certbot.util.get_os_info") as mock_osi:

View File

@@ -1,7 +1,10 @@
"""Test for certbot_apache._internal.configurator for Gentoo overrides"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import filesystem
@@ -21,19 +24,19 @@ def get_vh_truth(temp_dir, config_name):
obj.VirtualHost(
os.path.join(prefix, "gentoo.example.com.conf"),
os.path.join(aug_pre, "gentoo.example.com.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "gentoo.example.com"),
obj.VirtualHost(
os.path.join(prefix, "00_default_vhost.conf"),
os.path.join(aug_pre, "00_default_vhost.conf/IfDefine/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "localhost"),
obj.VirtualHost(
os.path.join(prefix, "00_default_ssl_vhost.conf"),
os.path.join(aug_pre,
"00_default_ssl_vhost.conf" +
"/IfDefine/IfDefine/IfModule/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]),
{obj.Addr.fromstring("_default_:443")},
True, True, "localhost")
]
return vh_truth
@@ -117,7 +120,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
return mod_val
return None # pragma: no cover
mock_get.side_effect = mock_get_cfg
self.config.parser.modules = set()
self.config.parser.modules = {}
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the Gentoo httpd constants

View File

@@ -1,10 +1,13 @@
"""Test for certbot_apache._internal.http_01."""
import unittest
import errno
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from acme import challenges
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import achallenges
from certbot import errors
from certbot.compat import filesystem
@@ -40,8 +43,8 @@ class ApacheHttp01Test(util.ApacheTest):
modules = ["ssl", "rewrite", "authz_core", "authz_host"]
for mod in modules:
self.config.parser.modules.add("mod_{0}.c".format(mod))
self.config.parser.modules.add(mod + "_module")
self.config.parser.modules["mod_{0}.c".format(mod)] = None
self.config.parser.modules[mod + "_module"] = None
from certbot_apache._internal.http_01 import ApacheHttp01
self.http = ApacheHttp01(self.config)
@@ -52,24 +55,24 @@ class ApacheHttp01Test(util.ApacheTest):
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_2(self, mock_enmod):
self.config.version = (2, 2)
self.config.parser.modules.remove("authz_host_module")
self.config.parser.modules.remove("mod_authz_host.c")
del self.config.parser.modules["authz_host_module"]
del self.config.parser.modules["mod_authz_host.c"]
enmod_calls = self.common_enable_modules_test(mock_enmod)
self.assertEqual(enmod_calls[0][0][0], "authz_host")
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_4(self, mock_enmod):
self.config.parser.modules.remove("authz_core_module")
self.config.parser.modules.remove("mod_authz_core.c")
del self.config.parser.modules["authz_core_module"]
del self.config.parser.modules["mod_authz_host.c"]
enmod_calls = self.common_enable_modules_test(mock_enmod)
self.assertEqual(enmod_calls[0][0][0], "authz_core")
def common_enable_modules_test(self, mock_enmod):
"""Tests enabling mod_rewrite and other modules."""
self.config.parser.modules.remove("rewrite_module")
self.config.parser.modules.remove("mod_rewrite.c")
del self.config.parser.modules["rewrite_module"]
del self.config.parser.modules["mod_rewrite.c"]
self.http.prepare_http01_modules()
@@ -197,6 +200,12 @@ class ApacheHttp01Test(util.ApacheTest):
self.assertTrue(os.path.exists(challenge_dir))
@mock.patch("certbot_apache._internal.http_01.filesystem.makedirs")
def test_failed_makedirs(self, mock_makedirs):
mock_makedirs.side_effect = OSError(errno.EACCES, "msg")
self.http.add_chall(self.achalls[0])
self.assertRaises(errors.PluginError, self.http.perform)
def _test_challenge_conf(self):
with open(self.http.challenge_conf_pre) as f:
pre_conf_contents = f.read()

View File

@@ -14,13 +14,13 @@ class VirtualHostTest(unittest.TestCase):
self.addr_default = Addr.fromstring("_default_:443")
self.vhost1 = VirtualHost(
"filep", "vh_path", set([self.addr1]), False, False, "localhost")
"filep", "vh_path", {self.addr1}, False, False, "localhost")
self.vhost1b = VirtualHost(
"filep", "vh_path", set([self.addr1]), False, False, "localhost")
"filep", "vh_path", {self.addr1}, False, False, "localhost")
self.vhost2 = VirtualHost(
"fp", "vhp", set([self.addr2]), False, False, "localhost")
"fp", "vhp", {self.addr2}, False, False, "localhost")
def test_repr(self):
self.assertEqual(repr(self.addr2),
@@ -42,7 +42,7 @@ class VirtualHostTest(unittest.TestCase):
complex_vh = VirtualHost(
"fp", "vhp",
set([Addr.fromstring("*:443"), Addr.fromstring("1.2.3.4:443")]),
{Addr.fromstring("*:443"), Addr.fromstring("1.2.3.4:443")},
False, False)
self.assertTrue(complex_vh.conflicts([self.addr1]))
self.assertTrue(complex_vh.conflicts([self.addr2]))
@@ -57,14 +57,14 @@ class VirtualHostTest(unittest.TestCase):
def test_same_server(self):
from certbot_apache._internal.obj import VirtualHost
no_name1 = VirtualHost(
"fp", "vhp", set([self.addr1]), False, False, None)
"fp", "vhp", {self.addr1}, False, False, None)
no_name2 = VirtualHost(
"fp", "vhp", set([self.addr2]), False, False, None)
"fp", "vhp", {self.addr2}, False, False, None)
no_name3 = VirtualHost(
"fp", "vhp", set([self.addr_default]),
"fp", "vhp", {self.addr_default},
False, False, None)
no_name4 = VirtualHost(
"fp", "vhp", set([self.addr2, self.addr_default]),
"fp", "vhp", {self.addr2, self.addr_default},
False, False, None)
self.assertTrue(self.vhost1.same_server(self.vhost2))

View File

@@ -2,7 +2,10 @@
import shutil
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
from certbot import errors
from certbot.compat import os
@@ -114,7 +117,7 @@ class BasicParserTest(util.ParserTest):
"""
from certbot_apache._internal.parser import get_aug_path
# This makes sure that find_dir will work
self.parser.modules.add("mod_ssl.c")
self.parser.modules["mod_ssl.c"] = "/fake/path"
self.parser.add_dir_to_ifmodssl(
get_aug_path(self.parser.loc["default"]),
@@ -128,7 +131,7 @@ class BasicParserTest(util.ParserTest):
def test_add_dir_to_ifmodssl_multiple(self):
from certbot_apache._internal.parser import get_aug_path
# This makes sure that find_dir will work
self.parser.modules.add("mod_ssl.c")
self.parser.modules["mod_ssl.c"] = "/fake/path"
self.parser.add_dir_to_ifmodssl(
get_aug_path(self.parser.loc["default"]),
@@ -260,7 +263,7 @@ class BasicParserTest(util.ParserTest):
expected_vars = {"TEST": "", "U_MICH": "", "TLS": "443",
"example_path": "Documents/path"}
self.parser.modules = set()
self.parser.modules = {}
with mock.patch(
"certbot_apache._internal.parser.ApacheParser.parse_file") as mock_parse:
self.parser.update_runtime_variables()
@@ -282,7 +285,7 @@ class BasicParserTest(util.ParserTest):
os.path.dirname(self.parser.loc["root"]))
mock_cfg.return_value = inc_val
self.parser.modules = set()
self.parser.modules = {}
with mock.patch(
"certbot_apache._internal.parser.ApacheParser.parse_file") as mock_parse:

View File

@@ -1,11 +1,21 @@
"""Tests for ApacheConfigurator for AugeasParserNode classes"""
import unittest
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import util
try:
import apacheconfig
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
@unittest.skipIf(not HAS_APACHECONFIG, reason='Tests require apacheconfig dependency')
class ConfiguratorParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-methods
"""Test AugeasParserNode using available test configurations"""

View File
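The hunk above gates a whole test class on an optional dependency: the module sets a `HAS_APACHECONFIG` flag in a try/except import and then skips the class with `unittest.skipIf` when the package is missing. A self-contained sketch of the same gating pattern, with the hypothetical package name `somelib` standing in for `apacheconfig`:

```python
import unittest

try:
    import somelib  # hypothetical optional dependency
    HAS_SOMELIB = True
except ImportError:  # pragma: no cover
    HAS_SOMELIB = False


@unittest.skipIf(not HAS_SOMELIB, reason="Tests require somelib dependency")
class OptionalDependencyTest(unittest.TestCase):
    """Runs only when the optional package could be imported."""

    def test_flag_is_set(self):
        self.assertTrue(HAS_SOMELIB)


if __name__ == "__main__":
    unittest.main()
```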

@@ -26,7 +26,7 @@ Listen 443
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is a internal
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog builtin

View File

@@ -702,7 +702,7 @@ IndexIgnore .??* *~ *# HEADER* README* RCS CVS *,v *,t
# English (en) - Esperanto (eo) - Estonian (et) - French (fr) - German (de)
# Greek-Modern (el) - Hebrew (he) - Italian (it) - Japanese (ja)
# Korean (ko) - Luxembourgeois* (ltz) - Norwegian Nynorsk (nn)
# Norwegian (no) - Polish (pl) - Portugese (pt)
# Norwegian (no) - Polish (pl) - Portuguese (pt)
# Brazilian Portuguese (pt-BR) - Russian (ru) - Swedish (sv)
# Simplified Chinese (zh-CN) - Spanish (es) - Traditional Chinese (zh-TW)
#

View File

@@ -13,7 +13,7 @@ Listen 443 https
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is a internal
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog exec:/usr/libexec/httpd-ssl-pass-dialog

View File

@@ -31,7 +31,7 @@
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is a internal
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog exec:/usr/share/apache2/ask-for-passphrase

View File

@@ -31,7 +31,7 @@
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is a internal
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog exec:/usr/share/apache2/ask-for-passphrase

View File

@@ -31,7 +31,7 @@
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is a internal
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog exec:/usr/share/apache2/ask-for-passphrase

View File

@@ -33,7 +33,7 @@
# English (en) - Esperanto (eo) - Estonian (et) - French (fr) - German (de)
# Greek-Modern (el) - Hebrew (he) - Italian (it) - Japanese (ja)
# Korean (ko) - Luxembourgeois* (ltz) - Norwegian Nynorsk (nn)
# Norwegian (no) - Polish (pl) - Portugese (pt)
# Norwegian (no) - Polish (pl) - Portuguese (pt)
# Brazilian Portuguese (pt-BR) - Russian (ru) - Swedish (sv)
# Simplified Chinese (zh-CN) - Spanish (es) - Traditional Chinese (zh-TW)
AddLanguage ca .ca

View File

@@ -43,7 +43,7 @@ SSLRandomSeed connect builtin
## Pass Phrase Dialog:
# Configure the pass phrase gathering process. The filtering dialog program
# (`builtin' is a internal terminal dialog) has to provide the pass phrase on
# (`builtin' is an internal terminal dialog) has to provide the pass phrase on
# stdout.
SSLPassPhraseDialog builtin

View File

@@ -5,7 +5,10 @@ import unittest
import augeas
import josepy as jose
import mock
try:
import mock
except ImportError: # pragma: no cover
from unittest import mock # type: ignore
import zope.component
from certbot.compat import os
@@ -85,7 +88,8 @@ def get_apache_configurator(
config_dir, work_dir, version=(2, 4, 7),
os_info="generic",
conf_vhost_path=None,
use_parsernode=False):
use_parsernode=False,
openssl_version="1.1.1a"):
"""Create an Apache Configurator with the specified options.
:param conf: Function that returns binary paths. self.conf in Configurator
@@ -118,7 +122,8 @@ def get_apache_configurator(
except KeyError:
config_class = configurator.ApacheConfigurator
config = config_class(config=mock_le_config, name="apache",
version=version, use_parsernode=use_parsernode)
version=version, use_parsernode=use_parsernode,
openssl_version=openssl_version)
if not conf_vhost_path:
config_class.OS_DEFAULTS["vhost_root"] = vhost_path
else:
@@ -140,71 +145,71 @@ def get_vh_truth(temp_dir, config_name):
obj.VirtualHost(
os.path.join(prefix, "encryption-example.conf"),
os.path.join(aug_pre, "encryption-example.conf/Virtualhost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "encryption-example.demo"),
obj.VirtualHost(
os.path.join(prefix, "default-ssl.conf"),
os.path.join(aug_pre,
"default-ssl.conf/IfModule/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]), True, True),
{obj.Addr.fromstring("_default_:443")}, True, True),
obj.VirtualHost(
os.path.join(prefix, "000-default.conf"),
os.path.join(aug_pre, "000-default.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80"),
obj.Addr.fromstring("[::]:80")]),
{obj.Addr.fromstring("*:80"),
obj.Addr.fromstring("[::]:80")},
False, True, "ip-172-30-0-17"),
obj.VirtualHost(
os.path.join(prefix, "certbot.conf"),
os.path.join(aug_pre, "certbot.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), False, True,
{obj.Addr.fromstring("*:80")}, False, True,
"certbot.demo", aliases=["www.certbot.demo"]),
obj.VirtualHost(
os.path.join(prefix, "mod_macro-example.conf"),
os.path.join(aug_pre,
"mod_macro-example.conf/Macro/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), False, True,
{obj.Addr.fromstring("*:80")}, False, True,
modmacro=True),
obj.VirtualHost(
os.path.join(prefix, "default-ssl-port-only.conf"),
os.path.join(aug_pre, ("default-ssl-port-only.conf/"
"IfModule/VirtualHost")),
set([obj.Addr.fromstring("_default_:443")]), True, True),
{obj.Addr.fromstring("_default_:443")}, True, True),
obj.VirtualHost(
os.path.join(prefix, "wildcard.conf"),
os.path.join(aug_pre, "wildcard.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), False, True,
{obj.Addr.fromstring("*:80")}, False, True,
"ip-172-30-0-17", aliases=["*.blue.purple.com"]),
obj.VirtualHost(
os.path.join(prefix, "ocsp-ssl.conf"),
os.path.join(aug_pre, "ocsp-ssl.conf/IfModule/VirtualHost"),
set([obj.Addr.fromstring("10.2.3.4:443")]), True, True,
{obj.Addr.fromstring("10.2.3.4:443")}, True, True,
"ocspvhost.com"),
obj.VirtualHost(
os.path.join(prefix, "non-symlink.conf"),
os.path.join(aug_pre, "non-symlink.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), False, True,
{obj.Addr.fromstring("*:80")}, False, True,
"nonsym.link"),
obj.VirtualHost(
os.path.join(prefix, "default-ssl-port-only.conf"),
os.path.join(aug_pre,
"default-ssl-port-only.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), True, True, ""),
{obj.Addr.fromstring("*:80")}, True, True, ""),
obj.VirtualHost(
os.path.join(temp_dir, config_name,
"apache2/apache2.conf"),
"/files" + os.path.join(temp_dir, config_name,
"apache2/apache2.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]), False, True,
{obj.Addr.fromstring("*:80")}, False, True,
"vhost.in.rootconf"),
obj.VirtualHost(
os.path.join(prefix, "duplicatehttp.conf"),
os.path.join(aug_pre, "duplicatehttp.conf/VirtualHost"),
set([obj.Addr.fromstring("10.2.3.4:80")]), False, True,
{obj.Addr.fromstring("10.2.3.4:80")}, False, True,
"duplicate.example.com"),
obj.VirtualHost(
os.path.join(prefix, "duplicatehttps.conf"),
os.path.join(aug_pre, "duplicatehttps.conf/IfModule/VirtualHost"),
set([obj.Addr.fromstring("10.2.3.4:443")]), True, True,
{obj.Addr.fromstring("10.2.3.4:443")}, True, True,
"duplicate.example.com")]
return vh_truth
if config_name == "debian_apache_2_4/multi_vhosts":
@@ -215,27 +220,27 @@ def get_vh_truth(temp_dir, config_name):
obj.VirtualHost(
os.path.join(prefix, "default.conf"),
os.path.join(aug_pre, "default.conf/VirtualHost[1]"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "ip-172-30-0-17"),
obj.VirtualHost(
os.path.join(prefix, "default.conf"),
os.path.join(aug_pre, "default.conf/VirtualHost[2]"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "banana.vomit.com"),
obj.VirtualHost(
os.path.join(prefix, "multi-vhost.conf"),
os.path.join(aug_pre, "multi-vhost.conf/VirtualHost[1]"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "1.multi.vhost.tld"),
obj.VirtualHost(
os.path.join(prefix, "multi-vhost.conf"),
os.path.join(aug_pre, "multi-vhost.conf/IfModule/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "2.multi.vhost.tld"),
obj.VirtualHost(
os.path.join(prefix, "multi-vhost.conf"),
os.path.join(aug_pre, "multi-vhost.conf/VirtualHost[2]"),
set([obj.Addr.fromstring("*:80")]),
{obj.Addr.fromstring("*:80")},
False, True, "3.multi.vhost.tld")]
return vh_truth
return None # pragma: no cover

View File
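Besides the set-literal cleanup, the test helper above gains an `openssl_version` keyword argument (defaulting to `"1.1.1a"`) that is forwarded into the configurator constructor, so existing call sites keep working while individual tests can override the value. A generic, self-contained sketch of that threading pattern; `FakeConfigurator` and `make_configurator` are stand-ins, not certbot APIs:

```python
class FakeConfigurator:
    """Stand-in for the configurator class receiving the new keyword."""

    def __init__(self, version=(2, 4, 7), use_parsernode=False,
                 openssl_version="1.1.1a"):
        self.version = version
        self.use_parsernode = use_parsernode
        self.openssl_version = openssl_version


def make_configurator(version=(2, 4, 7), use_parsernode=False,
                      openssl_version="1.1.1a"):
    # New keyword with a default: old call sites are unchanged, while tests
    # that care about the OpenSSL version pass it explicitly.
    return FakeConfigurator(version=version, use_parsernode=use_parsernode,
                            openssl_version=openssl_version)


default = make_configurator()
custom = make_configurator(openssl_version="1.0.2k")
print(default.openssl_version, custom.openssl_version)  # 1.1.1a 1.0.2k
```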

@@ -31,7 +31,7 @@ if [ -z "$VENV_PATH" ]; then
fi
VENV_BIN="$VENV_PATH/bin"
BOOTSTRAP_VERSION_PATH="$VENV_PATH/certbot-auto-bootstrap-version.txt"
LE_AUTO_VERSION="1.0.0"
LE_AUTO_VERSION="1.6.0"
BASENAME=$(basename $0)
USAGE="Usage: $BASENAME [OPTIONS]
A self-updating wrapper script for the Certbot ACME client. When run, updates
@@ -256,20 +256,28 @@ DeprecationBootstrap() {
fi
}
MIN_PYTHON_VERSION="2.7"
MIN_PYVER=$(echo "$MIN_PYTHON_VERSION" | sed 's/\.//')
MIN_PYTHON_2_VERSION="2.7"
MIN_PYVER2=$(echo "$MIN_PYTHON_2_VERSION" | sed 's/\.//')
MIN_PYTHON_3_VERSION="3.5"
MIN_PYVER3=$(echo "$MIN_PYTHON_3_VERSION" | sed 's/\.//')
# Sets LE_PYTHON to Python version string and PYVER to the first two
# digits of the python version
# digits of the python version.
# MIN_PYVER and MIN_PYTHON_VERSION are also set by this function, and their
# values depend on if we try to use Python 3 or Python 2.
DeterminePythonVersion() {
# Arguments: "NOCRASH" if we shouldn't crash if we don't find a good python
#
# If no Python is found, PYVER is set to 0.
if [ "$USE_PYTHON_3" = 1 ]; then
MIN_PYVER=$MIN_PYVER3
MIN_PYTHON_VERSION=$MIN_PYTHON_3_VERSION
for LE_PYTHON in "$LE_PYTHON" python3; do
# Break (while keeping the LE_PYTHON value) if found.
$EXISTS "$LE_PYTHON" > /dev/null && break
done
else
MIN_PYVER=$MIN_PYVER2
MIN_PYTHON_VERSION=$MIN_PYTHON_2_VERSION
for LE_PYTHON in "$LE_PYTHON" python2.7 python27 python2 python; do
# Break (while keeping the LE_PYTHON value) if found.
$EXISTS "$LE_PYTHON" > /dev/null && break
@@ -285,7 +293,7 @@ DeterminePythonVersion() {
fi
fi
PYVER=`"$LE_PYTHON" -V 2>&1 | cut -d" " -f 2 | cut -d. -f1,2 | sed 's/\.//'`
PYVER=$("$LE_PYTHON" -V 2>&1 | cut -d" " -f 2 | cut -d. -f1,2 | sed 's/\.//')
if [ "$PYVER" -lt "$MIN_PYVER" ]; then
if [ "$1" != "NOCRASH" ]; then
error "You have an ancient version of Python entombed in your operating system..."
@@ -368,7 +376,9 @@ BootstrapDebCommon() {
# Sets TOOL to the name of the package manager
# Sets appropriate values for YES_FLAG and QUIET_FLAG based on $ASSUME_YES and $QUIET_FLAG.
# Enables EPEL if applicable and possible.
# Note: this function is called both while selecting the bootstrap scripts and
# during the actual bootstrap. Some things like prompting the user can be done in the latter
# case, but not in the former one.
InitializeRPMCommonBase() {
if type dnf 2>/dev/null
then
@@ -388,26 +398,6 @@ InitializeRPMCommonBase() {
if [ "$QUIET" = 1 ]; then
QUIET_FLAG='--quiet'
fi
if ! $TOOL list *virtualenv >/dev/null 2>&1; then
echo "To use Certbot, packages from the EPEL repository need to be installed."
if ! $TOOL list epel-release >/dev/null 2>&1; then
error "Enable the EPEL repository and try running Certbot again."
exit 1
fi
if [ "$ASSUME_YES" = 1 ]; then
/bin/echo -n "Enabling the EPEL repository in 3 seconds..."
sleep 1s
/bin/echo -ne "\e[0K\rEnabling the EPEL repository in 2 seconds..."
sleep 1s
/bin/echo -e "\e[0K\rEnabling the EPEL repository in 1 second..."
sleep 1s
fi
if ! $TOOL install $YES_FLAG $QUIET_FLAG epel-release; then
error "Could not enable EPEL. Aborting bootstrap!"
exit 1
fi
fi
}
BootstrapRpmCommonBase() {
@@ -488,13 +478,91 @@ BootstrapRpmCommon() {
BootstrapRpmCommonBase "$python_pkgs"
}
# If new packages are installed by BootstrapRpmPython3 below, this version
# number must be increased.
BOOTSTRAP_RPM_PYTHON3_LEGACY_VERSION=1
# Checks if rh-python36 can be installed.
Python36SclIsAvailable() {
InitializeRPMCommonBase >/dev/null 2>&1;
if "${TOOL}" list rh-python36 >/dev/null 2>&1; then
return 0
fi
if "${TOOL}" list centos-release-scl >/dev/null 2>&1; then
return 0
fi
return 1
}
# Try to enable rh-python36 from SCL if it is necessary and possible.
EnablePython36SCL() {
if "$EXISTS" python3.6 > /dev/null 2> /dev/null; then
return 0
fi
if [ ! -f /opt/rh/rh-python36/enable ]; then
return 0
fi
set +e
if ! . /opt/rh/rh-python36/enable; then
error 'Unable to enable rh-python36!'
exit 1
fi
set -e
}
# This bootstrap concerns old RedHat-based distributions that do not ship by default
# with Python 2.7, but only Python 2.6. We bootstrap them by enabling SCL and installing
# Python 3.6. Some of these distributions are: CentOS/RHEL/OL/SL 6.
BootstrapRpmPython3Legacy() {
# Tested with:
# - CentOS 6
InitializeRPMCommonBase
if ! "${TOOL}" list rh-python36 >/dev/null 2>&1; then
echo "To use Certbot on this operating system, packages from the SCL repository need to be installed."
if ! "${TOOL}" list centos-release-scl >/dev/null 2>&1; then
error "Enable the SCL repository and try running Certbot again."
exit 1
fi
if [ "${ASSUME_YES}" = 1 ]; then
/bin/echo -n "Enabling the SCL repository in 3 seconds... (Press Ctrl-C to cancel)"
sleep 1s
/bin/echo -ne "\e[0K\rEnabling the SCL repository in 2 seconds... (Press Ctrl-C to cancel)"
sleep 1s
/bin/echo -e "\e[0K\rEnabling the SCL repository in 1 second... (Press Ctrl-C to cancel)"
sleep 1s
fi
if ! "${TOOL}" install "${YES_FLAG}" "${QUIET_FLAG}" centos-release-scl; then
error "Could not enable SCL. Aborting bootstrap!"
exit 1
fi
fi
# CentOS 6 must use rh-python36 from SCL
if "${TOOL}" list rh-python36 >/dev/null 2>&1; then
python_pkgs="rh-python36-python
rh-python36-python-virtualenv
rh-python36-python-devel
"
else
error "No supported Python package available to install. Aborting bootstrap!"
exit 1
fi
BootstrapRpmCommonBase "${python_pkgs}"
# Enable SCL rh-python36 after bootstrapping.
EnablePython36SCL
}
# If new packages are installed by BootstrapRpmPython3 below, this version
# number must be increased.
BOOTSTRAP_RPM_PYTHON3_VERSION=1
BootstrapRpmPython3() {
# Tested with:
# - CentOS 6
# - Fedora 29
InitializeRPMCommonBase
@@ -505,12 +573,6 @@ BootstrapRpmPython3() {
python3-virtualenv
python3-devel
"
# EPEL uses python34
elif $TOOL list python34 >/dev/null 2>&1; then
python_pkgs="python34
python34-devel
python34-tools
"
else
error "No supported Python package available to install. Aborting bootstrap!"
exit 1
@@ -758,6 +820,11 @@ elif [ -f /etc/redhat-release ]; then
RPM_DIST_NAME=`(. /etc/os-release 2> /dev/null && echo $ID) || echo "unknown"`
if [ "$PYVER" -eq 26 -a $(uname -m) != 'x86_64' ]; then
# 32-bit CentOS 6 and affiliates are not supported anymore by certbot-auto.
DEPRECATED_OS=1
fi
# Set RPM_DIST_VERSION to VERSION_ID from /etc/os-release after splitting on
# '.' characters (e.g. "8.0" becomes "8"). If the command exits with an
# error, RPM_DIST_VERSION is set to "unknown".
@@ -769,31 +836,50 @@ elif [ -f /etc/redhat-release ]; then
RPM_DIST_VERSION=0
fi
# Starting with Fedora 29, python2 is on a deprecation path. Let's move to python3 then.
# RHEL 8 also uses python3 by default.
if [ "$RPM_DIST_NAME" = "fedora" -a "$RPM_DIST_VERSION" -ge 29 -o "$PYVER" -eq 26 ]; then
RPM_USE_PYTHON_3=1
elif [ "$RPM_DIST_NAME" = "rhel" -a "$RPM_DIST_VERSION" -ge 8 ]; then
RPM_USE_PYTHON_3=1
elif [ "$RPM_DIST_NAME" = "centos" -a "$RPM_DIST_VERSION" -ge 8 ]; then
RPM_USE_PYTHON_3=1
else
RPM_USE_PYTHON_3=0
fi
# Handle legacy RPM distributions
if [ "$PYVER" -eq 26 ]; then
# Check if an automated bootstrap can be achieved on this system.
if ! Python36SclIsAvailable; then
INTERACTIVE_BOOTSTRAP=1
fi
if [ "$RPM_USE_PYTHON_3" = 1 ]; then
Bootstrap() {
BootstrapMessage "RedHat-based OSes that will use Python3"
BootstrapRpmPython3
BootstrapMessage "Legacy RedHat-based OSes that will use Python3"
BootstrapRpmPython3Legacy
}
USE_PYTHON_3=1
BOOTSTRAP_VERSION="BootstrapRpmPython3 $BOOTSTRAP_RPM_PYTHON3_VERSION"
BOOTSTRAP_VERSION="BootstrapRpmPython3Legacy $BOOTSTRAP_RPM_PYTHON3_LEGACY_VERSION"
# Try now to enable SCL rh-python36 for systems already bootstrapped
# NB: EnablePython36SCL has been defined along with BootstrapRpmPython3Legacy in certbot-auto
EnablePython36SCL
else
Bootstrap() {
BootstrapMessage "RedHat-based OSes"
BootstrapRpmCommon
}
BOOTSTRAP_VERSION="BootstrapRpmCommon $BOOTSTRAP_RPM_COMMON_VERSION"
# Starting with Fedora 29, python2 is on a deprecation path. Let's move to python3 then.
# RHEL 8 also uses python3 by default.
if [ "$RPM_DIST_NAME" = "fedora" -a "$RPM_DIST_VERSION" -ge 29 ]; then
RPM_USE_PYTHON_3=1
elif [ "$RPM_DIST_NAME" = "rhel" -a "$RPM_DIST_VERSION" -ge 8 ]; then
RPM_USE_PYTHON_3=1
elif [ "$RPM_DIST_NAME" = "centos" -a "$RPM_DIST_VERSION" -ge 8 ]; then
RPM_USE_PYTHON_3=1
else
RPM_USE_PYTHON_3=0
fi
if [ "$RPM_USE_PYTHON_3" = 1 ]; then
Bootstrap() {
BootstrapMessage "RedHat-based OSes that will use Python3"
BootstrapRpmPython3
}
USE_PYTHON_3=1
BOOTSTRAP_VERSION="BootstrapRpmPython3 $BOOTSTRAP_RPM_PYTHON3_VERSION"
else
Bootstrap() {
BootstrapMessage "RedHat-based OSes"
BootstrapRpmCommon
}
BOOTSTRAP_VERSION="BootstrapRpmCommon $BOOTSTRAP_RPM_COMMON_VERSION"
fi
fi
LE_PYTHON="$prev_le_python"
@@ -824,20 +910,11 @@ elif [ -f /etc/manjaro-release ]; then
}
BOOTSTRAP_VERSION="BootstrapArchCommon $BOOTSTRAP_ARCH_COMMON_VERSION"
elif [ -f /etc/gentoo-release ]; then
Bootstrap() {
DeprecationBootstrap "Gentoo" BootstrapGentooCommon
}
BOOTSTRAP_VERSION="BootstrapGentooCommon $BOOTSTRAP_GENTOO_COMMON_VERSION"
DEPRECATED_OS=1
elif uname | grep -iq FreeBSD ; then
Bootstrap() {
DeprecationBootstrap "FreeBSD" BootstrapFreeBsd
}
BOOTSTRAP_VERSION="BootstrapFreeBsd $BOOTSTRAP_FREEBSD_VERSION"
DEPRECATED_OS=1
elif uname | grep -iq Darwin ; then
Bootstrap() {
DeprecationBootstrap "macOS" BootstrapMac
}
BOOTSTRAP_VERSION="BootstrapMac $BOOTSTRAP_MAC_VERSION"
DEPRECATED_OS=1
elif [ -f /etc/issue ] && grep -iq "Amazon Linux" /etc/issue ; then
Bootstrap() {
ExperimentalBootstrap "Amazon Linux" BootstrapRpmCommon
@@ -870,6 +947,13 @@ if [ "$NO_BOOTSTRAP" = 1 ]; then
unset BOOTSTRAP_VERSION
fi
if [ "$DEPRECATED_OS" = 1 ]; then
Bootstrap() {
error "Skipping bootstrap because certbot-auto is deprecated on this system."
}
unset BOOTSTRAP_VERSION
fi
# Sets PREV_BOOTSTRAP_VERSION to the identifier for the bootstrap script used
# to install OS dependencies on this system. PREV_BOOTSTRAP_VERSION isn't set
# if it is unknown how OS dependencies were installed on this system.
@@ -1067,6 +1151,28 @@ if [ "$1" = "--le-auto-phase2" ]; then
# Phase 2: Create venv, install LE, and run.
shift 1 # the --le-auto-phase2 arg
if [ "$DEPRECATED_OS" = 1 ]; then
# Phase 2 damage control mode for deprecated OSes.
# In this situation, we bypass any bootstrap or certbot venv setup.
error "Your system is not supported by certbot-auto anymore."
if [ ! -d "$VENV_PATH" ] && OldVenvExists; then
VENV_BIN="$OLD_VENV_PATH/bin"
fi
if [ -f "$VENV_BIN/letsencrypt" -a "$INSTALL_ONLY" != 1 ]; then
error "Certbot will no longer receive updates."
error "Please visit https://certbot.eff.org/ to check for other alternatives."
"$VENV_BIN/letsencrypt" "$@"
exit 0
else
error "Certbot cannot be installed."
error "Please visit https://certbot.eff.org/ to check for other alternatives."
exit 1
fi
fi
SetPrevBootstrapVersion
if [ -z "$PHASE_1_VERSION" -a "$USE_PYTHON_3" = 1 ]; then
@@ -1078,8 +1184,15 @@ if [ "$1" = "--le-auto-phase2" ]; then
# If the selected Bootstrap function isn't a noop and it differs from the
# previously used version
if [ -n "$BOOTSTRAP_VERSION" -a "$BOOTSTRAP_VERSION" != "$PREV_BOOTSTRAP_VERSION" ]; then
# if non-interactive mode or stdin and stdout are connected to a terminal
if [ \( "$NONINTERACTIVE" = 1 \) -o \( \( -t 0 \) -a \( -t 1 \) \) ]; then
# Check if we can rebootstrap without manual user intervention: this requires that
# certbot-auto is in non-interactive mode AND selected bootstrap does not claim to
# require a manual user intervention.
if [ "$NONINTERACTIVE" = 1 -a "$INTERACTIVE_BOOTSTRAP" != 1 ]; then
CAN_REBOOTSTRAP=1
fi
# Check if rebootstrap can be done non-interactively and current shell is non-interactive
# (true if stdin and stdout are not attached to a terminal).
if [ \( "$CAN_REBOOTSTRAP" = 1 \) -o \( \( -t 0 \) -a \( -t 1 \) \) ]; then
if [ -d "$VENV_PATH" ]; then
rm -rf "$VENV_PATH"
fi
@@ -1090,12 +1203,21 @@ if [ "$1" = "--le-auto-phase2" ]; then
ln -s "$VENV_PATH" "$OLD_VENV_PATH"
fi
RerunWithArgs "$@"
# Otherwise bootstrap needs to be done manually by the user.
else
error "Skipping upgrade because new OS dependencies may need to be installed."
error
error "To upgrade to a newer version, please run this script again manually so you can"
error "approve changes or with --non-interactive on the command line to automatically"
error "install any required packages."
# If it is because bootstrapping is interactive, --non-interactive will be of no use.
if [ "$INTERACTIVE_BOOTSTRAP" = 1 ]; then
error "Skipping upgrade because new OS dependencies may need to be installed."
error "This requires manual user intervention: please run this script again manually."
# If this is because of the environment (e.g. a non-interactive shell without
# --non-interactive flag set), help the user in that direction.
else
error "Skipping upgrade because new OS dependencies may need to be installed."
error
error "To upgrade to a newer version, please run this script again manually so you can"
error "approve changes or with --non-interactive on the command line to automatically"
error "install any required packages."
fi
# Set INSTALLED_VERSION to be the same so we don't update the venv
INSTALLED_VERSION="$LE_AUTO_VERSION"
# Continue to use OLD_VENV_PATH if the new venv doesn't exist
@@ -1143,45 +1265,40 @@ if [ "$1" = "--le-auto-phase2" ]; then
# pip install hashin
# hashin -r dependency-requirements.txt cryptography==1.5.2
# ```
ConfigArgParse==0.14.0 \
--hash=sha256:2e2efe2be3f90577aca9415e32cb629aa2ecd92078adbe27b53a03e53ff12e91
certifi==2019.9.11 \
--hash=sha256:e4f3620cfea4f83eedc95b24abd9cd56f3c4b146dd0177e83a21b4eb49e21e50 \
--hash=sha256:fd7c7c74727ddcf00e9acd26bba8da604ffec95bf1c2144e67aff7a8b50e6cef
cffi==1.13.2 \
--hash=sha256:0b49274afc941c626b605fb59b59c3485c17dc776dc3cc7cc14aca74cc19cc42 \
--hash=sha256:0e3ea92942cb1168e38c05c1d56b0527ce31f1a370f6117f1d490b8dcd6b3a04 \
--hash=sha256:135f69aecbf4517d5b3d6429207b2dff49c876be724ac0c8bf8e1ea99df3d7e5 \
--hash=sha256:19db0cdd6e516f13329cba4903368bff9bb5a9331d3410b1b448daaadc495e54 \
--hash=sha256:2781e9ad0e9d47173c0093321bb5435a9dfae0ed6a762aabafa13108f5f7b2ba \
--hash=sha256:291f7c42e21d72144bb1c1b2e825ec60f46d0a7468f5346841860454c7aa8f57 \
--hash=sha256:2c5e309ec482556397cb21ede0350c5e82f0eb2621de04b2633588d118da4396 \
--hash=sha256:2e9c80a8c3344a92cb04661115898a9129c074f7ab82011ef4b612f645939f12 \
--hash=sha256:32a262e2b90ffcfdd97c7a5e24a6012a43c61f1f5a57789ad80af1d26c6acd97 \
--hash=sha256:3c9fff570f13480b201e9ab69453108f6d98244a7f495e91b6c654a47486ba43 \
--hash=sha256:415bdc7ca8c1c634a6d7163d43fb0ea885a07e9618a64bda407e04b04333b7db \
--hash=sha256:42194f54c11abc8583417a7cf4eaff544ce0de8187abaf5d29029c91b1725ad3 \
--hash=sha256:4424e42199e86b21fc4db83bd76909a6fc2a2aefb352cb5414833c030f6ed71b \
--hash=sha256:4a43c91840bda5f55249413037b7a9b79c90b1184ed504883b72c4df70778579 \
--hash=sha256:599a1e8ff057ac530c9ad1778293c665cb81a791421f46922d80a86473c13346 \
--hash=sha256:5c4fae4e9cdd18c82ba3a134be256e98dc0596af1e7285a3d2602c97dcfa5159 \
--hash=sha256:5ecfa867dea6fabe2a58f03ac9186ea64da1386af2159196da51c4904e11d652 \
--hash=sha256:62f2578358d3a92e4ab2d830cd1c2049c9c0d0e6d3c58322993cc341bdeac22e \
--hash=sha256:6471a82d5abea994e38d2c2abc77164b4f7fbaaf80261cb98394d5793f11b12a \
--hash=sha256:6d4f18483d040e18546108eb13b1dfa1000a089bcf8529e30346116ea6240506 \
--hash=sha256:71a608532ab3bd26223c8d841dde43f3516aa5d2bf37b50ac410bb5e99053e8f \
--hash=sha256:74a1d8c85fb6ff0b30fbfa8ad0ac23cd601a138f7509dc617ebc65ef305bb98d \
--hash=sha256:7b93a885bb13073afb0aa73ad82059a4c41f4b7d8eb8368980448b52d4c7dc2c \
--hash=sha256:7d4751da932caaec419d514eaa4215eaf14b612cff66398dd51129ac22680b20 \
--hash=sha256:7f627141a26b551bdebbc4855c1157feeef18241b4b8366ed22a5c7d672ef858 \
--hash=sha256:8169cf44dd8f9071b2b9248c35fc35e8677451c52f795daa2bb4643f32a540bc \
--hash=sha256:aa00d66c0fab27373ae44ae26a66a9e43ff2a678bf63a9c7c1a9a4d61172827a \
--hash=sha256:ccb032fda0873254380aa2bfad2582aedc2959186cce61e3a17abc1a55ff89c3 \
--hash=sha256:d754f39e0d1603b5b24a7f8484b22d2904fa551fe865fd0d4c3332f078d20d4e \
--hash=sha256:d75c461e20e29afc0aee7172a0950157c704ff0dd51613506bd7d82b718e7410 \
--hash=sha256:dcd65317dd15bc0451f3e01c80da2216a31916bdcffd6221ca1202d96584aa25 \
--hash=sha256:e570d3ab32e2c2861c4ebe6ffcad6a8abf9347432a37608fe1fbd157b3f0036b \
--hash=sha256:fd43a88e045cf992ed09fa724b5315b790525f2676883a6ea64e3263bae6549d
ConfigArgParse==1.2.3 \
--hash=sha256:edd17be986d5c1ba2e307150b8e5f5107aba125f3574dddd02c85d5cdcfd37dc
certifi==2020.4.5.1 \
--hash=sha256:1d987a998c75633c40847cc966fcf5904906c920a7f17ef374f5aa4282abd304 \
--hash=sha256:51fcb31174be6e6664c5f69e3e1691a2d72a1a12e90f872cbdb1567eb47b6519
cffi==1.14.0 \
--hash=sha256:001bf3242a1bb04d985d63e138230802c6c8d4db3668fb545fb5005ddf5bb5ff \
--hash=sha256:00789914be39dffba161cfc5be31b55775de5ba2235fe49aa28c148236c4e06b \
--hash=sha256:028a579fc9aed3af38f4892bdcc7390508adabc30c6af4a6e4f611b0c680e6ac \
--hash=sha256:14491a910663bf9f13ddf2bc8f60562d6bc5315c1f09c704937ef17293fb85b0 \
--hash=sha256:1cae98a7054b5c9391eb3249b86e0e99ab1e02bb0cc0575da191aedadbdf4384 \
--hash=sha256:2089ed025da3919d2e75a4d963d008330c96751127dd6f73c8dc0c65041b4c26 \
--hash=sha256:2d384f4a127a15ba701207f7639d94106693b6cd64173d6c8988e2c25f3ac2b6 \
--hash=sha256:337d448e5a725bba2d8293c48d9353fc68d0e9e4088d62a9571def317797522b \
--hash=sha256:399aed636c7d3749bbed55bc907c3288cb43c65c4389964ad5ff849b6370603e \
--hash=sha256:3b911c2dbd4f423b4c4fcca138cadde747abdb20d196c4a48708b8a2d32b16dd \
--hash=sha256:3d311bcc4a41408cf5854f06ef2c5cab88f9fded37a3b95936c9879c1640d4c2 \
--hash=sha256:62ae9af2d069ea2698bf536dcfe1e4eed9090211dbaafeeedf5cb6c41b352f66 \
--hash=sha256:66e41db66b47d0d8672d8ed2708ba91b2f2524ece3dee48b5dfb36be8c2f21dc \
--hash=sha256:675686925a9fb403edba0114db74e741d8181683dcf216be697d208857e04ca8 \
--hash=sha256:7e63cbcf2429a8dbfe48dcc2322d5f2220b77b2e17b7ba023d6166d84655da55 \
--hash=sha256:8a6c688fefb4e1cd56feb6c511984a6c4f7ec7d2a1ff31a10254f3c817054ae4 \
--hash=sha256:8c0ffc886aea5df6a1762d0019e9cb05f825d0eec1f520c51be9d198701daee5 \
--hash=sha256:95cd16d3dee553f882540c1ffe331d085c9e629499ceadfbda4d4fde635f4b7d \
--hash=sha256:99f748a7e71ff382613b4e1acc0ac83bf7ad167fb3802e35e90d9763daba4d78 \
--hash=sha256:b8c78301cefcf5fd914aad35d3c04c2b21ce8629b5e4f4e45ae6812e461910fa \
--hash=sha256:c420917b188a5582a56d8b93bdd8e0f6eca08c84ff623a4c16e809152cd35793 \
--hash=sha256:c43866529f2f06fe0edc6246eb4faa34f03fe88b64a0a9a942561c8e22f4b71f \
--hash=sha256:cab50b8c2250b46fe738c77dbd25ce017d5e6fb35d3407606e7a4180656a5a6a \
--hash=sha256:cef128cb4d5e0b3493f058f10ce32365972c554572ff821e175dbc6f8ff6924f \
--hash=sha256:cf16e3cf6c0a5fdd9bc10c21687e19d29ad1fe863372b5543deaec1039581a30 \
--hash=sha256:e56c744aa6ff427a607763346e4170629caf7e48ead6921745986db3692f987f \
--hash=sha256:e577934fc5f8779c554639376beeaa5657d54349096ef24abe8c74c5d9c117c3 \
--hash=sha256:f2b0fa0c01d8a0c7483afd9f31d7ecf2d71760ca24499c8697aeb5ca37dc090c
chardet==3.0.4 \
--hash=sha256:84ab92ed1c4d4f16916e05906b6b75a6c0fb5db821cc65e70cbd64a3e2a5eaae \
--hash=sha256:fc323ffcaeaed0e0a02bf4d117757b98aed530d9ed4531e3e15460124c106691
@@ -1209,67 +1326,66 @@ cryptography==2.8 \
--hash=sha256:df6b4dca2e11865e6cfbfb708e800efb18370f5a46fd601d3755bc7f85b3a8a2 \
--hash=sha256:ecadccc7ba52193963c0475ac9f6fa28ac01e01349a2ca48509667ef41ffd2cf \
--hash=sha256:fb81c17e0ebe3358486cd8cc3ad78adbae58af12fc2bf2bc0bb84e8090fa5ce8
distro==1.4.0 \
--hash=sha256:362dde65d846d23baee4b5c058c8586f219b5a54be1cf5fc6ff55c4578392f57 \
--hash=sha256:eedf82a470ebe7d010f1872c17237c79ab04097948800029994fa458e52fb4b4
enum34==1.1.6 \
--hash=sha256:2d81cbbe0e73112bdfe6ef8576f2238f2ba27dd0d55752a776c41d38b7da2850 \
--hash=sha256:644837f692e5f550741432dd3f223bbb9852018674981b1664e5dc339387588a \
--hash=sha256:6bd0f6ad48ec2aa117d3d141940d484deccda84d4fcd884f5c3d93c23ecd8c79 \
--hash=sha256:8ad8c4783bf61ded74527bffb48ed9b54166685e4230386a9ed9b1279e2df5b1
distro==1.5.0 \
--hash=sha256:0e58756ae38fbd8fc3020d54badb8eae17c5b9dcbed388b17bb55b8a5928df92 \
--hash=sha256:df74eed763e18d10d0da624258524ae80486432cd17392d9c3d96f5e83cd2799
enum34==1.1.10; python_version < '3.4' \
--hash=sha256:a98a201d6de3f2ab3db284e70a33b0f896fbf35f8086594e8c9e74b909058d53 \
--hash=sha256:c3858660960c984d6ab0ebad691265180da2b43f07e061c0f8dca9ef3cffd328 \
--hash=sha256:cce6a7477ed816bd2542d03d53db9f0db935dd013b70f336a95c73979289f248
funcsigs==1.0.2 \
--hash=sha256:330cc27ccbf7f1e992e69fef78261dc7c6569012cf397db8d3de0234e6c937ca \
--hash=sha256:a7bb0f2cf3a3fd1ab2732cb49eba4252c2af4240442415b4abce3b87022a8f50
future==0.18.2 \
--hash=sha256:b1bead90b70cf6ec3f0710ae53a525360fa360d306a86583adc6bf83a4db537d
idna==2.8 \
--hash=sha256:c357b3f628cf53ae2c4c05627ecc484553142ca23264e593d327bcde5e9c3407 \
--hash=sha256:ea8b7f6188e6fa117537c3df7da9fc686d485087abf6ac197f9c46432f7e4a3c
idna==2.9 \
--hash=sha256:7588d1c14ae4c77d74036e8c22ff447b26d0fde8f007354fd48a7814db15b7cb \
--hash=sha256:a068a21ceac8a4d63dbfd964670474107f541babbd2250d61922f029858365fa
ipaddress==1.0.23 \
--hash=sha256:6e0f4a39e66cb5bb9a137b00276a2eff74f93b71dcbdad6f10ff7df9d3557fcc \
--hash=sha256:b7f8e0369580bb4a24d5ba1d7cc29660a4a6987763faf1d8a8046830e020e7e2
josepy==1.2.0 \
--hash=sha256:8ea15573203f28653c00f4ac0142520777b1c59d9eddd8da3f256c6ba3cac916 \
--hash=sha256:9cec9a839fe9520f0420e4f38e7219525daccce4813296627436fe444cd002d3
josepy==1.3.0 \
--hash=sha256:c341ffa403399b18e9eae9012f804843045764d1390f9cb4648980a7569b1619 \
--hash=sha256:e54882c64be12a2a76533f73d33cba9e331950fda9e2731e843490b774e7a01c
mock==1.3.0 \
--hash=sha256:1e247dbecc6ce057299eb7ee019ad68314bb93152e81d9a6110d35f4d5eca0f6 \
--hash=sha256:3f573a18be94de886d1191f27c168427ef693e8dcfcecf95b170577b2eb69cbb
parsedatetime==2.4 \
--hash=sha256:3d817c58fb9570d1eec1dd46fa9448cd644eeed4fb612684b02dfda3a79cb84b \
--hash=sha256:9ee3529454bf35c40a77115f5a596771e59e1aee8c53306f346c461b8e913094
pbr==5.4.3 \
--hash=sha256:2c8e420cd4ed4cec4e7999ee47409e876af575d4c35a45840d59e8b5f3155ab8 \
--hash=sha256:b32c8ccaac7b1a20c0ce00ce317642e6cf231cf038f9875e0280e28af5bf7ac9
pyOpenSSL==19.0.0 \
--hash=sha256:aeca66338f6de19d1aa46ed634c3b9ae519a64b458f8468aec688e7e3c20f200 \
--hash=sha256:c727930ad54b10fc157015014b666f2d8b41f70c0d03e83ab67624fd3dd5d1e6
parsedatetime==2.5 \
--hash=sha256:3b835fc54e472c17ef447be37458b400e3fefdf14bb1ffdedb5d2c853acf4ba1 \
--hash=sha256:d2e9ddb1e463de871d32088a3f3cea3dc8282b1b2800e081bd0ef86900451667
pbr==5.4.5 \
--hash=sha256:07f558fece33b05caf857474a366dfcc00562bca13dd8b47b2b3e22d9f9bf55c \
--hash=sha256:579170e23f8e0c2f24b0de612f71f648eccb79fb1322c814ae6b3c07b5ba23e8
pyOpenSSL==19.1.0 \
--hash=sha256:621880965a720b8ece2f1b2f54ea2071966ab00e2970ad2ce11d596102063504 \
--hash=sha256:9a24494b2602aaf402be5c9e30a0b82d4a5c67528fe8fb475e3f3bc00dd69507
pyRFC3339==1.1 \
--hash=sha256:67196cb83b470709c580bb4738b83165e67c6cc60e1f2e4f286cfcb402a926f4 \
--hash=sha256:81b8cbe1519cdb79bed04910dd6fa4e181faf8c88dff1e1b987b5f7ab23a5b1a
pycparser==2.19 \
--hash=sha256:a988718abfad80b6b157acce7bf130a30876d27603738ac39f140993246b25b3
pyparsing==2.4.5 \
--hash=sha256:20f995ecd72f2a1f4bf6b072b63b22e2eb457836601e76d6e5dfcd75436acc1f \
--hash=sha256:4ca62001be367f01bd3e92ecbb79070272a9d4964dce6a48a82ff0b8bc7e683a
pycparser==2.20 \
--hash=sha256:2d475327684562c3a96cc71adf7dc8c4f0565175cf86b6d7a404ff4c771f15f0 \
--hash=sha256:7582ad22678f0fcd81102833f60ef8d0e57288b6b5fb00323d101be910e35705
pyparsing==2.4.7 \
--hash=sha256:c203ec8783bf771a155b207279b9bccb8dea02d8f0c9e5f8ead507bc3246ecc1 \
--hash=sha256:ef9d7589ef3c200abe66653d3f1ab1033c3c419ae9b9bdb1240a85b024efc88b
python-augeas==0.5.0 \
--hash=sha256:67d59d66cdba8d624e0389b87b2a83a176f21f16a87553b50f5703b23f29bac2
pytz==2019.3 \
--hash=sha256:1c557d7d0e871de1f5ccd5833f60fb2550652da6be2693c1e02300743d21500d \
--hash=sha256:b02c06db6cf09c12dd25137e563b31700d3b80fcc4ad23abb7a315f2789819be
requests==2.21.0 \
--hash=sha256:502a824f31acdacb3a35b6690b5fbf0bc41d63a24a45c4004352b0242707598e \
--hash=sha256:7bf2a778576d825600030a110f3c0e3e8edc51dfaafe1c146e39a2027784957b
pytz==2020.1 \
--hash=sha256:a494d53b6d39c3c6e44c3bec237336e14305e4f29bbf800b599253057fbb79ed \
--hash=sha256:c35965d010ce31b23eeb663ed3cc8c906275d6be1a34393a1d73a41febf4a048
requests==2.23.0 \
--hash=sha256:43999036bfa82904b6af1d99e4882b560e5e2c68e5c4b0aa03b655f3d7d73fee \
--hash=sha256:b3f43d496c6daba4493e7c431722aeb7dbc6288f52a6e04e7b6023b0247817e6
requests-toolbelt==0.9.1 \
--hash=sha256:380606e1d10dc85c3bd47bf5a6095f815ec007be7a8b69c878507068df059e6f \
--hash=sha256:968089d4584ad4ad7c171454f0a5c6dac23971e9472521ea3b6d49d610aa6fc0
six==1.13.0 \
--hash=sha256:1f1b7d42e254082a9db6279deae68afb421ceba6158efa6131de7b3003ee93fd \
--hash=sha256:30f610279e8b2578cab6db20741130331735c781b56053c59c4076da27f06b66
urllib3==1.24.3 \
--hash=sha256:2393a695cd12afedd0dcb26fe5d50d0cf248e5a66f75dbd89a3d4eb333a61af4 \
--hash=sha256:a637e5fae88995b256e3409dc4d52c2e2e0ba32c42a6365fee8bbd2238de3cfb
zope.component==4.6 \
--hash=sha256:ec2afc5bbe611dcace98bb39822c122d44743d635dafc7315b9aef25097db9e6
six==1.15.0 \
--hash=sha256:30639c035cdb23534cd4aa2dd52c3bf48f06e5f4a941509c8bafd8ce11080259 \
--hash=sha256:8b74bedcbbbaca38ff6d7491d76f2b06b3592611af620f8426e82dddb04a5ced
urllib3==1.25.9 \
--hash=sha256:3018294ebefce6572a474f0604c2021e33b3fd8006ecd11d62107a5d2a963527 \
--hash=sha256:88206b0eb87e6d677d424843ac5209e3fb9d0190d0ee169599165ec25e9d9115
zope.component==4.6.1 \
--hash=sha256:bfbe55d4a93e70a78b10edc3aad4de31bb8860919b7cbd8d66f717f7d7b279ac \
--hash=sha256:d9c7c27673d787faff8a83797ce34d6ebcae26a370e25bddb465ac2182766aca
zope.deferredimport==4.3.1 \
--hash=sha256:57b2345e7b5eef47efcd4f634ff16c93e4265de3dcf325afc7315ade48d909e1 \
--hash=sha256:9a0c211df44aa95f1c4e6d2626f90b400f56989180d3ef96032d708da3d23e0a
@@ -1279,87 +1395,129 @@ zope.deprecation==4.4.0 \
zope.event==4.4 \
--hash=sha256:69c27debad9bdacd9ce9b735dad382142281ac770c4a432b533d6d65c4614bcf \
--hash=sha256:d8e97d165fd5a0997b45f5303ae11ea3338becfe68c401dd88ffd2113fe5cae7
zope.hookable==4.2.0 \
--hash=sha256:22886e421234e7e8cedc21202e1d0ab59960e40a47dd7240e9659a2d82c51370 \
--hash=sha256:39912f446e45b4e1f1951b5ffa2d5c8b074d25727ec51855ae9eab5408f105ab \
--hash=sha256:3adb7ea0871dbc56b78f62c4f5c024851fc74299f4f2a95f913025b076cde220 \
--hash=sha256:3d7c4b96341c02553d8b8d71065a9366ef67e6c6feca714f269894646bb8268b \
--hash=sha256:4e826a11a529ed0464ffcecf34b0b7bd1b4928dd5848c5c61bedd7833e8f4801 \
--hash=sha256:700d68cc30728de1c4c62088a981c6daeaefdf20a0d81995d2c0b7f442c5f88c \
--hash=sha256:77c82a430cedfbf508d1aa406b2f437363c24fa90c73f577ead0fb5295749b83 \
--hash=sha256:c1df3929a3666fc5a0c80d60a0c1e6f6ef97c7f6ed2f1b7cf49f3e6f3d4dde15 \
--hash=sha256:dba8b2dd2cd41cb5f37bfa3f3d82721b8ae10e492944e48ddd90a439227f2893 \
--hash=sha256:f492540305b15b5591bd7195d61f28946bb071de071cee5d68b6b8414da90fd2
zope.interface==4.6.0 \
--hash=sha256:086707e0f413ff8800d9c4bc26e174f7ee4c9c8b0302fbad68d083071822316c \
--hash=sha256:1157b1ec2a1f5bf45668421e3955c60c610e31913cc695b407a574efdbae1f7b \
--hash=sha256:11ebddf765bff3bbe8dbce10c86884d87f90ed66ee410a7e6c392086e2c63d02 \
--hash=sha256:14b242d53f6f35c2d07aa2c0e13ccb710392bcd203e1b82a1828d216f6f6b11f \
--hash=sha256:1b3d0dcabc7c90b470e59e38a9acaa361be43b3a6ea644c0063951964717f0e5 \
--hash=sha256:20a12ab46a7e72b89ce0671e7d7a6c3c1ca2c2766ac98112f78c5bddaa6e4375 \
--hash=sha256:298f82c0ab1b182bd1f34f347ea97dde0fffb9ecf850ecf7f8904b8442a07487 \
--hash=sha256:2f6175722da6f23dbfc76c26c241b67b020e1e83ec7fe93c9e5d3dd18667ada2 \
--hash=sha256:3b877de633a0f6d81b600624ff9137312d8b1d0f517064dfc39999352ab659f0 \
--hash=sha256:4265681e77f5ac5bac0905812b828c9fe1ce80c6f3e3f8574acfb5643aeabc5b \
--hash=sha256:550695c4e7313555549aa1cdb978dc9413d61307531f123558e438871a883d63 \
--hash=sha256:5f4d42baed3a14c290a078e2696c5f565501abde1b2f3f1a1c0a94fbf6fbcc39 \
--hash=sha256:62dd71dbed8cc6a18379700701d959307823b3b2451bdc018594c48956ace745 \
--hash=sha256:7040547e5b882349c0a2cc9b50674b1745db551f330746af434aad4f09fba2cc \
--hash=sha256:7e099fde2cce8b29434684f82977db4e24f0efa8b0508179fce1602d103296a2 \
--hash=sha256:7e5c9a5012b2b33e87980cee7d1c82412b2ebabcb5862d53413ba1a2cfde23aa \
--hash=sha256:81295629128f929e73be4ccfdd943a0906e5fe3cdb0d43ff1e5144d16fbb52b1 \
--hash=sha256:95cc574b0b83b85be9917d37cd2fad0ce5a0d21b024e1a5804d044aabea636fc \
--hash=sha256:968d5c5702da15c5bf8e4a6e4b67a4d92164e334e9c0b6acf080106678230b98 \
--hash=sha256:9e998ba87df77a85c7bed53240a7257afe51a07ee6bc3445a0bf841886da0b97 \
--hash=sha256:a0c39e2535a7e9c195af956610dba5a1073071d2d85e9d2e5d789463f63e52ab \
--hash=sha256:a15e75d284178afe529a536b0e8b28b7e107ef39626a7809b4ee64ff3abc9127 \
--hash=sha256:a6a6ff82f5f9b9702478035d8f6fb6903885653bff7ec3a1e011edc9b1a7168d \
--hash=sha256:b639f72b95389620c1f881d94739c614d385406ab1d6926a9ffe1c8abbea23fe \
--hash=sha256:bad44274b151d46619a7567010f7cde23a908c6faa84b97598fd2f474a0c6891 \
--hash=sha256:bbcef00d09a30948756c5968863316c949d9cedbc7aabac5e8f0ffbdb632e5f1 \
--hash=sha256:d788a3999014ddf416f2dc454efa4a5dbeda657c6aba031cf363741273804c6b \
--hash=sha256:eed88ae03e1ef3a75a0e96a55a99d7937ed03e53d0cffc2451c208db445a2966 \
--hash=sha256:f99451f3a579e73b5dd58b1b08d1179791d49084371d9a47baad3b22417f0317
zope.proxy==4.3.3 \
--hash=sha256:04646ac04ffa9c8e32fb2b5c3cd42995b2548ea14251f3c21ca704afae88e42c \
--hash=sha256:07b6bceea232559d24358832f1cd2ed344bbf05ca83855a5b9698b5f23c5ed60 \
--hash=sha256:1ef452cc02e0e2f8e3c917b1a5b936ef3280f2c2ca854ee70ac2164d1655f7e6 \
--hash=sha256:22bf61857c5977f34d4e391476d40f9a3b8c6ab24fb0cac448d42d8f8b9bf7b2 \
--hash=sha256:299870e3428cbff1cd9f9b34144e76ecdc1d9e3192a8cf5f1b0258f47a239f58 \
--hash=sha256:2bfc36bfccbe047671170ea5677efd3d5ab730a55d7e45611d76d495e5b96766 \
--hash=sha256:32e82d5a640febc688c0789e15ea875bf696a10cf358f049e1ed841f01710a9b \
--hash=sha256:3b2051bdc4bc3f02fa52483f6381cf40d4d48167645241993f9d7ebbd142ed9b \
--hash=sha256:3f734bd8a08f5185a64fb6abb8f14dc97ec27a689ca808fb7a83cdd38d745e4f \
--hash=sha256:3f78dd8de3112df8bbd970f0916ac876dc3fbe63810bd1cf7cc5eec4cbac4f04 \
--hash=sha256:4eabeb48508953ba1f3590ad0773b8daea9e104eec66d661917e9bbcd7125a67 \
--hash=sha256:4f05ecc33808187f430f249cb1ccab35c38f570b181f2d380fbe253da94b18d8 \
--hash=sha256:4f4f4cbf23d3afc1526294a31e7b3eaa0f682cc28ac5366065dc1d6bb18bd7be \
--hash=sha256:5483d5e70aacd06f0aa3effec9fed597c0b50f45060956eeeb1203c44d4338c3 \
--hash=sha256:56a5f9b46892b115a75d0a1f2292431ad5988461175826600acc69a24cb3edee \
--hash=sha256:64bb63af8a06f736927d260efdd4dfc5253d42244f281a8063e4b9eea2ddcbc5 \
--hash=sha256:653f8cbefcf7c6ac4cece2cdef367c4faa2b7c19795d52bd7cbec11a8739a7c1 \
--hash=sha256:664211d63306e4bd4eec35bf2b4bd9db61c394037911cf2d1804c43b511a49f1 \
--hash=sha256:6651e6caed66a8fff0fef1a3e81c0ed2253bf361c0fdc834500488732c5d16e9 \
--hash=sha256:6c1fba6cdfdf105739d3069cf7b07664f2944d82a8098218ab2300a82d8f40fc \
--hash=sha256:6e64246e6e9044a4534a69dca1283c6ddab6e757be5e6874f69024329b3aa61f \
--hash=sha256:838390245c7ec137af4993c0c8052f49d5ec79e422b4451bfa37fee9b9ccaa01 \
--hash=sha256:856b410a14793069d8ba35f33fff667213ea66f2df25a0024cc72a7493c56d4c \
--hash=sha256:8b932c364c1d1605a91907a41128ed0ee8a2d326fc0fafb2c55cd46f545f4599 \
--hash=sha256:9086cf6d20f08dae7f296a78f6c77d1f8d24079d448f023ee0eb329078dd35e1 \
--hash=sha256:9698533c14afa0548188de4968a7932d1f3f965f3f5ba1474de673596bb875af \
--hash=sha256:9b12b05dd7c28f5068387c1afee8cb94f9d02501e7ef495a7c5c7e27139b96ad \
--hash=sha256:a884c7426a5bc6fb7fc71a55ad14e66818e13f05b78b20a6f37175f324b7acb8 \
--hash=sha256:abe9e7f1a3e76286c5f5baf2bf5162d41dc0310da493b34a2c36555f38d928f7 \
--hash=sha256:bd6fde63b015a27262be06bd6bbdd895273cc2bdf2d4c7e1c83711d26a8fbace \
--hash=sha256:bda7c62c954f47b87ed9a89f525eee1b318ec7c2162dfdba76c2ccfa334e0caa \
--hash=sha256:be8a4908dd3f6e965993c0068b006bdbd0474fbcbd1da4893b49356e73fc1557 \
--hash=sha256:ced65fc3c7d7205267506d854bb1815bb445899cca9d21d1d4b949070a635546 \
--hash=sha256:dac4279aa05055d3897ab5e5ee5a7b39db121f91df65a530f8b1ac7f9bd93119 \
--hash=sha256:e4f1863056e3e4f399c285b67fa816f411a7bfa1c81ef50e186126164e396e59 \
--hash=sha256:ecd85f68b8cd9ab78a0141e87ea9a53b2f31fd9b1350a1c44da1f7481b5363ef \
--hash=sha256:ed269b83750413e8fc5c96276372f49ee3fcb7ed61c49fe8e5a67f54459a5a4a \
--hash=sha256:f19b0b80cba73b204dee68501870b11067711d21d243fb6774256d3ca2e5391f \
--hash=sha256:ffdafb98db7574f9da84c489a10a5d582079a888cb43c64e9e6b0e3fe1034685
zope.hookable==5.0.1 \
--hash=sha256:0194b9b9e7f614abba60c90b231908861036578297515d3d6508eb10190f266d \
--hash=sha256:0c2977473918bdefc6fa8dfb311f154e7f13c6133957fe649704deca79b92093 \
--hash=sha256:17b8bdb3b77e03a152ca0d5ca185a7ae0156f5e5a2dbddf538676633a1f7380f \
--hash=sha256:29d07681a78042cdd15b268ae9decffed9ace68a53eebeb61d65ae931d158841 \
--hash=sha256:36fb1b35d1150267cb0543a1ddd950c0bc2c75ed0e6e92e3aaa6ac2e29416cb7 \
--hash=sha256:3aed60c2bb5e812bbf9295c70f25b17ac37c233f30447a96c67913ba5073642f \
--hash=sha256:3cac1565cc768911e72ca9ec4ddf5c5109e1fef0104f19f06649cf1874943b60 \
--hash=sha256:3d4bc0cc4a37c3cd3081063142eeb2125511db3c13f6dc932d899c512690378e \
--hash=sha256:3f73096f27b8c28be53ffb6604f7b570fbbb82f273c6febe5f58119009b59898 \
--hash=sha256:522d1153d93f2d48aa0bd9fb778d8d4500be2e4dcf86c3150768f0e3adbbc4ef \
--hash=sha256:523d2928fb7377bbdbc9af9c0b14ad73e6eaf226349f105733bdae27efd15b5a \
--hash=sha256:5848309d4fc5c02150a45e8f8d2227e5bfda386a508bbd3160fed7c633c5a2fa \
--hash=sha256:6781f86e6d54a110980a76e761eb54590630fd2af2a17d7edf02a079d2646c1d \
--hash=sha256:6fd27921ebf3aaa945fa25d790f1f2046204f24dba4946f82f5f0a442577c3e9 \
--hash=sha256:70d581862863f6bf9e175e85c9d70c2d7155f53fb04dcdb2f73cf288ca559a53 \
--hash=sha256:81867c23b0dc66c8366f351d00923f2bc5902820a24c2534dfd7bf01a5879963 \
--hash=sha256:81db29edadcbb740cd2716c95a297893a546ed89db1bfe9110168732d7f0afdd \
--hash=sha256:86bd12624068cea60860a0759af5e2c3adc89c12aef6f71cf12f577e28deefe3 \
--hash=sha256:9c184d8f9f7a76e1ced99855ccf390ffdd0ec3765e5cbf7b9cada600accc0a1e \
--hash=sha256:acc789e8c29c13555e43fe4bf9fcd15a65512c9645e97bbaa5602e3201252b02 \
--hash=sha256:afaa740206b7660d4cc3b8f120426c85761f51379af7a5b05451f624ad12b0af \
--hash=sha256:b5f5fa323f878bb16eae68ea1ba7f6c0419d4695d0248bed4b18f51d7ce5ab85 \
--hash=sha256:bd89e0e2c67bf4ac3aca2a19702b1a37269fb1923827f68324ac2e7afd6e3406 \
--hash=sha256:c212de743283ec0735db24ec6ad913758df3af1b7217550ff270038062afd6ae \
--hash=sha256:ca553f524293a0bdea05e7f44c3e685e4b7b022cb37d87bc4a3efa0f86587a8d \
--hash=sha256:cab67065a3db92f636128d3157cc5424a145f82d96fb47159c539132833a6d36 \
--hash=sha256:d3b3b3eedfdbf6b02898216e85aa6baf50207f4378a2a6803d6d47650cd37031 \
--hash=sha256:d9f4a5a72f40256b686d31c5c0b1fde503172307beb12c1568296e76118e402c \
--hash=sha256:df5067d87aaa111ed5d050e1ee853ba284969497f91806efd42425f5348f1c06 \
--hash=sha256:e2587644812c6138f05b8a41594a8337c6790e3baf9a01915e52438c13fc6bef \
--hash=sha256:e27fd877662db94f897f3fd532ef211ca4901eb1a70ba456f15c0866a985464a \
--hash=sha256:e427ebbdd223c72e06ba94c004bb04e996c84dec8a0fa84e837556ae145c439e \
--hash=sha256:e583ad4309c203ef75a09d43434cf9c2b4fa247997ecb0dcad769982c39411c7 \
--hash=sha256:e760b2bc8ece9200804f0c2b64d10147ecaf18455a2a90827fbec4c9d84f3ad5 \
--hash=sha256:ea9a9cc8bcc70e18023f30fa2f53d11ae069572a162791224e60cd65df55fb69 \
--hash=sha256:ecb3f17dce4803c1099bd21742cd126b59817a4e76a6544d31d2cca6e30dbffd \
--hash=sha256:ed794e3b3de42486d30444fb60b5561e724ee8a2d1b17b0c2e0f81e3ddaf7a87 \
--hash=sha256:ee885d347279e38226d0a437b6a932f207f691c502ee565aba27a7022f1285df \
--hash=sha256:fd5e7bc5f24f7e3d490698f7b854659a9851da2187414617cd5ed360af7efd63 \
--hash=sha256:fe45f6870f7588ac7b2763ff1ce98cce59369717afe70cc353ec5218bc854bcc
zope.interface==5.1.0 \
--hash=sha256:0103cba5ed09f27d2e3de7e48bb320338592e2fabc5ce1432cf33808eb2dfd8b \
--hash=sha256:14415d6979356629f1c386c8c4249b4d0082f2ea7f75871ebad2e29584bd16c5 \
--hash=sha256:1ae4693ccee94c6e0c88a4568fb3b34af8871c60f5ba30cf9f94977ed0e53ddd \
--hash=sha256:1b87ed2dc05cb835138f6a6e3595593fea3564d712cb2eb2de963a41fd35758c \
--hash=sha256:269b27f60bcf45438e8683269f8ecd1235fa13e5411de93dae3b9ee4fe7f7bc7 \
--hash=sha256:27d287e61639d692563d9dab76bafe071fbeb26818dd6a32a0022f3f7ca884b5 \
--hash=sha256:39106649c3082972106f930766ae23d1464a73b7d30b3698c986f74bf1256a34 \
--hash=sha256:40e4c42bd27ed3c11b2c983fecfb03356fae1209de10686d03c02c8696a1d90e \
--hash=sha256:461d4339b3b8f3335d7e2c90ce335eb275488c587b61aca4b305196dde2ff086 \
--hash=sha256:4f98f70328bc788c86a6a1a8a14b0ea979f81ae6015dd6c72978f1feff70ecda \
--hash=sha256:558a20a0845d1a5dc6ff87cd0f63d7dac982d7c3be05d2ffb6322a87c17fa286 \
--hash=sha256:562dccd37acec149458c1791da459f130c6cf8902c94c93b8d47c6337b9fb826 \
--hash=sha256:5e86c66a6dea8ab6152e83b0facc856dc4d435fe0f872f01d66ce0a2131b7f1d \
--hash=sha256:60a207efcd8c11d6bbeb7862e33418fba4e4ad79846d88d160d7231fcb42a5ee \
--hash=sha256:645a7092b77fdbc3f68d3cc98f9d3e71510e419f54019d6e282328c0dd140dcd \
--hash=sha256:6874367586c020705a44eecdad5d6b587c64b892e34305bb6ed87c9bbe22a5e9 \
--hash=sha256:74bf0a4f9091131de09286f9a605db449840e313753949fe07c8d0fe7659ad1e \
--hash=sha256:7b726194f938791a6691c7592c8b9e805fc6d1b9632a833b9c0640828cd49cbc \
--hash=sha256:8149ded7f90154fdc1a40e0c8975df58041a6f693b8f7edcd9348484e9dc17fe \
--hash=sha256:8cccf7057c7d19064a9e27660f5aec4e5c4001ffcf653a47531bde19b5aa2a8a \
--hash=sha256:911714b08b63d155f9c948da2b5534b223a1a4fc50bb67139ab68b277c938578 \
--hash=sha256:a5f8f85986197d1dd6444763c4a15c991bfed86d835a1f6f7d476f7198d5f56a \
--hash=sha256:a744132d0abaa854d1aad50ba9bc64e79c6f835b3e92521db4235a1991176813 \
--hash=sha256:af2c14efc0bb0e91af63d00080ccc067866fb8cbbaca2b0438ab4105f5e0f08d \
--hash=sha256:b054eb0a8aa712c8e9030065a59b5e6a5cf0746ecdb5f087cca5ec7685690c19 \
--hash=sha256:b0becb75418f8a130e9d465e718316cd17c7a8acce6fe8fe07adc72762bee425 \
--hash=sha256:b1d2ed1cbda2ae107283befd9284e650d840f8f7568cb9060b5466d25dc48975 \
--hash=sha256:ba4261c8ad00b49d48bbb3b5af388bb7576edfc0ca50a49c11dcb77caa1d897e \
--hash=sha256:d1fe9d7d09bb07228650903d6a9dc48ea649e3b8c69b1d263419cc722b3938e8 \
--hash=sha256:d7804f6a71fc2dda888ef2de266727ec2f3915373d5a785ed4ddc603bbc91e08 \
--hash=sha256:da2844fba024dd58eaa712561da47dcd1e7ad544a257482392472eae1c86d5e5 \
--hash=sha256:dcefc97d1daf8d55199420e9162ab584ed0893a109f45e438b9794ced44c9fd0 \
--hash=sha256:dd98c436a1fc56f48c70882cc243df89ad036210d871c7427dc164b31500dc11 \
--hash=sha256:e74671e43ed4569fbd7989e5eecc7d06dc134b571872ab1d5a88f4a123814e9f \
--hash=sha256:eb9b92f456ff3ec746cd4935b73c1117538d6124b8617bc0fe6fda0b3816e345 \
--hash=sha256:ebb4e637a1fb861c34e48a00d03cffa9234f42bef923aec44e5625ffb9a8e8f9 \
--hash=sha256:ef739fe89e7f43fb6494a43b1878a36273e5924869ba1d866f752c5812ae8d58 \
--hash=sha256:f40db0e02a8157d2b90857c24d89b6310f9b6c3642369852cdc3b5ac49b92afc \
--hash=sha256:f68bf937f113b88c866d090fea0bc52a098695173fc613b055a17ff0cf9683b6 \
--hash=sha256:fb55c182a3f7b84c1a2d6de5fa7b1a05d4660d866b91dbf8d74549c57a1499e8
zope.proxy==4.3.5 \
--hash=sha256:00573dfa755d0703ab84bb23cb6ecf97bb683c34b340d4df76651f97b0bab068 \
--hash=sha256:092049280f2848d2ba1b57b71fe04881762a220a97b65288bcb0968bb199ec30 \
--hash=sha256:0cbd27b4d3718b5ec74fc65ffa53c78d34c65c6fd9411b8352d2a4f855220cf1 \
--hash=sha256:17fc7e16d0c81f833a138818a30f366696653d521febc8e892858041c4d88785 \
--hash=sha256:19577dfeb70e8a67249ba92c8ad20589a1a2d86a8d693647fa8385408a4c17b0 \
--hash=sha256:207aa914576b1181597a1516e1b90599dc690c095343ae281b0772e44945e6a4 \
--hash=sha256:219a7db5ed53e523eb4a4769f13105118b6d5b04ed169a283c9775af221e231f \
--hash=sha256:2b50ea79849e46b5f4f2b0247a3687505d32d161eeb16a75f6f7e6cd81936e43 \
--hash=sha256:5903d38362b6c716e66bbe470f190579c530a5baf03dbc8500e5c2357aa569a5 \
--hash=sha256:5c24903675e271bd688c6e9e7df5775ac6b168feb87dbe0e4bcc90805f21b28f \
--hash=sha256:5ef6bc5ed98139e084f4e91100f2b098a0cd3493d4e76f9d6b3f7b95d7ad0f06 \
--hash=sha256:61b55ae3c23a126a788b33ffb18f37d6668e79a05e756588d9e4d4be7246ab1c \
--hash=sha256:63ddb992931a5e616c87d3d89f5a58db086e617548005c7f9059fac68c03a5cc \
--hash=sha256:6943da9c09870490dcfd50c4909c0cc19f434fa6948f61282dc9cb07bcf08160 \
--hash=sha256:6ad40f85c1207803d581d5d75e9ea25327cd524925699a83dfc03bf8e4ba72b7 \
--hash=sha256:6b44433a79bdd7af0e3337bd7bbcf53dd1f9b0fa66bf21bcb756060ce32a96c1 \
--hash=sha256:6bbaa245015d933a4172395baad7874373f162955d73612f0b66b6c2c33b6366 \
--hash=sha256:7007227f4ea85b40a2f5e5a244479f6a6dfcf906db9b55e812a814a8f0e2c28d \
--hash=sha256:74884a0aec1f1609190ec8b34b5d58fb3b5353cf22b96161e13e0e835f13518f \
--hash=sha256:7d25fe5571ddb16369054f54cdd883f23de9941476d97f2b92eb6d7d83afe22d \
--hash=sha256:7e162bdc5e3baad26b2262240be7d2bab36991d85a6a556e48b9dfb402370261 \
--hash=sha256:814d62678dc3a30f4aa081982d830b7c342cf230ffc9d030b020cb154eeebf9e \
--hash=sha256:8878a34c5313ee52e20aa50b03138af8d472bae465710fb954d133a9bfd3c38d \
--hash=sha256:a66a0d94e5b081d5d695e66d6667e91e74d79e273eee95c1747717ba9cb70792 \
--hash=sha256:a69f5cbf4addcfdf03dda564a671040127a6b7c34cf9fe4973582e68441b63fa \
--hash=sha256:b00f9f0c334d07709d3f73a7cb8ae63c6ca1a90c790a63b5e7effa666ef96021 \
--hash=sha256:b6ed71e4a7b4690447b626f499d978aa13197a0e592950e5d7020308f6054698 \
--hash=sha256:bdf5041e5851526e885af579d2f455348dba68d74f14a32781933569a327fddf \
--hash=sha256:be034360dd34e62608419f86e799c97d389c10a0e677a25f236a971b2f40dac9 \
--hash=sha256:cc8f590a5eed30b314ae6b0232d925519ade433f663de79cc3783e4b10d662ba \
--hash=sha256:cd7a318a15fe6cc4584bf3c4426f092ed08c0fd012cf2a9173114234fe193e11 \
--hash=sha256:cf19b5f63a59c20306e034e691402b02055c8f4e38bf6792c23cad489162a642 \
--hash=sha256:cfc781ce442ec407c841e9aa51d0e1024f72b6ec34caa8fdb6ef9576d549acf2 \
--hash=sha256:dea9f6f8633571e18bc20cad83603072e697103a567f4b0738d52dd0211b4527 \
--hash=sha256:e4a86a1d5eb2cce83c5972b3930c7c1eac81ab3508464345e2b8e54f119d5505 \
--hash=sha256:e7106374d4a74ed9ff00c46cc00f0a9f06a0775f8868e423f85d4464d2333679 \
--hash=sha256:e98a8a585b5668aa9e34d10f7785abf9545fe72663b4bfc16c99a115185ae6a5 \
--hash=sha256:f64840e68483316eb58d82c376ad3585ca995e69e33b230436de0cdddf7363f9 \
--hash=sha256:f8f4b0a9e6683e43889852130595c8854d8ae237f2324a053cdd884de936aa9b \
--hash=sha256:fc45a53219ed30a7f670a6d8c98527af0020e6fd4ee4c0a8fb59f147f06d816c
# Contains the requirements for the letsencrypt package.
#
@@ -1372,18 +1530,18 @@ letsencrypt==0.7.0 \
--hash=sha256:105a5fb107e45bcd0722eb89696986dcf5f08a86a321d6aef25a0c7c63375ade \
--hash=sha256:c36e532c486a7e92155ee09da54b436a3c420813ec1c590b98f635d924720de9
certbot==1.0.0 \
--hash=sha256:8d074cff89dee002dec1c47cb0da04ea8e0ede8d68838b6d54aa41580d9262df \
--hash=sha256:86b82d31db19fffffb0d6b218951e2121ef514e3ff659aa042deaf92a33e302a
acme==1.0.0 \
--hash=sha256:f6972e436e76f7f1e395e81e149f8713ca8462d465b14993bddc53fb18a40644 \
--hash=sha256:6a08f12f848ce563b50bca421ba9db653df9f82cfefeaf8aba517f046d1386c2
certbot-apache==1.0.0 \
--hash=sha256:e591d0cf773ad33ee978f7adb1b69288eac2c8847c643b06e70260e707626f8e \
--hash=sha256:7335ab5687a0a47d9041d9e13f3a2d67d0e8372da97ab639edb31c14b787cd68
certbot-nginx==1.0.0 \
--hash=sha256:ce8a2e51165da7c15bfdc059cd6572d0f368c078f1e1a77633a2773310b2f231 \
--hash=sha256:63b4ae09d4f1c9ef0a1a2a49c3f651d8a7cb30303ec6f954239e987c5da45dc4
certbot==1.6.0 \
--hash=sha256:7237ac851ef7f3ff2d5ddb49e692e4bd5346273734cbc531531e4ad56d14d460 \
--hash=sha256:d373ee0f24ab06f561efa2b00f68cff43521b003d87fbf4d9e869e7cc7395481
acme==1.6.0 \
--hash=sha256:dc532fee475dde07a843232f69f54b185ba23af6cce9d2e1a1dc132ce2e34f64 \
--hash=sha256:fe76e06ae1e9b12304f9e9691ff901da6d2fd588fea2765f891b8cd15d6b3f2b
certbot-apache==1.6.0 \
--hash=sha256:d6080664fe24fc5dc1e519382ebe5a5215f3b886ceaa335336a1db2c1b1ed95e \
--hash=sha256:e0232a1f1c5513701de06bccb88b57b7d76d9db28c6559fba8539f88293c85ea
certbot-nginx==1.6.0 \
--hash=sha256:6ef97185d9c07ea97656e7b439e7ccfa8e5090f6802e9162e8f5a79080bc5a76 \
--hash=sha256:facc59e066d7e5623fbc068fe2fcc5e1f802c2441d148e37ff96ad90b893600a
UNLIKELY_EOF
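The pinned versions and sha256 digests above are consumed by pip in hash-checking mode, which pip enables automatically as soon as any requirement carries a `--hash` option (it can also be forced with `--require-hashes`); a downloaded archive that does not match one of the listed digests aborts the installation. A minimal sketch of the mechanism, assuming a hash-pinned file named `requirements.txt` in the same `pkg==version --hash=sha256:...` format shown above:

```
# Minimal sketch, not the exact letsencrypt-auto invocation.
# requirements.txt is a placeholder path; every entry in it carries
# one or more sha256 digests like the pins above.
pip install --require-hashes -r requirements.txt
```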
# -------------------------------------------------------------------------
@@ -1617,6 +1775,9 @@ UNLIKELY_EOF
say "Installation succeeded."
fi
# If you're modifying any of the code after this point in this current `if` block, you
# may need to update the "$DEPRECATED_OS" = 1 case at the beginning of phase 2 as well.
if [ "$INSTALL_ONLY" = 1 ]; then
say "Certbot is installed."
exit 0
@@ -1828,30 +1989,35 @@ UNLIKELY_EOF
error "WARNING: unable to check for updates."
fi
LE_VERSION_STATE=`CompareVersions "$LE_PYTHON" "$LE_AUTO_VERSION" "$REMOTE_VERSION"`
if [ "$LE_VERSION_STATE" = "UNOFFICIAL" ]; then
say "Unofficial certbot-auto version detected, self-upgrade is disabled: $LE_AUTO_VERSION"
elif [ "$LE_VERSION_STATE" = "OUTDATED" ]; then
say "Upgrading certbot-auto $LE_AUTO_VERSION to $REMOTE_VERSION..."
# If for any reason REMOTE_VERSION is not set, let's assume certbot-auto is up-to-date,
# and do not go into the self-upgrading process.
if [ -n "$REMOTE_VERSION" ]; then
LE_VERSION_STATE=`CompareVersions "$LE_PYTHON" "$LE_AUTO_VERSION" "$REMOTE_VERSION"`
# Now we drop into Python so we don't have to install even more
# dependencies (curl, etc.), for better flow control, and for the option of
# future Windows compatibility.
"$LE_PYTHON" "$TEMP_DIR/fetch.py" --le-auto-script "v$REMOTE_VERSION"
if [ "$LE_VERSION_STATE" = "UNOFFICIAL" ]; then
say "Unofficial certbot-auto version detected, self-upgrade is disabled: $LE_AUTO_VERSION"
elif [ "$LE_VERSION_STATE" = "OUTDATED" ]; then
say "Upgrading certbot-auto $LE_AUTO_VERSION to $REMOTE_VERSION..."
# Install new copy of certbot-auto.
# TODO: Deal with quotes in pathnames.
say "Replacing certbot-auto..."
# Clone permissions with cp. chmod and chown don't have a --reference
# option on macOS or BSD, and stat -c on Linux is stat -f on macOS and BSD:
cp -p "$0" "$TEMP_DIR/letsencrypt-auto.permission-clone"
cp "$TEMP_DIR/letsencrypt-auto" "$TEMP_DIR/letsencrypt-auto.permission-clone"
# Using mv rather than cp leaves the old file descriptor pointing to the
# original copy so the shell can continue to read it unmolested. mv across
# filesystems is non-atomic, doing `rm dest, cp src dest, rm src`, but the
# cp is unlikely to fail if the rm doesn't.
mv -f "$TEMP_DIR/letsencrypt-auto.permission-clone" "$0"
fi # A newer version is available.
# Now we drop into Python so we don't have to install even more
# dependencies (curl, etc.), for better flow control, and for the option of
# future Windows compatibility.
"$LE_PYTHON" "$TEMP_DIR/fetch.py" --le-auto-script "v$REMOTE_VERSION"
# Install new copy of certbot-auto.
# TODO: Deal with quotes in pathnames.
say "Replacing certbot-auto..."
# Clone permissions with cp. chmod and chown don't have a --reference
# option on macOS or BSD, and stat -c on Linux is stat -f on macOS and BSD:
cp -p "$0" "$TEMP_DIR/letsencrypt-auto.permission-clone"
cp "$TEMP_DIR/letsencrypt-auto" "$TEMP_DIR/letsencrypt-auto.permission-clone"
# Using mv rather than cp leaves the old file descriptor pointing to the
# original copy so the shell can continue to read it unmolested. mv across
# filesystems is non-atomic, doing `rm dest, cp src dest, rm src`, but the
# cp is unlikely to fail if the rm doesn't.
mv -f "$TEMP_DIR/letsencrypt-auto.permission-clone" "$0"
fi # A newer version is available.
fi
fi # Self-upgrading is allowed.
RerunWithArgs --le-auto-phase2 "$@"
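The hunk above changes the control flow of self-upgrade rather than its steps: the version comparison, the `fetch.py` download, and the permission-cloning `cp`/`mv` dance are now all nested inside a check that `REMOTE_VERSION` is non-empty, so a failed update check (which only prints the "unable to check for updates" warning) no longer drives the upgrade path with an empty version string. A distilled sketch of the new structure, with the unchanged upgrade steps collapsed into a placeholder comment:

```
# Control-flow sketch only; the upgrade body is elided.
if [ -n "$REMOTE_VERSION" ]; then
  LE_VERSION_STATE=`CompareVersions "$LE_PYTHON" "$LE_AUTO_VERSION" "$REMOTE_VERSION"`
  if [ "$LE_VERSION_STATE" = "OUTDATED" ]; then
    # ...fetch the new script, clone permissions with cp -p, mv it into place...
    :
  fi
fi  # REMOTE_VERSION empty: assume certbot-auto is up to date and skip self-upgrade.
```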

View File

@@ -1,4 +1,5 @@
#!/usr/bin/env python
from __future__ import print_function
import os
import sys
@@ -7,5 +8,4 @@ if hook_script_type == 'deploy' and ('RENEWED_DOMAINS' not in os.environ or 'REN
sys.stderr.write('Environment variables not properly set!\n')
sys.exit(1)
with open(sys.argv[2], 'a') as file_h:
file_h.write(hook_script_type + '\n')
print(hook_script_type)
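The hunk above is the Python hook script used by the integration tests: when invoked as a deploy hook it requires the `RENEWED_DOMAINS` and `RENEWED_LINEAGE` environment variables, appends the hook type to a probe file, and now also prints it, which is why `from __future__ import print_function` is added for Python 2 compatibility. For orientation, a shell equivalent of such a deploy hook (purely illustrative; the probe path is a placeholder and this file is not part of the diff) could look like:

```
#!/bin/sh
# Hypothetical deploy hook for illustration. Certbot exports these
# variables when it runs deploy hooks after a successful renewal:
#   RENEWED_DOMAINS  - space-separated domains on the renewed certificate
#   RENEWED_LINEAGE  - path to the certificate's live/ directory
if [ -z "$RENEWED_DOMAINS" ] || [ -z "$RENEWED_LINEAGE" ]; then
    echo 'Environment variables not properly set!' >&2
    exit 1
fi
echo deploy >> /tmp/hook-probe.txt
```

It would be registered with something like `certbot renew --deploy-hook /path/to/hook.sh`.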

View File

@@ -1,4 +1,5 @@
"""This module contains advanced assertions for the certbot integration tests."""
import io
import os
try:
@@ -21,7 +22,8 @@ def assert_hook_execution(probe_path, probe_content):
:param probe_path: path to the file that received the hook output
:param probe_content: content expected when the hook is executed
"""
with open(probe_path, 'r') as file:
encoding = 'utf-8' if POSIX_MODE else 'utf-16'
with io.open(probe_path, 'rt', encoding=encoding) as file:
data = file.read()
lines = [line.strip() for line in data.splitlines()]

View File

@@ -1,5 +1,4 @@
"""Module to handle the context of integration tests."""
import logging
import os
import shutil
import sys

View File

@@ -9,6 +9,8 @@ import shutil
import subprocess
import time
from cryptography.x509 import NameOID
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
@@ -595,6 +597,23 @@ def test_ocsp_status_live(context):
assert output.count('REVOKED') == 1, 'Expected {0} to be REVOKED'.format(cert)
def test_ocsp_renew(context):
"""Test that revoked certificates are renewed."""
# Obtain a certificate
certname = context.get_domain('ocsp-renew')
context.certbot(['--domains', certname])
# Test that "certbot renew" does not renew the certificate
assert_cert_count_for_lineage(context.config_dir, certname, 1)
context.certbot(['renew'], force_renew=False)
assert_cert_count_for_lineage(context.config_dir, certname, 1)
# Revoke the certificate and test that it does renew the certificate
context.certbot(['revoke', '--cert-name', certname, '--no-delete-after-revoke'])
context.certbot(['renew'], force_renew=False)
assert_cert_count_for_lineage(context.config_dir, certname, 2)
def test_dry_run_deactivate_authzs(context):
"""Test that Certbot deactivates authorizations when performing a dry run"""
@@ -611,3 +630,31 @@ def test_dry_run_deactivate_authzs(context):
context.certbot(args)
with open(join(context.workspace, 'logs', 'letsencrypt.log'), 'r') as f:
assert log_line in f.read(), 'Second order should have been recreated due to authz reuse'
def test_preferred_chain(context):
"""Test that --preferred-chain results in the correct chain.pem being produced"""
try:
issuers = misc.get_acme_issuers(context)
except NotImplementedError:
pytest.skip('This ACME server does not support alternative issuers.')
names = [i.issuer.get_attributes_for_oid(NameOID.COMMON_NAME)[0].value \
for i in issuers]
domain = context.get_domain('preferred-chain')
cert_path = join(context.config_dir, 'live', domain, 'chain.pem')
conf_path = join(context.config_dir, 'renewal', '{}.conf'.format(domain))
for (requested, expected) in [(n, n) for n in names] + [('nonexistent', names[0])]:
args = ['certonly', '--cert-name', domain, '-d', domain,
'--preferred-chain', requested, '--force-renewal']
context.certbot(args)
dumped = misc.read_certificate(cert_path)
assert 'Issuer: CN={}'.format(expected) in dumped, \
'Expected chain issuer to be {} when preferring {}'.format(expected, requested)
with open(conf_path, 'r') as f:
assert 'preferred_chain = {}'.format(requested) in f.read(), \
'Expected preferred_chain to be set in renewal config'
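The two tests added above map onto command-line behavior. `test_ocsp_renew` checks that `certbot renew`, which normally skips certificates that are not close to expiry, does re-issue a certificate whose OCSP status is REVOKED. `test_preferred_chain` exercises `--preferred-chain`, new in Certbot 1.6.0, which selects the issuer chain written to `chain.pem` (matching on the issuer common name) and records the choice as `preferred_chain` in the renewal configuration. A rough CLI sketch of both flows; `example.com` and the issuer name are purely illustrative:

```
# Revoke without deleting the lineage, then let "renew" replace the
# revoked certificate (mirrors test_ocsp_renew):
certbot revoke --cert-name example.com --no-delete-after-revoke
certbot renew

# Request a specific issuer chain; if no offered chain matches, the
# server's default chain is kept (mirrors test_preferred_chain).
# "ISRG Root X1" is only an example issuer common name:
certbot certonly --cert-name example.com -d example.com \
    --preferred-chain "ISRG Root X1" --force-renewal
```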

Some files were not shown because too many files have changed in this diff.