Compare commits

...

351 Commits

Author SHA1 Message Date
Joona Hoikkala
50fdb44f6a Py2 and Py3 compatibility for metaclass interfaces 2018-05-02 16:03:17 +03:00
Joona Hoikkala
552bfa5eb7 New interfaces for installers to run tasks on renew verb (#5879)
* ServerTLSUpdater and InstallerSpecificUpdater implementation

* Fixed tests and added disables for linter :/

* Added error logging for MisconfigurationError from plugin check

* Remove redundant parameter from interfaces

* Renaming the interfaces

* Finalize interface renaming and move tests to own file

* Refactored the runners

* Refactor the cli params

* Fix the interface args

* Fixed documentation

* Documentation and naming fixes

* Remove ServerTLSConfigurationUpdater

* Remove unnecessary linter disable

* Rename run_renewal_updaters to run_generic_updaters

* Do not raise exception, but make log message more informative and visible for the user

* Run renewal deployer before installer restart
2018-05-02 10:52:54 +03:00
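As a rough illustration of the kind of interface described in the two commits above (the class and method names below are guesses based on the commit messages, not the shipped API), an installer-side updater hook might look roughly like this, with six.add_metaclass providing the Py2/Py3 metaclass compatibility mentioned in the newer commit:

```
import abc

import six


# six.add_metaclass works on both Py2 and Py3, unlike the Py3-only
# "class Foo(metaclass=abc.ABCMeta)" syntax.
@six.add_metaclass(abc.ABCMeta)
class GenericUpdater(object):
    """Sketch of an updater interface run on every renew invocation."""

    @abc.abstractmethod
    def generic_updates(self, lineage):
        """Perform installer-specific update tasks for the lineage."""


@six.add_metaclass(abc.ABCMeta)
class RenewDeployer(object):
    """Sketch of a deployer interface run only when a cert is actually renewed."""

    @abc.abstractmethod
    def renew_deploy(self, lineage):
        """Deploy the newly renewed certificate to the installer."""
```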
Jacob Hoffman-Andrews
bf30226c69 Restore parallel waiting to Route53 plugin (#5712)
* Bring back parallel updates to route53.

* Re-add try

* Fix TTL.

* Remove unnecessary wait.

* Add pylint exceptions.

* Add dummy perform.

* Review feedback.

* Fix underscore.

* Fix lint.
2018-04-25 15:09:50 -07:00
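A minimal sketch of what "parallel waiting" can mean here: instead of blocking on each Route53 change one at a time, collect the change IDs and poll them all in each pass until every one reports INSYNC. This only illustrates the idea, not the plugin's actual code.

```
import time

import boto3


def wait_for_changes(change_ids, poll_interval=5):
    """Poll all pending Route53 changes together until they are INSYNC."""
    client = boto3.client("route53")
    pending = set(change_ids)
    while pending:
        for change_id in list(pending):
            status = client.get_change(Id=change_id)["ChangeInfo"]["Status"]
            if status == "INSYNC":
                pending.discard(change_id)
        if pending:
            time.sleep(poll_interval)
```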
Brad Warren
f510f4bddf Update good volunteer task to good first issue. (#5891) 2018-04-24 06:38:15 -07:00
Aleksandr Volochnev
9c15fd354f Updated base image to python:2-alpine3.7 (#5889)
Updated base image from python:2-alpine to python:2-alpine3.7. python:2-alpine internally uses Alpine 3.4, whose end-of-life date is the first of May 2018.
2018-04-20 15:17:05 -07:00
Brad Warren
726f3ce8b3 Remove EOL'd Ubuntu from targets.yaml (#5887)
See https://wiki.ubuntu.com/Releases.

Ubuntu 15.* repositories have been shut down for months now, causing our tests
to always fail on these systems. While the tests on Ubuntu 12.04 still work, it
has been unsupported by Canonical for almost a year and I don't think we should
hamstring ourselves trying to continue supporting it.
2018-04-19 17:57:41 -07:00
Erik Rose
f40e04401f Don't install enum34 when using Python 3.4 or later. Fix #5456. (#5846)
The re stdlib module requires enum attributes that don't exist in the backported 3.4 version.

Technically, we are changing our install behavior beyond what is necessary. Previously, enum34 was used for 3.4 and 3.5 as well, and it happened not to conflict, but I think it's better to use the latest bug-fixed stdlib versions as long as they meet the needs of `cryptography`, which is what depends on enum34. That way, at least the various stdlib modules are guaranteed not to conflict with each other.
2018-04-19 16:35:21 -07:00
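One way to express this kind of conditional dependency is a PEP 508 environment marker in setup.py; the fragment below is only a sketch of the approach, not certbot's actual setup.py:

```
from setuptools import setup

setup(
    name="example-package",  # placeholder, not a real certbot package
    install_requires=[
        # only pull in the enum backport where the stdlib enum module is
        # missing or predates the attributes newer stdlib modules rely on
        'enum34; python_version < "3.4"',
    ],
)
```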
Jeremy Gillula
398bd4a2cd Emphasizing the warnings in the READMEs in /etc/letsencrypt/live/exam… (#5871)
* Emphasizing the warnings in the READMEs in /etc/letsencrypt/live/example.com

* Making the warning more of a statement
2018-04-18 16:46:39 -07:00
Joona Hoikkala
a024aaf59d Enhance verb (#5596)
* Add the cli parameters

* Tests and error messages

* Requested fixes

* Only handle SSL vhosts

* Interactive cert-name selection if not defined on CLI

* Address PR comments

* Address review comments

* Added tests and addressed review comments

* Move cert manager tests to correct file

* Add display ops tests

* Use display util constants instead of hardcoded values in tests
2018-04-18 11:08:30 -07:00
Brad Warren
261d063b10 Revert fix-macos-pytest (#5853)
* Revert "Fix pytest on macOS in Travis (#5360)"

This reverts commit 5388842e5b.

* remove oldest passenv
2018-04-18 10:02:31 -07:00
Brad Warren
a9e01ade4c Revert "use older boulder version (#5852)" (#5855)
This reverts commit 6b29d159a2.
2018-04-17 17:17:15 -07:00
Kiel C
5c7fc07ccf Adjust file paths message from Warning to Info. (#5743) 2018-04-17 13:52:39 -07:00
Brad Warren
6dc8b66760 Add _choose_lineagename and update docs (#5650) 2018-04-17 11:27:45 -07:00
ohemorange
590ec375ec Get mypy running in travis for easier review (#5875) 2018-04-13 16:10:58 -07:00
Axel
523cdc578d Add port option for rfc2136 plugin (#5844) 2018-04-13 19:17:08 +03:00
Brad Warren
e0a5b1229f help clarify version number (#5865)
(Hopefully) helps make it clearer that 22 and 24 correspond to Apache 2.2 and 2.4.
2018-04-12 19:02:40 -07:00
ohemorange
6253acf335 Get mypy running with clean output (#5864)
Fixes #5849.

* extract mypy flags into mypy.ini file

* Get mypy running with clean output
2018-04-12 18:53:07 -07:00
Brad Warren
a708504d5b remove test metaclass (#5863) 2018-04-12 18:28:00 -07:00
ohemorange
2d31598484 Get mypy tox env running in the current setup (#5861)
* get mypy tox env running in the current setup

* use any python3 with mypy

* pin mypy dependencies
2018-04-12 15:47:39 -07:00
Brad Warren
6b29d159a2 use older boulder version (#5852) 2018-04-11 16:14:55 -07:00
sydneyli
88ceaa38d5 Bump certbot's acme dependency to 0.22.1 (#5826) 2018-04-11 14:36:53 -07:00
Brad Warren
e7db97df87 Update CHANGELOG for 0.23.0 (#5822)
* Update CHANGELOG for 0.23.0

* correct date
2018-04-11 11:16:12 -07:00
Joona Hoikkala
4a8e35289c PluginStorage to store variables between invocations. (#5468)
The base class for Installer plugins `certbot.plugins.common.Installer` now provides `PluginStorage` functionality to all installer plugins. This allows a plugin to save and retrieve variables between invocations.

The on disk storage is basically a JSON file at `config_dir`/`.pluginstorage.json`, usually `/etc/letsencrypt/.pluginstorage.json`. The JSON structure is automatically namespaced using the internal plugin name as a namespace key. Because the actual storage is JSON, the supported data types are: dict, list, tuple, str, unicode, int, long, float, boolean and nonetype.

To add a variable from inside the plugin class:
`self.storage.put("my_variable_name", my_var)`

To fetch a variable from inside the plugin class:
`my_var = self.storage.fetch("my_variable_key")`

The storage state isn't written to disk automatically; it needs to be saved explicitly:
`self.storage.save()`

* Plugin storage implementation

* Added config_dir to existing test mocks

* PluginStorage test cases

* Saner handling of bad config_dir paths

* Storage moved to Installer and not initialized on plugin __init__

* Finetuning and renaming
2018-04-11 08:54:55 -07:00
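Putting the snippets from the description above together, a plugin using PluginStorage might look roughly like this. The plugin class and counter logic are made up for illustration; only the put/fetch/save calls come from the description, and the KeyError behaviour for a missing key is an assumption.

```
from certbot.plugins import common


class ExampleInstaller(common.Installer):
    """Hypothetical installer plugin counting its own invocations."""

    def record_invocation(self):
        try:
            count = self.storage.fetch("invocation_count")
        except KeyError:  # assumption: a missing key surfaces as KeyError
            count = 0
        self.storage.put("invocation_count", count + 1)
        self.storage.save()  # state is only written to disk on save()
```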
Brad Warren
58626c3197 Double max_rounds (#5842) 2018-04-09 16:58:58 -07:00
schoen
56fb667e15 Merge pull request #5754 from edaleeta/updating-developer-guide
Add note to developer guide docs about installing docs extras. (#4946)
2018-04-09 16:36:36 -07:00
Brad Warren
0153c04af3 Merge pull request #5829 from certbot/candidate-0.23.0
Update certbot-auto and versions to reflect 0.23.0 release
2018-04-09 12:45:15 -07:00
Peter Linss
db938dcc0e Update messages.py (#5759)
Add wildcard field to AuthorizationResource
2018-04-05 13:39:35 -07:00
Brad Warren
0e30621355 Bump version to 0.24.0 2018-04-04 15:05:08 -07:00
Brad Warren
16b2539f72 Release 0.23.0 2018-04-04 15:04:43 -07:00
Brad Warren
b6afba0d64 Include testdata (#5827) 2018-04-04 14:33:41 -07:00
Brad Warren
b24d9dddc3 Revert ACMEv2 default (#5819)
* Revert "document default is ACMEv2 (#5818)"

This reverts commit 2c502e6f8b.

* Revert "Update default to ACMEv2 server (#5722)"

This reverts commit 4d706ac77e.
2018-04-03 17:55:12 -07:00
Joona Hoikkala
9996730fb1 If restart fails, try alternative restart command if available (#5500)
* Use alternative restart command if available in distro overrides
2018-04-03 14:05:37 -07:00
Brad Warren
2c502e6f8b document default is ACMEv2 (#5818) 2018-04-03 14:04:51 -07:00
ohemorange
bdaccb645b Support quoted server names in Nginx (#5811)
* Support quoted server names in Nginx

* add unit test to check that we strip quotes

* update configurator test
2018-04-03 12:14:23 -07:00
Joona Hoikkala
f5ad08047b Fix comparison to check values (#5815) 2018-04-03 12:04:57 -07:00
Brad Warren
8fd3f6c64c fixes #5380 (#5812) 2018-04-03 11:44:13 -07:00
Joshua Bowman
4d706ac77e Update default to ACMEv2 server (#5722) 2018-03-30 17:16:48 -07:00
sydneyli
8231b1a19c Pin Lexicon version to 2.2.1 (#5803) 2018-03-29 17:09:21 -07:00
ohemorange
5ff7f2211e Explicitly add six as a dependency in letsencrypt-auto-source dockerfiles (#5808)
* update documentation

* explicitly add six as a dependency in letsencrypt-auto-source dockerfiles

* pin six version
2018-03-29 15:34:38 -07:00
Brad Warren
7630550ac4 Revert "Update oldest tests to test against 0.22.0 versions (#5800)" (#5809)
This reverts commit 336950c0b9.
2018-03-29 14:15:59 -07:00
Brad Warren
336950c0b9 Update oldest tests to test against 0.22.0 versions (#5800) 2018-03-28 08:37:00 -07:00
ohemorange
a779e06d47 Add integration tests for nginx plugin (#5441)
* Add a rewrite directive for the .well-known location so we don't hit existing rewrites

* add comment

* Add (nonexistent) document root so we don't use the default value

* Add integration tests for nginx plugin

* add a sleep 5 to test on travis

* put sleep 5 in the right spot

* test return status of grep respecting -e and note that we're actually not posix compliant

* redelete newline
2018-03-27 17:33:48 -07:00
ohemorange
669312d248 We don't try to add location blocks through a mechanism that checks REPEATABLE_DIRECTIVES, and it wouldn't work as an accurate check even if we did, so just remove it (#5787) 2018-03-27 15:25:34 -07:00
ohemorange
4d082e22e6 Remove ipv6only=on from duplicated vhosts (#5793)
* rename delete_default to remove_singleton_listen_params

* update docstring

* add documentation to obj.py

* add test for remove duplicate ipv6only

* Remove ipv6only=on from duplicated vhosts

* add test to make sure ipv6only=on is not erroneously removed
2018-03-27 15:11:39 -07:00
sydneyli
af2cce4ca8 fix(auth_handler): cleanup is always called (#5779)
* fix(auth_handler): cleanup is always called

* test(auth_handler): tests for various error cases
2018-03-26 17:09:02 -07:00
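The fix named above is essentially the classic try/finally pattern; a minimal sketch, with helper names that are illustrative rather than auth_handler's actual methods:

```
def answer_and_cleanup(perform_challenges, cleanup_challenges, achalls):
    """Make sure challenge cleanup runs even if validation blows up."""
    try:
        perform_challenges(achalls)
    finally:
        cleanup_challenges(achalls)  # always called, error or not
```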
ohemorange
804fd4b78a factor out location_directive_for_achall (#5794) 2018-03-26 16:28:30 -07:00
Andrew Starr-Bochicchio
8cdb213a61 Google DNS: Mock API discovery to run tests without internet connection. (#5791)
* Google DNS: Mock API discovery to run tests without internet connection.

* Allow test to pass when run from main certbot package.
2018-03-26 16:12:55 -07:00
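A sketch of the mocking idea named in the commit above (the patch target and function body are illustrative; the plugin's real tests may patch a different path): replace googleapiclient's discovery step with a MagicMock so building the DNS service object never touches the network.

```
try:
    from unittest import mock  # Python 3
except ImportError:
    import mock  # Python 2

from googleapiclient import discovery


def build_dns_service_offline():
    """Return a stand-in DNS service without performing API discovery."""
    with mock.patch.object(discovery, "build") as mock_build:
        mock_build.return_value = mock.MagicMock()
        return discovery.build("dns", "v1")  # returns the MagicMock
```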
ohemorange
e9707ebc26 Allow 'default' along with 'default_server' in Nginx (#5788)
* test default detection

* Allow 'default' along with 'default_server' in Nginx

* Test that default gets written out as default_server in canonical string

* remove superfluous parens
2018-03-26 14:56:31 -07:00
ohemorange
8d0d42a739 Refactor _add_directive into separate functions (#5786)
* Refactor _add_directive to separate functions

* UnspacedList isn't idempotent

* refactor parser in add_server_directives and update_or_add_server_directives

* update parser tests

* remove replace=False and add to update_or_add for replace=True in configurator

* remove replace=False and add to update_or_add for replace=True in http01

* update documentation
2018-03-23 16:30:13 -07:00
Alokin Software Pvt Ltd
693cb1d162 Support Openresty in the NGINX plugin (#5467)
* fixes #4919 openresty_support

* making the regex more general

* reformatting warning to pass lint

* Fix string formatting in logging function

* Fix LE_AUTO_VERSION
2018-03-22 17:50:05 -07:00
Delan Azabani
8e9a4447ff make pip_install.sh compatible with POSIX sh(1) again (#5622) 2018-03-22 12:24:53 -07:00
sydneyli
bca0aa48c2 logging: log timestamps as local timezone instead of UTC (#5607)
* logging: log timestamps as local timezone instead of UTC

* test(logging): expect localtime instead of gmtime

* linter fix in logging
2018-03-21 15:41:33 -07:00
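For reference, the timezone a logging.Formatter stamps with is controlled by its converter attribute, which already defaults to time.localtime; a change like this amounts to something along these lines (a sketch, not certbot's log setup):

```
import logging
import time

formatter = logging.Formatter("%(asctime)s:%(levelname)s:%(message)s")
formatter.converter = time.localtime  # local timezone (time.gmtime gives UTC)

handler = logging.StreamHandler()
handler.setFormatter(formatter)
logging.getLogger("example").addHandler(handler)
```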
Brad Warren
afb6260c34 update changelog for 0.22.1 and 0.22.2 (#5770) 2018-03-21 11:21:35 -07:00
Brad Warren
3f291e51c6 Update certbot auto to reflect 0.22 point releases (#5768)
* Release 0.22.1

(cherry picked from commit 05c75e34e2)

* Bump version to 0.23.0

(cherry picked from commit 6fd3a57791)

* Release 0.22.2

(cherry picked from commit ea445ed11e)

* Bump version to 0.23.0

(cherry picked from commit cbe87d451c66931a084f4e513d899aae085a37d3)
2018-03-21 11:21:09 -07:00
Sebastiaan Lokhorst
fe8e0c98c5 Update docs for Apache plugin (#5776)
The supported OSs are now listed in another file. The table also contradicted the text below.
2018-03-21 11:18:39 -07:00
Harlan Lieberman-Berg
cbd827382e Documentation on cron renewal (#5460) 2018-03-21 08:17:06 -07:00
Edelita Valdez
f01aa1295f Add quotes to command for docs extras. 2018-03-20 23:40:44 -07:00
noci2012
c0dc31fd88 Allow _acme-challenge as a zone (#5707)
* Allow _acme-challenge as a zone

As described here:
https://github.com/lukas2511/dehydrated/wiki/example-dns-01-nsupdate-script

Not using this patch may be an issue if the parent zone (where a wildcard certificate has been requested) has been signed by DNSSEC.

Please also consider this for inclusion before dns-01 is allowed for wildcards.

* Update dns_rfc2136.py

forgot one domain_name reference

* Update dns_rfc2136.py

moved domain up & added assignment.

* Update dns_rfc2136_test.py

tests adjusted to new calls.

* Update dns_rfc2136_test.py

Forgot on DOMAIN...

* Update dns_rfc2136_test.py

* Update dns_rfc2136.py

pydoc updates.

* Update dns_rfc2136.py
2018-03-20 13:29:24 -07:00
Brad Warren
41ce108881 Fix cleanup_challenges call (#5761)
* fixes cleanup_challenges

* add test to prevent regressions
2018-03-19 16:51:01 -07:00
Gopal Adhikari
41ed6367b4 Fix typo: damain -> domain (#5756)
Fix typo: damain -> domain in certbot/util.py:607
2018-03-19 11:08:45 -07:00
Edelita Valdez
a26a78e84e Add note to developer guide docs about installing docs extras. (#4946) 2018-03-17 19:24:14 -07:00
sydneyli
3077b51500 Merge pull request #5749 from certbot/fix-docker-link
Fix Docker link
2018-03-16 18:15:05 -07:00
Brad Warren
d4834da0f4 fix docker link 2018-03-16 17:48:46 -07:00
Brad Warren
ba6bdb5099 Fix acme.client.Client.__init__ (#5747)
* fixes #5738

* add test to prevent regressions
2018-03-16 17:45:46 -07:00
sydneyli
79d90d6745 feat(nginx plugin): add HSTS enhancement (#5463)
* feat(nginx plugin): add HSTS enhancement

* chore(nginx): factor out block-splitting code from redirect & hsts enhancements!

* chore(nginx): merge fixes

* address comments

* fix linter: remove a space

* fix(config): remove SSL directives in HTTP block after block split, and remove_directive removes 'Managed by certbot' comment

* chore(nginx-hsts): Move added SSL directives to a constant on Configurator class

* fix(nginx-hsts): rebase on wildcard cert changes
2018-03-16 15:27:39 -07:00
ohemorange
5ecb68f2ed Update instances of acme-staging url to acme-staging-v02 (#5734)
* update instances of acme-staging url to acme-staging-v02

* keep example client as v1

* keep deactivate script as v1
2018-03-16 15:24:55 -07:00
Brad Warren
b3e73bd2ab removes blank line from chain.pem (#5730) 2018-03-14 17:38:37 -07:00
Spencer Eick
065e923bc9 Improve "cannot find cert or key directive" error (#5525) (#5679)
- Fix code to log separate error messages when either SSLCertificateFile or SSLCertificateKeyFile directives are not found.
- Update the section in install.rst where the relevant error is referenced.
- Edit a docstring where 'cert' previously referred to certificate.
- Edit test_deploy_cert_invalid_vhost in the test suite to cover changes.

Fixes #5525.
2018-03-14 12:59:13 -07:00
cclauss
e405aaa4c1 Fix print() and xrange() for Python 3 (#5590) 2018-03-14 09:37:29 -07:00
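Typical fixes for these two issues look like the following sketch: the print_function future import makes print() behave the same on Python 2 and 3, and six.moves.range maps to xrange on Python 2 and range on Python 3.

```
from __future__ import print_function

from six.moves import range  # xrange on Py2, range on Py3

for attempt in range(3):
    print("attempt", attempt)
```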
Brad Warren
9ea14d2e2b Add docs about --server (#5713)
* Add docs about --server

* address review comments

* mention server in Docker docs

* correct server URL

* Use prod ACMEv2 example
2018-03-14 08:48:40 -07:00
Brad Warren
1d0e3b1bfa Add documentation about DNS plugins and Docker (#5710)
* make binding port optional

* Add DNS docker docs

* add basic DNS plugin docs

* Add link to DNS plugin docs from Docker docs

* Shrink table size
2018-03-13 07:08:01 -07:00
Brad Warren
d310ad18c7 Put API link at the bottom of DNS plugin docs (#5699)
* Put link to API at the bottom for future docs.

* Put API link at the bottom of existing docs.
2018-03-12 17:10:23 -07:00
Brad Warren
53c6b9a08f Merge pull request #5682 from certbot/candidate-0.22.0
Release 0.22.0
2018-03-12 13:06:30 -07:00
Brad Warren
64d647774e Update the changelog to reflect 0.22.0 (#5691) 2018-03-12 10:57:46 -07:00
Brad Warren
f13fdccf04 document resps param (#5695) 2018-03-12 10:51:45 -07:00
Brad Warren
2e6d65d9ec Add readthedocs requirements files (#5696)
* Add readthedocs requirements files.

* Only install docs extras for plugin.
2018-03-08 17:24:30 -08:00
Brad Warren
cc24b4e40a Fix --allow-subset-of-names (#5690)
* Remove aauthzr instance variable

* If domain begins with fail, fail the challenge.

* test --allow-subset-of-names

* Fix renewal and add extra check

* test after hook checks
2018-03-08 11:12:33 -08:00
Brad Warren
cc18da926e Quiet pylint (#5689) 2018-03-08 11:09:31 -08:00
sydneyli
f4bac423fb fix(acme): client._revoke sends default content_type (#5687) 2018-03-07 15:09:47 -08:00
Brad Warren
7a495f2656 Bump version to 0.23.0 2018-03-07 10:26:08 -08:00
Brad Warren
77fdb4d7d6 Release 0.22.0 2018-03-07 10:25:42 -08:00
Brad Warren
e0ae356aa3 Upgrade pipstrap to 1.5.1 (#5681)
* upgrade pipstrap to 1.5.1

* build leauto
2018-03-07 09:10:47 -08:00
Brad Warren
6357e051f4 Fallback without dns.resourceRecordSets.list permission (#5678)
* Add rrset list fallback

* List dns.resourceRecordSets.list as required

* Handle list failures differently for add and del

* Quote record content

* disable not-callable for iter_entry_points

* List update permission
2018-03-06 15:32:22 -08:00
Brad Warren
d62c56f9c9 Remove the assumption the domain is unique in the manual plugin (#5670)
* use entire achall as key

* Add manual cleanup hook

* use manual cleanup hook
2018-03-06 07:21:01 -08:00
Brad Warren
cee9ac586e Don't report coverage on Apache during integration tests (#5669)
* ignore Apache coverage

* drop min coverage to 67
2018-03-06 07:20:34 -08:00
Brad Warren
a643877f88 Merge pull request #5672 from certbot/route53_acmev2v2
Version 2 of ACMEv2 support for Route53 plugin
2018-03-06 07:19:46 -08:00
Brad Warren
7bc45121a1 Remove the need for route53:ListResourceRecordSets
* add test_change_txt_record_delete
2018-03-05 18:58:32 -08:00
Joona Hoikkala
fe682e779b ACMEv2 support for Route53 plugin 2018-03-05 18:58:27 -08:00
Joona Hoikkala
441625c610 Allow Google DNS plugin to write multiple TXT record values (#5652)
* Allow Google DNS plugin to write multiple TXT record values in same resourcerecord

* Atomic updates

* Split rrsets request
2018-03-05 12:49:02 -08:00
Brad Warren
cc344bfd1e Break lockstep between our packages (#5655)
Fixes #5490.

There are a lot of possibilities discussed in #5490, but I'll try to explain what I actually did here as succinctly as I can. Unfortunately, there's a fair bit to explain. My goal was to break lockstep and give us tests to ensure the minimum specified versions are correct without taking the time now to refactor our whole test setup.

To handle specifying each package's minimum acme/certbot version, I added a requirements file to each package. This won't actually be included in the shipped package (because it's not in the MANIFEST).

After creating these files and modifying tools/pip_install.sh to use them, I created a separate tox env for most packages (I kept the DNS plugins together for convenience). This is necessary because we currently use a single environment for each plugin, but if we used that approach for these tests we'd hit issues due to different installed plugins requiring different versions of acme/certbot. There's a lot more discussion about this in #5490 if you're interested in this piece. I unfortunately wasted a lot of time trying to remove the boilerplate this approach causes in tox.ini, but to do that I think we need the negations described under tox's complex factor conditions, which haven't made it into a tox release yet.

The biggest missing piece here is how to make sure the oldest versions that are currently pinned to master get updated. Currently, they'll stay pinned that way without manual intervention and won't be properly testing the oldest version. I think we should solve this during the larger test/repo refactoring after the release because the tests are using the correct values now and I don't see a simple way around the problem.

Once this lands, I'm planning on updating the test-everything tests to do integration tests with the "oldest" versions here.

* break lockstep between packages

* Use per package requirements files

* add local oldest requirements files

* update tox.ini

* work with dev0 versions

* Install requirements in separate step.

* don't error when we don't have requirements

* install latest packages in editable mode

* Update .travis.yml

* Add reminder comments

* move dev to requirements

* request acme[dev]

* Update pip_install documentation
2018-03-05 09:50:19 -08:00
Brad Warren
e1878593d5 Ensure fullchain_pem in the order is unicode/str (#5654)
* Decode fullchain_pem in ACMEv1

* Convert back to bytes in Certbot

* document bytes are returned
2018-03-05 07:27:44 -08:00
Brad Warren
31805c5a5f Merge pull request #5628 from certbot/dns-docker
Add DNS Dockerfiles
2018-03-02 11:36:16 -08:00
ohemorange
8bc9cd67f0 Fix ipv6only detection (#5648)
* Fix ipv6only detection

* move str() to inside ipv6_info

* add regression test

* Update to choose_vhosts
2018-03-01 15:08:53 -08:00
Brad Warren
d8a54dc444 Remove leading *. from default cert name. (#5639) 2018-03-01 14:55:45 -08:00
Brad Warren
8121acf2c1 Add user friendly wildcard error for ACMEv1 (#5636)
* add WildcardUnsupportedError

* Add friendly unsupported wildcard error msg

* correct documentation

* add version specifier
2018-03-01 14:54:48 -08:00
ohemorange
f0b337532c Nginx plugin wildcard support for ACMEv2 (#5619)
* support wildcards for deploy_cert

* support wildcards for enhance

* redirect enhance and some tests

* update tests

* add display_ops and display_repr

* update display_ops_test and errors found

* say server block

* match redirects properly

* functional code

* start adding tests and lint errors

* add configurator tests

* lint

* change message to be generic to installation and enhancement

* remove _wildcard_domain

* take selecting vhosts out of loop

* remove extra newline

* filter wildcard vhosts by port

* lint

* don't filter by domain

* [^.]+

* lint

* make vhost hashable

* one more tuple
2018-03-01 14:05:49 -08:00
Brad Warren
559220c2ef Add basic ACMEv2 integration tests (#5635)
* Use newer boulder config

* Use ACMEv2 endpoint if requested

* Add v2 integration tests

* Work with unset variables

* Add wildcard issuance test

* quote domains
2018-03-01 10:11:15 -08:00
Brad Warren
38d5144fff Drop min coverage to 63 (#5641) 2018-03-01 08:25:32 -08:00
Brad Warren
78735fa2c3 Suggest DNS authenticator when it's needed (#5638) 2018-02-28 16:08:06 -08:00
Joona Hoikkala
e9bc4a319b Apache plugin wildcard support for ACMEv2 (#5608)
In `deploy_cert()` and `enhance()`, the user will be presented with a dialog to choose from the VirtualHosts that can be covered by the wildcard domain name. The (multiple) selection result will then be handled in a similar way to how we previously handled a single VirtualHost returned by `_find_best_vhost()`.

Additionally, the selected VirtualHosts are added to a dictionary that maps the selections to a wildcard domain, so they can be reused in the later `enhance()` call without forcing the user to select the same VirtualHosts again.

* Apache plugin wildcard support

* Present dialog only once per domain, added tests

* Raise exception if no VHosts selected for wildcard domain
2018-02-28 11:31:47 -08:00
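A tiny sketch of the caching idea described above (the class and attribute names are made up; the plugin stores this differently): remember which VirtualHosts the user selected for a wildcard domain so the later enhance() call does not prompt again.

```
class WildcardVhostSelections(object):
    """Cache user selections of VirtualHosts per wildcard domain."""

    def __init__(self):
        self._selections = {}

    def get_or_ask(self, wildcard_domain, ask_user):
        if wildcard_domain not in self._selections:
            # ask_user is a caller-supplied dialog; only shown once per domain
            self._selections[wildcard_domain] = ask_user(wildcard_domain)
        return self._selections[wildcard_domain]
```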
Brad Warren
a39d2fe55b Fix wildcard issuance (#5620)
* Add is_wildcard_domain to certbot.util.

* Error with --allow-subset-of-names and wildcards.

* Fix issue preventing wildcard cert issuance.

* Kill assumption domain is unique in auth_handler

* fix typo and add test

* update comments
2018-02-27 18:05:33 -08:00
Brad Warren
b18696b6a0 Don't run tests with Python 2.6 (#5627)
* Don't run tests with Python 2.6.

* Revert "Don't run tests with Python 2.6."

This reverts commit 4a9d778cca62ae2bec4cf060726e88f1fd66f374.

* Revert changes to auto_test.py.
2018-02-27 16:47:43 -08:00
Brad Warren
6f86267a26 Fix revocation in ACMEv2 (#5626)
* Allow revoke to pass in a url

* Add revocation support to ACMEv2.

* Provide regr for account based revocation.

* Add revoke wrapper to BackwardsCompat client
2018-02-27 12:42:13 -08:00
Brad Warren
57bdc590df Add DNS Dockerfiles 2018-02-26 16:27:38 -08:00
Brad Warren
43ba9cbf33 Merge pull request #5605 from certbot/rm-eol-2.6
Drop Python 2.6 and 3.3 support
2018-02-26 13:34:50 -08:00
Nick Bebout
f3a0deba84 Remove min version of setuptools (#5617) 2018-02-23 13:26:11 -08:00
Brad Warren
1e46d26ac3 Fix ACMEv2 issues (#5612)
* Add post wrapper to automatically add acme_version

* Add uri to authzr.

* Only add kid when account is set.

* Add content_type when downloading certificate.

* Only save new_authz URL when it exists.

* Handle combinations in ACMEv1 and ACMEv2.

* Add tests for ACMEv2 "combinations".
2018-02-22 16:28:50 -08:00
ohemorange
990b211a76 Remove extra :returns: (#5611) 2018-02-22 12:33:55 -08:00
ohemorange
457269b005 Add finalize_order to shim object, update Certbot to use it (#5601)
* update order object with returned authorizations

* major structure of finalize_order shim refactor

* util methods and imports for finalize_order shim refactor

* update certbot.tests.client_test.py

* extraneous client_test imports

* remove correct import

* update renewal call

* add test for acme.dump_pyopenssl_chain

* Add test for certbot.crypto_util.cert_and_chain_from_fullchain

* add tests for acme.client and change to fetch chain failure to TimeoutError

* s/rytpe/rtype

* remove ClientV1 passthrough

* dump the wrapped cert

* remove dead code

* remove the correct dead code

* support earlier mock
2018-02-22 10:14:29 -08:00
Marcus LaFerrera
c3659c300b Return str rather than bytes (#5585)
* Return str rather than bytes

The project id is returned as bytes, which causes issues when constructing the Google Cloud API URL: `b'PROJECT_ID'` gets encoded as `b%27PROJECT_ID%27`, causing the request to fail.

* Ensure we handle both bytes and str types

* project_id should be a str or bytes, not int
2018-02-22 10:09:06 -08:00
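The fix boils down to normalizing the value before it is embedded in the URL; a minimal sketch (the helper name is illustrative):

```
def normalize_project_id(project_id):
    """Return the project id as str so b'PROJECT_ID' never leaks into URLs."""
    if isinstance(project_id, bytes):
        return project_id.decode("utf-8")
    return str(project_id)  # also covers an int passed by mistake
```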
Brad Warren
f3b23662f1 Don't error immediately on wildcards. (#5600) 2018-02-21 20:52:04 -08:00
Brad Warren
f1b7017c0c Finish dropping Python 2.6 and 3.3 support
* Undo letsencrypt-auto changes

* Remove ordereddict import

* Add Python 3.4 tests to replace 3.3

* Add python_requires

* update pipstrap
2018-02-21 19:11:01 -08:00
ohemorange
ea3b78e3c9 update order object with returned authorizations (#5598) 2018-02-20 18:53:48 -08:00
ohemorange
02b56bd7f3 Merge pull request #5588 from certbot/request_authorizations
Support new_order-style in Certbot
2018-02-20 17:10:05 -08:00
Erica Portnoy
d13a4ed18d add tests for if partial auth success 2018-02-20 16:50:23 -08:00
Erica Portnoy
df50f2d5fa client test 2018-02-20 16:12:15 -08:00
Erica Portnoy
dea43e90b6 lint 2018-02-20 16:11:36 -08:00
Erica Portnoy
a7eadf8862 add new order test for v1 2018-02-20 16:08:46 -08:00
Erica Portnoy
65d0b9674c Fix client test 2018-02-20 16:01:35 -08:00
Erica Portnoy
26bcaff85c add test for new_order for v2 2018-02-20 15:59:58 -08:00
Erica Portnoy
d5a90c5a6e delete key and csr before trying again 2018-02-20 15:43:27 -08:00
Erica Portnoy
051664a142 lint 2018-02-20 15:39:30 -08:00
Erica Portnoy
7c073dbcaf lint 2018-02-20 15:38:18 -08:00
Erica Portnoy
d29c637bf9 support best_effort 2018-02-20 15:36:35 -08:00
Erica Portnoy
d6af978472 remove if/pass 2018-02-20 14:52:11 -08:00
Erica Portnoy
3dfeb483ee lint 2018-02-20 14:49:23 -08:00
Erica Portnoy
76a0cbf9c2 client tests passing 2018-02-20 14:43:12 -08:00
Erica Portnoy
a0e84e65ce auth_handler tests are happy 2018-02-20 14:29:04 -08:00
Erica Portnoy
11f2f1e576 remove extra spaces 2018-02-20 13:20:41 -08:00
Erica Portnoy
d6b4e2001b put back in best_effort code, with a todo for actually supporting it in ACMEv2 2018-02-20 13:19:04 -08:00
schoen
59a1387764 Merge pull request #5594 from DrMattChristian/master
Fix Certbot Apache plugin on Oracle Linux Server, a clone of CentOS, RHEL
2018-02-20 09:12:57 -08:00
Matt Christian
9c84fe1144 Add override class for ID="ol" AKA Oracle Linux Server, a clone of CentOS/RHEL. 2018-02-18 15:45:22 -06:00
Erica Portnoy
68e24a8ea7 start test updates 2018-02-16 17:59:51 -08:00
Erica Portnoy
20d0b91c71 switch interface to new_order and remove best_effort flag 2018-02-16 17:35:10 -08:00
Erica Portnoy
ea2022588b add docstring 2018-02-16 16:32:49 -08:00
Erica Portnoy
eaf739184c pass pem to auth_handler 2018-02-16 16:29:42 -08:00
Erica Portnoy
73bd801f35 add and use request_authorizations 2018-02-16 16:22:26 -08:00
Hugo
42638afc75 Drop support for EOL Python 2.6 and 3.3
* Drop support for EOL Python 2.6

* Use more helpful assertIn/NotIn instead of assertTrue/False

* Drop support for EOL Python 3.3

* Remove redundant Python 3.3 code

* Restore code for RHEL 6 and virtualenv for Py2.7

* Revert pipstrap.py to upstream

* Merge py26_packages and non_py26_packages into all_packages

* Revert changes to *-auto in root

* Update by calling letsencrypt-auto-source/build.py

* Revert permissions for pipstrap.py
2018-02-16 16:14:01 -08:00
ohemorange
e95e963ad6 Get common name from CSR in new_order in ClientV2 (#5587)
* switch new_order to use crypto_util._pyopenssl_cert_or_req_san

* move certbot.crypto_util._get_names_from_loaded_cert_or_req functionality to acme.crypto_util._pyopenssl_cert_or_req_all_names
2018-02-16 16:05:16 -08:00
Brad Warren
2a142aa932 Make Certbot depend on josepy (#5542) 2018-02-16 14:47:10 -08:00
Brad Warren
adec7a8fed Cleanup dockerfile-dev (#5435)
* cleanup dockerfile-dev

* map port 80

* remove python3-dev package
2018-02-16 09:51:27 -08:00
ohemorange
dba6990f70 Merge pull request #5578 from certbot/v2-orders-v2
Add order support and tests
2018-02-15 19:43:06 -08:00
Brad Warren
70a75ebe9d Add tests and fix minor bugs in Order support
* delint

* refactor client tests

* Add test for new order and fix identifiers parsing.

* Add poll_and_finalize test

* Test and fix poll_authorizations timeout

* Add test_failed_authorizations

* Add test_poll_authorizations_success

* Test and fix finalize_order success

* add test_finalize_order_timeout

* add test_finalize_order_error

* test sleep code
2018-02-15 19:26:01 -08:00
Jacob Hoffman-Andrews
e48898a8c8 ACMEv2: Add Order support
This adds two new classes in messages: Order and OrderResource. It also adds methods to ClientV2 to create orders, and poll orders then request issuance.

The CSR is stored on the OrderResource so it can be carried along and submitted when it's time to finalize the order.
2018-02-15 19:12:15 -08:00
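Based on the description above, the order flow fits together roughly like this (a sketch: error handling, account registration, and the challenge-solving callable are all omitted or assumed):

```
def issue_certificate(client_v2, csr_pem, solve_challenges):
    """Rough ACMEv2 order flow: create, authorize, then finalize."""
    orderr = client_v2.new_order(csr_pem)      # Order plus its authorizations
    for authzr in orderr.authorizations:
        solve_challenges(authzr)               # caller-supplied (hypothetical)
    # the CSR rides along on the OrderResource, so finalizing only needs
    # the order object handed back
    finalized = client_v2.poll_and_finalize(orderr)
    return finalized.fullchain_pem
```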
ohemorange
d467a4ae95 Add mechanism to detect acme version (#5554)
Detects the ACME version by checking for the newNonce field in the directory, since it's mandatory in ACMEv2. Also updates ClientNetwork.account on register and update_registration.

* add mechanism to detect acme version

* update ClientNetwork.account comment

* switch to MultiVersionClient object in acme

* add shim methods

* add returns

* use backwards-compatible format and implement register

* update to actual representation of tos v2

* add tos fields and pass through to v1 for partial updates

* update tests

* pass more tests

* allow instance variable pass-through and lint

* update certbot and tests to use new_account_and_tos method

* remove --agree-tos test from main_test for now because we moved the callback into acme

* add docstrings

* use hasattr

* all most review comments

* use terms_of_service for both v1 and v2

* add tests for acme/client.py

* tests for acme/messages.py
2018-02-15 19:04:17 -08:00
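The detection idea described above can be sketched in a few lines (illustrative only; the acme client does this internally rather than via requests):

```
import requests


def detect_acme_version(directory_url):
    """Return 2 if the directory advertises newNonce (mandatory in v2)."""
    directory = requests.get(directory_url).json()
    return 2 if "newNonce" in directory else 1
```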
sydneyli
d5efefd979 Re-land proper webroot directory cleanup (#5577)
* fix(webroot): clean up directories properly

* fix(webroot): undo umask in finally

* Fix for MacOS
2018-02-15 15:55:08 -08:00
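The "undo umask in finally" bullet refers to a pattern along these lines (a sketch, not the webroot plugin's exact code): relax the umask while creating the challenge directory and restore it no matter what happens.

```
import os


def make_world_readable_dir(path):
    old_umask = os.umask(0o022)
    try:
        os.makedirs(path, 0o0755)
    finally:
        os.umask(old_umask)  # restored even if makedirs raises
```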
cclauss
09b5927e6a from botocore.exceptions import ClientError (#5507)
Fixes undefined name 'botocore' in flake8 testing of https://github.com/certbot/certbot

$ __flake8 . --count --select=E901,E999,F821,F822,F823 --show-source --statistics__
```
./tests/letstest/multitester.py:144:12: F821 undefined name 'botocore'
    except botocore.exceptions.ClientError as e:
           ^
1     F821 undefined name 'botocore'
```
2018-02-15 11:07:35 -08:00
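The fix is just making the exception name importable before it is used; an illustrative sketch (not multitester.py itself):

```
import boto3
from botocore.exceptions import ClientError  # the previously undefined name


def terminate_instance(instance_id):
    try:
        boto3.client("ec2").terminate_instances(InstanceIds=[instance_id])
    except ClientError as error:
        print("AWS API call failed:", error)
```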
sydneyli
7e6d2f1efe Merge pull request #5259 from certbot/issue_5045
Show expiration dates for cert when skipping its renewal
2018-02-15 09:53:13 -08:00
Sydney Li
608875cd65 Add test for skipped certs 2018-02-14 15:45:08 -08:00
sydneyli
99aec1394d Revert "Proper webroot directory cleanup (#5453)" (#5574)
This reverts commit ad0a99a1f5.
2018-02-14 12:09:17 -08:00
Joona Hoikkala
fbace69b5e Fix install verb (#5536)
* Fix install verb

* Fix error message, tests and remove global pylint change

* Fix boulder integration test keypath

* Also use chain_path from lineage if not defined on CLI
2018-02-14 09:28:36 -08:00
Joona Hoikkala
ac464a58e5 Only add Include for TLS configuration if not already there (#5498)
* Only add Include for TLS configuration if not already there

* Add tests to prevent future regression
2018-02-14 08:16:20 -08:00
sydneyli
9277710f6f Added install-only flag (#5531) 2018-02-13 11:15:08 -08:00
sydneyli
ad0a99a1f5 Proper webroot directory cleanup (#5453)
* fix(webroot): clean up directories properly

* fix(webroot): undo umask in finally
2018-02-13 10:50:04 -08:00
Brad Warren
49edf17cb7 ignore .docker (#5477) 2018-02-13 09:52:04 -08:00
Joona Hoikkala
932ecbb9c2 Fix test inconsistency in Apache plugin configurator_test (#5520) 2018-02-12 16:43:59 -08:00
Eli Young
90664f196f Remove autodocs for long-removed acme.other module (#5529)
This module was removed in 22a9c7e3c2. The
autodocs are therefore unnecessary. Furthermore, they are starting to
cause build failures for Fedora.
2018-02-12 16:43:11 -08:00
Jacob Hoffman-Andrews
789be8f9bc Change "Attempting to parse" warning to info. (#5557)
* Change "Attempting to parse" warning to info.

This message shows up on every renewal run when the config was updated
by a newer version of Certbot than the one being run. For instance, if a
user has the certbot packages installed from PPA (currently 0.18.2), but
runs certbot-auto once to try out the latest version (0.21.1), they will
start getting this message via email every 12 hours.
2018-02-12 14:55:41 -08:00
Peter Eckersley
abc4a27613 [Docs] restore docs for ppl just using Certbot git master (#5420)
- Dev / test cycles are one use case for the "running a local copy of the client" instructions, but simply running bleeding edge Certbot is another
- So edit the docs to once again explain how to just run bleeding edge Certbot, without (say) always getting staging certs.
2018-02-12 14:07:33 -08:00
Brad Warren
1f45832460 Suggest people try the community forum. (#5561) 2018-02-09 16:41:05 -08:00
schoen
a58c875b2a Merge pull request #5526 from certbot/certificates
Use "certificate" instead of "cert" in docs.
2018-02-09 16:32:47 -08:00
ohemorange
d6b247c002 Set ClientNetwork.account after registering (#5558) 2018-02-09 12:54:15 -08:00
Brad Warren
4f0aeb12fa Add find-duplicative-certs docs (#5547)
* add find-duplicative-certs docs

* address review feedback
2018-02-07 14:14:26 -08:00
Jacob Hoffman-Andrews
530a9590e6 Add sudo to certbot-auto instructions. (#5501) 2018-02-07 14:08:03 -08:00
Brad Warren
0416382633 Update leauto_upgrades with tests from #5402. (#5407) 2018-02-06 17:01:58 -08:00
Jacob Hoffman-Andrews
9baf75d6c8 client.py changes for ACMEv2 (#5287)
* Implement ACMEv2 signing of POST bodies.

* Add account, and make acme_version explicit.

* Remove separate NewAccount.

* Rename to add v2.

* Add terms_of_service_agreed.

* Split out wrap_in_jws_v2 test.

* Re-add too-many-public-methods.

* Split Client into ClientBase / Client / ClientV2

* Use camelCase for newAccount.

* Make acme_version optional parameter on .post().

This allows us to instantiate a ClientNetwork before knowing the version.

* Add kid unconditionally.
2018-02-06 16:45:33 -08:00
sydneyli
e085ff06a1 Update old issue link to point to letsencrypt community forums. (#5538) 2018-02-05 16:27:21 -08:00
Jacob Hoffman-Andrews
72b63ca5ac Use "certificate" instead of "cert" in docs. 2018-02-01 13:14:43 -08:00
Brad Warren
45613fd31c update changelog for 0.21.1 (#5504) 2018-01-26 16:02:19 -08:00
Noah Swartz
b05be7fa65 Add expiration date to skipped message 2018-01-26 14:37:50 -08:00
Brad Warren
43bbaadd11 Update certbot-auto and help (#5487)
* Release 0.21.1

(cherry picked from commit ff60d70e68)

* Bump version to 0.22.0
2018-01-25 15:29:38 -08:00
Noah Swartz
a166396358 Merge pull request #5471 from certbot/issue_5449
add let's encrypt status to footer and fix link
2018-01-25 12:20:25 -08:00
Brad Warren
a2239baa45 fix test_tests.sh (#5478) 2018-01-24 22:38:36 -08:00
Brad Warren
a1aba5842e Fix --no-bootstrap on CentOS/RHEL 6 (#5476)
* fix --no-bootstrap on RHEL6

* Add regression test
2018-01-24 22:23:20 -08:00
ohemorange
8a9f21cdd3 Fix Nginx redirect issue (#5479)
* wrap redirect in if host matches

* return 404 if we've created a new block

* change domain matching to exact match

* insert new redirect directive at the top

* add a redirect block to the top if it doesn't already exist, even if there's an existing redirect

* fix obj tests

* remove active parameter

* update tests

* add back spaces

* move imports

* remove unused code
2018-01-24 22:19:32 -08:00
Jacob Hoffman-Andrews
0a4f926b16 Remove Default Detector log line. (#5372)
This produces a super-long log line that wraps to 30-60 lines, depending on
screen width. Even though it's just at debug level, it clutters up the integration
test output without providing proportional debugging value.

* Remove Default Detector log line.

This produces about 30 lines of log output. Even though it's just at debug
level, it clutters up the integration test output without providing proportional
debugging value.

* Add more useful logs.
2018-01-24 15:01:42 -08:00
Noah Swartz
c0068791ce add let's encrypt status to footer and fix link 2018-01-24 13:56:40 -08:00
Joona Hoikkala
b0aa8b7c0b Work around Basic Authentication for challenge dir in Apache (#5461)
Unfortunately, the way that Apache merges the configuration directives is different for mod_rewrite and <Location> / <Directory> directives.

To work around basic auth in VirtualHosts, the challenge override Include had to be split in two. The first part handles overrides for RewriteRule and the other part will handle overrides for <Directory> and <Location> directives.
2018-01-23 16:46:36 -08:00
Noah Swartz
a67a917eca Merge pull request #5446 from certbot/0.21.0-changelog
Add 0.21.0 changelog
2018-01-18 13:03:47 -08:00
Brad Warren
103039ca40 Add 0.21.0 changelog 2018-01-17 17:46:56 -08:00
Brad Warren
aa01b7d0c0 Merge pull request #5445 from certbot/candidate-0.21.0
Release 0.21.0
2018-01-17 17:43:57 -08:00
Brad Warren
325a97c1ed Bump version to 0.22.0 2018-01-17 15:55:41 -08:00
Brad Warren
bf695d048d Release 0.21.0 2018-01-17 15:55:29 -08:00
Brad Warren
1bb2cfadf7 hardcode vhosts and names for test (#5444) 2018-01-17 15:34:34 -08:00
Brad Warren
f43a95e9c1 Merge pull request #5442 from certbot/apache-http-01
Better Apache HTTP01 Support
2018-01-17 11:18:58 -08:00
Brad Warren
522532dc30 Improve no vhost error message 2018-01-17 11:01:24 -08:00
Joona Hoikkala
6dd724e1f4 Merge branch 'apache-http-01' of github.com:certbot/certbot into apache-http-01 2018-01-17 20:08:22 +02:00
Joona Hoikkala
63136be2e5 Make sure the HTTP tests do not use wrong vhosts for asserts 2018-01-17 20:07:38 +02:00
Brad Warren
bd231a3855 Error without vhosts and fix tests token type 2018-01-17 09:38:10 -08:00
ohemorange
e9b57e1783 Add (nonexistent) document root so we don't use the default value (#5437) 2018-01-17 08:02:10 -08:00
ohemorange
2c379cd363 Add a rewrite directive for the .well-known location so we don't hit existing rewrites (#5436) 2018-01-17 08:01:44 -08:00
Joona Hoikkala
b8f288a372 Add include to every VirtualHost if definite one not found based on name 2018-01-17 14:08:45 +02:00
Brad Warren
f420b19492 Apache HTTP01 Improvements
* Fix docstring quote spacing

* Remove unneeded directives

* Enable mod_rewrite

* Remove ifmod rewrite

* Use stricter rewriterule

* Uncomment tests

* Fix order args

* Remove S which doesn't seem to work across contexts

* Use double backslash to make pylint

* Fix enmod test

* Fix http-01 tests

* Test for rewrite

* check for Include in vhost

* add test_same_vhost

* Don't add includes twice

* Include default vhosts in search

* Respect port in find_best_http_vhost

* Add find_best_http_vhost port test

* Filter by port in http01
2018-01-16 23:17:08 -08:00
Joona Hoikkala
314c5f19e5 Set up vhost discovery and overrides for HTTP-01
* Finalized HTTP vhost discovery and added overrides

* Include overrides to every VirtualHost
2018-01-16 23:08:46 -08:00
ohemorange
7e463bccad Handle more edge cases for HTTP-01 support in Nginx (#5421)
* only when using http01, only match default_server by port

* import errors

* put back in the code that creates a dummy block, but only when we can't find anything else
2018-01-16 14:58:45 -08:00
Brad Warren
368ca0c109 Small cleanup for Apache HTTP-01
* Remove http_doer from self

* Refactor _find_best_vhost
2018-01-15 22:08:37 -08:00
Joona Hoikkala
60dd67a60e Use static directory under workdir for HTTP challenges (#5428)
* Use static directory under workdir for HTTP challenges

* Handle the reverter file registration before opening file handle
2018-01-14 15:22:22 -08:00
ohemorange
2cb9d9e2aa Implement HTTP-01 challenge for Nginx (#5414)
* get http01 challenge working

* support multiple challenge types in configurator.py

* update existing nginx tests

* lint

* refactor NginxHttp01 and NginxTlsSni01 to both now  inherit from NginxChallengePerformer

* remove TODO

* challenges_test tests with both tlssni01 and http01

* Make challenges.py more abstract to make lint happier

* add pylint disables to the tests to make pylint happier about the inheritance and abstraction situation

* no need to cover raise NotImplementedError() lines

* python3 compatibility

* test that http01 perform is called

* only remove ssl from addresses during http01

* Initialize addrs_to_add

* Change Nginx http01 to modify server block so the site doesn't stop serving while getting a cert

* pass existing unit tests

* rename sni --> http01 in unit tests

* lint

* fix configurator test

* select an http block instead of https

* properly test for port number

* use domains that have matching addresses

* remove debugger

* remove access_log and error_log cruft that wasn't being executed

* continue to return None from choose_redirect_vhost when create_if_no_match is False

* add nginx integration test
2018-01-11 17:06:23 -08:00
Brad Warren
5d58a3d847 Merge pull request #5417 from certbot/apache-http
HTTP01 support in Apache
2018-01-11 11:18:07 -08:00
Joona Hoikkala
28dad825af Do not try to remove temp dir if it wasn't created 2018-01-11 20:44:40 +02:00
Brad Warren
f0f5defb6f Address minor concerns with Apache HTTP-01
* enable other modules

* change port type

* remove maxDiff from test class

* update port comment

* add -f to a2dismod
2018-01-11 09:59:25 -08:00
Joona Hoikkala
fa97877cfb Make sure that Apache is listening on port 80 and has mod_alias
* Ensure that mod_alias is enabled

* Make sure we listen to port http01_port
2018-01-11 14:48:32 +02:00
Brad Warren
2ba334a182 Add basic HTTP01 support to Apache
* Add a simple version of HTTP01

* remove cert from chall name

* make directory work on 2.2

* cleanup challenges when finished

* import shutil

* fixup perform and cleanup tests

* Add tests for http_01.py
2018-01-10 23:35:09 -08:00
Brad Warren
9e95208101 Factor out common challengeperformer logic (#5413) 2018-01-10 18:34:45 -08:00
Brad Warren
39472f88de reduce ipdb version (#5408) 2018-01-10 13:26:31 -08:00
Brad Warren
3acf5d1ef9 Fix rebootstraping with old venvs (#5392)
* Fix rebootstrapping before venv move

* add regression test

* dedupe test

* Cleanup case when two venvs exist.

* Add clarifying comment

* Add double venv test to leauto_upgrades

* Fix logic with the help of coffee

* redirect stderr

* pass VENV_PATH through sudo

* redirect stderr
2018-01-10 12:10:21 -08:00
Brad Warren
00634394f2 Only respect LE_PYTHON inside USE_PYTHON_3 if we know a user must have set it to version 2 (#5402)
* stop exporting LE_PYTHON

* unset LE_PYTHON sometimes
2018-01-09 21:16:44 -08:00
ohemorange
6eb459354f Address erikrose's comments on #5329 (#5400) 2018-01-09 16:48:16 -08:00
ohemorange
f5a02714cd Add deprecation warning for Python 2.6 (#5391)
* Add deprecation warning for Python 2.6

* Allow disabling Python 2.6 warning
2018-01-09 16:11:04 -08:00
Brad Warren
887a6bcfce Handle need to rebootstrap before fetch.py (#5389)
* Fix #5387

* Add test for #5387

* remove LE_PYTHON

* Use environment variable to reduce line length
2018-01-09 15:40:26 -08:00
Joona Hoikkala
288c4d956c Automatically install updates in test script (#5394) 2018-01-09 08:28:52 -08:00
Joona Hoikkala
62ffcf5373 Fix macOS builds for Python2.7 in Travis (#5378)
* Add OSX Python2 tests

* Make sure python2 is originating from homebrew on macOS

* Upgrade the already installed python2 instead of trying to reinstall
2018-01-09 07:48:05 -08:00
Brad Warren
d557475bb6 update Apache ciphersuites (#5383) 2018-01-09 07:46:21 -08:00
Brad Warren
e02adec26b Have letsencrypt-auto do a real upgrade in leauto-upgrades option 2 (#5390)
* Make leauto_upgrades do a real upgrade

* Cleanup vars and output

* Sleep until the server is ready

* add simple_http_server.py

* Use a randomly assigned port

* s/realpath/readlink

* wait for server before getting port

* s/localhost/all interfaces
2018-01-08 17:38:03 -08:00
Brad Warren
24ddc65cd4 Allow non-interactive revocation without deleting certificates (#5386)
* Add --delete-after-revoke flags

* Use delete_after_revoke value

* Add delete_after_revoke unit tests

* Add integration tests for delete-after-revoke.
2018-01-08 17:02:20 -08:00
ohemorange
8585cdd861 Deprecate Python2.6 by using Python3 on CentOS/RHEL 6 (#5329)
* If there's no python or there's only python2.6 on red hat systems, install python3

* Always check for python2.6

* address style, documentation, nits

* factor out all initialization code

* fix up python version return value when no python installed

* add no python error and exit

* document DeterminePythonVersion parameters

* build letsencrypt-auto

* close brace

* build leauto

* fix syntax errors

* set USE_PYTHON_3 for all cases

* rip out NOCRASH

* replace NOCRASH, update LE_PYTHON set logic

* use built-in venv for py3

* switch to LE_PYTHON not affecting bootstrap selection and not overwriting LE_PYTHON

* python3ify fetch.py

* get fetch.py working with python2 and 3

* don't verify server certificates in fetch.py HttpsGetter

* Use SSLContext and an environment variable so that our tests continue to never verify server certificates.

* typo

* build

* remove commented out code

* address review comments

* add documentation for YES_FLAG and QUIET_FLAG

* Add tests to centos6 Dockerfile to make sure we install python3 if and only if appropriate to do so.
2018-01-08 13:57:04 -08:00
Brad Warren
18f6deada8 Fix letsencrypt-auto name and long forms of -n (#5375) 2018-01-05 19:27:00 -08:00
Joona Hoikkala
a1713c0b79 Broader git ignore for pytest cache files (#5361)
Make gitignore take pytest cache directories into account, even if
they reside in subdirectories.

If pytest is run for a certain module, e.g. `pytest certbot-apache`, the
cache directory is created under the `certbot-apache` directory.
2018-01-05 11:08:38 -08:00
Joona Hoikkala
a3a66cd25d Use apache2ctl modules for Gentoo systems. (#5349)
* Do not call Apache binary for module reset in cleanup()

* Use apache2ctl modules for Gentoo
2018-01-04 14:36:16 -08:00
Noah Swartz
a7d00ee21b print as a string (#5359) 2018-01-04 13:59:29 -08:00
Brad Warren
5388842e5b Fix pytest on macOS in Travis (#5360)
* Add tools/pytest.sh

* pass TRAVIS through in tox.ini

* Use tools/pytest.sh to run pytest

* Add quiet to pytest.ini

* ignore pytest cache
2018-01-03 17:49:22 -08:00
Brad Warren
ed2168aaa8 Fix auto_tests on systems with new bootstrappers (#5348) 2017-12-21 16:55:21 -08:00
Brad Warren
d6b11fea72 More pip dependency resolution workarounds (#5339)
* remove pyopenssl and six deps

* remove outdated tox.ini dep requirement
2017-12-19 16:16:45 -08:00
Brad Warren
a1aea021e7 Pin dependencies in oldest tests (#5316)
* Add tools/merge_requirements.py

* Revert "Fix oldest tests by pinning Google DNS deps (#5000)"

This reverts commit f68fba2be2.

* Add tools/oldest_constraints.txt

* Remove oldest constraints from tox.ini

* Rename dev constraints file

* Update tools/pip_install.sh

* Update install_and_test.sh

* Fix pip_install.sh

* Don't cat when you can cp

* Add ng-httpsclient to dev constraints for oldest tests

* Bump tested setuptools version

* Update dev_constraints comment

* Better document oldest dependencies

* test against oldest versions we say we require

* Update dev constraints

* Properly handle empty lines

* Update constraints gen in pip_install

* Remove duplicated zope.component

* Reduce pyasn1-modules dependency

* Remove blank line

* pin back google-api-python-client

* pin back uritemplate

* pin josepy for oldest tests

* Undo changes to install_and_test.sh

* Update install_and_test.sh description

* use split instead of partition
2017-12-18 12:31:36 -08:00
Brad Warren
1b6005cc61 Pin josepy in letsencrypt-auto (#5321)
* pin josepy in le-auto

* Put pinned versions in sorted order
2017-12-14 18:15:42 -08:00
Joona Hoikkala
0e92d4ea98 Parse variables without whitespace separator correctly in CentOS family of distributions (#5318) 2017-12-11 11:50:56 -08:00
Jannis Leidel
2abc94661a Use josepy instead of acme.jose. (#5203) 2017-12-11 11:25:09 -08:00
Brad Warren
8bc785ed46 Make Travis builds faster in master (#5314)
* Remove extra le-auto tests from master

* Remove dockerfile-dev test from master

* Remove intermediate Python 3.x tests from master

* Reorder travis jobs for speed
2017-12-08 16:35:59 -08:00
Noah Swartz
0046428382 print warnings for 3.3 users (#5283)
fix errors
2017-12-08 12:45:04 -08:00
Michael Coleman
5d0888809f Remove slash from document root path in Webroot example (#5293)
It seems the document root path passed to the `--webroot-path` / `-w` option
can't have a trailing slash.
Here is an example of a user who followed this example and had their
certificate signing request error out:
https://superuser.com/questions/1273984/why-does-certbot-letsencrypt-recieve-a-403-forbidden
2017-12-07 15:53:47 -08:00
Noah Swartz
8096b91496 Merge pull request #5304 from certbot/0.20.0-changelog
Update changelog for 0.20.0
2017-12-07 15:32:35 -08:00
Brad Warren
e696766ed1 Expand on changes to the Apache plugin 2017-12-07 13:48:44 -08:00
ohemorange
8b5d6879cc Create a new server block when making server block ssl (#5220)
* create_new_vhost_from_default --> duplicate_vhost

* add source_path property

* set source path for duplicated vhost

* change around logic of where making ssl happens

* don't add listen 80 to newly created ssl block

* cache vhosts list

* remove source path

* add redirect block if we created a new server block

* Remove listen directives when making server block ssl

* Reset vhost cache on parser load

* flip connected pointer direction for finding newly made server block to match previous redirect search constraints

* also test for new redirect block styles

* fix contains_list and test redirect blocks

* update lint, parser, and obj tests

* reset new vhost (fixing previous bug) and move removing default from addrs under if statement

* reuse and update newly created ssl server block when appropriate, and update unit tests

* append newly created server blocks to file instead of inserting directly after, so we don't have to update other vhosts' paths

* add coverage for NO_IF_REDIRECT_COMMENT_BLOCK

* add coverage for parser load calls

* replace some double quotes with single quotes

* replace backslash continuations with parentheses

* update docstrings

* switch to only creating a new block on redirect enhancement, including removing the get_vhosts cache

* update configurator tests

* update obj test

* switch delete_default default for duplicate_vhost
2017-12-07 09:48:54 -08:00
Brad Warren
d039106b68 Merge pull request #5303 from certbot/candidate-0.20.0
Release 0.20.0
2017-12-06 17:59:51 -08:00
Brad Warren
abed73a8e4 Revert "Nginx reversion (#5299)" (#5305)
This reverts commit c9949411cd.
2017-12-06 17:45:20 -08:00
Noah Swartz
3951baf6c0 Merge pull request #5284 from Eccenux/issue_5274
Show a diff when re-creating certificate
2017-12-06 17:07:36 -08:00
Brad Warren
716f25743c Update changelog for 0.20.0 2017-12-06 16:33:55 -08:00
Noah Swartz
b3ca6bb2b1 Merge pull request #5228 from jonasbn/master
Documentation update to certbot/main.py
2017-12-06 16:26:26 -08:00
Brad Warren
78d97ca023 Bump version to 0.21.0 2017-12-06 14:52:16 -08:00
Brad Warren
f1554324da Release 0.20.0 2017-12-06 14:46:55 -08:00
Brad Warren
c9949411cd Nginx reversion (#5299)
The reason for this PR is that many bug fixes in the nginx plugin for changes we haven't released yet are included in #5220, which may not make our next release. If it doesn't, we will (mostly) revert the nginx plugin back to its previous state to avoid releasing these bugs and will revert this PR after the release.

* Revert "Nginx IPv6 support (#5178)"

This reverts commit 68e37b03c8.

* Revert "Fix bug that stopped nginx from finding new server block for redirect (#5198)"

This reverts commit e2ab940ac0.

* Revert "Nginx creates a vhost block if no matching block is found (#5153)"

This reverts commit 95a7d45856.
2017-12-05 20:04:08 -08:00
Brad Warren
678ab7328e Merge pull request #5300 from certbot/flexible-challenge-uri++
ACMEv2: Allow "uri" or "url" in challenge part 2
2017-12-05 12:11:48 -08:00
Brad Warren
62c1112d10 Keep the same behavior with the uri attribute 2017-12-05 10:26:32 -08:00
Jacob Hoffman-Andrews
8c4f016b2d In ACMEv2, challenges have "url" instead of "uri". To handle this smoothly, Challenge's uri field becomes private (_uri), and is joined by _url. Serialization and deserialization will preserve whichever one was set. The uri name is taken over by an @property that returns whichever of the two is set. I chose not to enforce that they shouldn't both be present because it would just add unnecessary code and brittleness with no stability benefit.
* Make url a virtual field.

* Add @property annotation.
2017-12-04 20:51:19 -08:00
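A stripped-down illustration of the behaviour described above (this is not the acme Challenge class itself; the real one uses josepy fields): keep both values privately and let uri return whichever one was set.

```
class ChallengeSketch(object):
    def __init__(self, uri=None, url=None):
        self._uri = uri  # ACMEv1 name
        self._url = url  # ACMEv2 name

    @property
    def uri(self):
        # return whichever of the two was set; v2's url wins if present
        return self._url if self._url is not None else self._uri
```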
Brad Warren
4db7195e77 Fix coveralls (#5298) 2017-12-04 17:09:01 -08:00
Brad Warren
bb70962bb8 Stop using new mock functionality in tests (#5295)
* Remove assert_called_once from dns-route53

* Remove assert_called_once from main_test.py

* Remove assert_called() usage in dns-digitalocean

* Remove assert_called() usage in dns-route53

* Downgrade mock version in certbot-auto
2017-12-04 14:44:22 -08:00
Joona Hoikkala
dc78fd731e Distribution specific override functionality based on class inheritance (#5202)
Class inheritance based approach to distro specific overrides.

How it works:
The certbot-apache plugin entrypoint has been changed to entrypoint.ENTRYPOINT, which is a variable containing the appropriate override class for the system, if available.

Override classes register themselves using the decorator override.register(), which takes a list of distribution fingerprints (ID & LIKE variables in /etc/os-release, or platform.linux_distribution() as a fallback). These end up as keys in the dict override.OVERRIDE_CLASSES, and the values for the keys are references to the class that called the decorator, hence allowing self-registration of override classes when they are imported. The only file importing these override classes is entrypoint.py, so adding new override classes would need only one import in addition to the actual override class file.

Generic changes:

    Parser initialization has been moved to a separate class method, allowing easy override where needed.
    Cleaned up configurator.py a bit, and moved some helper functions to newly created apache_util.py
    Split Debian specific code from configurator.py to debian_override.py
    Changed define_cmd to apache_cmd because the parameters are for every distribution supporting this behavior, and we're able to use the value to build the additional configuration dump commands.
    Moved add_parser_mod() from configurator to parser add_mod()
    Added two new configuration dump parsing methods to update_runtime_variables() in parser: update_includes() and update_modules().
    Changed init_modules() in parser to accommodate the changes above. (ie. don't throw existing self.modules out).
    Moved OS based constants to their respective override classes.
    Refactored configurator class discovery in tests to make it easier to create test cases using a distribution-based override configurator class.
    tests.util.get_apache_configurator() now takes a keyword argument os_info, which is a string of the desired mock OS fingerprint response that's used for picking the right override class.

This PR includes two major generic additions that should vastly improve our parsing accuracy and quality:

    Includes are parsed from config dump from httpd binary. This is mandatory for some distributions (Like OpenSUSE) to get visibility over the whole configuration tree because of Include statements passed on in command line, and not via root httpd.conf file.
    Modules are parsed from config dump from httpd binary. This lets us jump into correct IfModule directives if for some reason we have missed the module availability (because of one being included on command line or such).
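A hedged sketch of what such a config dump query could look like, assuming an Apache 2.4 `apachectl` on PATH that supports `-t -D DUMP_MODULES` and `-t -D DUMP_INCLUDES`; the regexes are illustrative, not the plugin's real parser:

```python
# A sketch only: ask the httpd binary for its effective module and include
# lists and parse the plain-text dump it prints.
import re
import subprocess


def _dump(define):
    output = subprocess.check_output(
        ["apachectl", "-t", "-D", define],
        stderr=subprocess.STDOUT, universal_newlines=True)
    return output.splitlines()


def loaded_modules():
    """Return names such as 'ssl_module' from the DUMP_MODULES output."""
    modules = set()
    for line in _dump("DUMP_MODULES"):
        match = re.match(r"\s*(\w+_module)\s+\((?:shared|static)\)", line)
        if match:
            modules.add(match.group(1))
    return modules


def included_files():
    """Return config file paths reported by DUMP_INCLUDES."""
    files = []
    for line in _dump("DUMP_INCLUDES"):
        match = re.search(r"\(\s*(?:\*|\d+)\s*\)\s+(\S+)", line)
        if match:
            files.append(match.group(1))
    return files


if __name__ == "__main__":
    print(sorted(loaded_modules()))
    print(included_files())
```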

Distribution specific changes
Because of the generic changes, there are two distributions (or distribution families) that do not provide such functionality, so it had to be overridden in their respective override files. These distributions are:

    CentOS, because it deliberately limits httpd binary stdout using SELinux as a feature. We are doing opportunistic config dumps here however, in case SELinux enforcing is off.
    Gentoo, because it does not provide a way to invoke httpd with the command line parsed from its specific configuration file. Gentoo relies heavily on Define statements that are passed over from the APACHE2_OPTS variable in the /etc/conf.d/apache2 file, and most of the root Apache configuration depends on these values.

Debian

    Moved the Debian specific parts from configurator.py to Debian specific override.

CentOS

    Parsing of the /etc/sysconfig/httpd file for additional Define statements. This could hold other parameters too, but parsing everything off it would require a full Apache lexer. For CLI parameters, I think Defines are the most common ones. This is done in addition to the opportunistic parsing of the httpd binary config dump.
    Added CentOS default Apache configuration tree for realistic test cases.

Gentoo

    Parsing Defines from /etc/conf.d/apache2 variable APACHE2_OPTS, which holds additional Define statements to enable certain functionalities, enabling parts of the configuration in the Apache2 DOM. This is done instead of trying to parse httpd binary configuration dumps.
    Added default Apache configuration from Gentoo to testdata, including /etc/conf.d/apache2 file for realistic test cases.


* Distribution specific override functionality based on class inheritance

* Need to patch get_systemd_os_like as travis has proper os-release

* Added pydoc

* Move parser initialization to a method and fix Python 3 __new__ errors

* Parser changes to parse HTTPD config

* Try to get modules and includes from httpd process for better visibility over the configuration

* Had to disable duplicate-code because of test setup (PyCQA/pylint/issues/214)

* CentOS tests and linter fixes

* Gentoo override, tests and linter fixes

* Mock the process call in all the tests that require it

* Fix CentOS test mock

* Restore resetting modules list functionality for cleanup

* Move OS fingerprinting and constant mocks to parent class

* Fixes requested in review

* New entrypoint structure and started moving OS constants to override classes

* OS constants move continued, test and linter fixes

* Removed dead code

* Apache compatibility test changes to reflect OS constant restructure

* Test fix

* Requested changes

* Moved Debian specific tests to own test file

* Removed decorator based override class registration in favor of entrypoint dict

* Fix for update_includes for some versions of Augeas

* Take fedora fix into account in tests

* Review fixes
2017-12-04 11:49:18 -08:00
Jacob Hoffman-Andrews
73ba9af442 Don't echo Boulder logs on failure. (#5290)
The extensive logs made it hard to spot the actual failure.
2017-12-04 11:20:53 -08:00
Eccenux
840c943711 W:266,28: Redefining built-in 'list' (redefined-builtin) 2017-12-02 12:28:53 +01:00
Eccenux
abdde886fa code style 2017-12-02 12:25:58 +01:00
Jacob Hoffman-Andrews
7319cc975a Quiet pip install output. (#5288)
pip install generates a lot of lines of output that make it harder to see what
tox is running in general. This adds the -q flag to pip install.

At the same time, add `set -x` in install_and_test.sh and pip_install.sh so they
echo the commands they are running. This makes it a little clearer what's going
on in tests.

I didn't put `set -x` at the top or in the shebang, because moving it lower lets
us avoid echoing some of the messy if/then setup statements in these scripts,
which focussed attention on the pip install command.
2017-12-01 23:40:09 -08:00
Brad Warren
394dafd38c Revert requiring dnsmadeeasy extras for lexicon (#5291)
Fixes failures at https://travis-ci.org/certbot/certbot/jobs/310248574#L1558.

Additional context can be found at #5230 and 604584521a (diff-2eeaed663bd0d25b7e608891384b7298).
2017-12-01 17:00:24 -08:00
Jacob Hoffman-Andrews
8ce6ee5f3e Remove all but one BOULDER_INTEGRATION, and macOS (#5270)
These tests are retained in the test-everything branch, which has a Travis cron
job to run nightly.

Removing these speeds up the Certbot Travis builds dramatically for two reasons:
 - The Boulder integration tests are slow (10-12 minutes), and it's exceedingly
   rare for them to fail on one Python environment but not another.
 - The macOS tests take a very long time to run, because they need to wait for
   build slots on the limited number of macOS instances, which are often in high
   demand.
2017-12-01 16:10:16 -08:00
Brad Warren
b9b329ecf7 pin pkging tools that have dropped support (#5281) 2017-12-01 13:20:27 -08:00
Brad Warren
48173ed1cb Switch from nose to pytest (#5282)
* Use pipstrap to install a good version of pip

* Use pytest in cb-auto tests

* Remove nose usage in auto_test.py

* remove nose dev dep

* use pytest in test_tests

* Use pytest in tox

* Update dev dependency pinnings

* remove nose multiprocess lines

* Use pytest for coverage

* Use older py and pytest for old python versions

* Add test for Error.__str__

* pin pytest in oldest test

* Fix tests for DNS-DO plugin on py26

* Work around bug for Python 3.3

* Clarify dockerfile comments
2017-12-01 10:59:55 -08:00
Eccenux
20bca19420 Show a diff when re-creating certificate instead of full list of domains #5274 2017-11-30 20:24:49 +01:00
Brad Warren
d246ba78c7 Use pip3 if pip isn't available (#5277) 2017-11-29 13:09:25 -08:00
Jacob Hoffman-Andrews
8fd1d0d19e Small Travis cleanups (#5273)
* Test with no hosts.

* Simplify build matrix.

* Remove after_failure.
2017-11-28 18:22:01 -08:00
Noah Swartz
f5ed771d4f change some instances of help to flag (#5248) 2017-11-27 14:50:06 -08:00
Peter Eckersley
cdd89998e3 Add nginx to these weird instructions (#5243)
These are probably made obsolete by the instruction generator, and they don't include Ubuntu...
2017-11-27 14:49:19 -08:00
jonasbn
e795a79547 Lots of minor small cosmetic changes and addressing the feedback on uniformity (in the file) from @SwartzCr 2017-11-15 07:38:09 +01:00
jonasbn
02126c0961 Minor improvement to newly added documentation section 2017-11-15 07:24:54 +01:00
jonasbn
0b843bb851 Added some missing documentation 2017-11-15 07:23:34 +01:00
jonasbn
4d60f32865 Minor corrections to return types for improved formatting 2017-11-12 13:03:09 +01:00
jonasbn
069ce1c55f Merge branch 'master' of https://github.com/certbot/certbot 2017-11-12 00:32:45 +01:00
jonasbn
eb26e0aacf Updated parameter types for a lot of parameters; some aspects are still a bit unclear, hopefully a review can shed some light on these details 2017-11-12 00:32:24 +01:00
Brad Warren
686fa36b3b Install dnsmadeeasy extras from dns-lexicon (#5230)
* Add tools/pip_constraints.txt to pin all Python dependencies

* Use tools/pip_constraints.txt in tools/pip_install.sh

* Install dnsmadeeasy extras in dnsmadeeasy plugin
2017-11-08 10:58:00 -08:00
jonasbn
1173acfaf0 Making friends with the linter
lint: commands succeeded
congratulations :)
2017-11-07 22:18:11 +01:00
jonasbn
0aa9322280 Added a shot at what might be the proper type, I need to get a better understanding of certbot's datatypes 2017-11-07 21:47:59 +01:00
jonasbn
89485f7463 I think I figured out the authentication handler object 2017-11-07 21:40:35 +01:00
jonasbn
4e73d7ce00 Specified the list parameters after reading up on lists as parameters
Ref: https://stackoverflow.com/questions/3961007/passing-an-array-list-into-python
2017-11-07 21:24:30 +01:00
jonasbn
0137055c24 First shot at updates at documentation, plenty of questions left at issue #4736 2017-11-05 21:59:55 +01:00
Brad Warren
884fc56a3e Use pipstrap to ensure pip works on older systems (#5216)
* Use pipstrap in tools/_venv_common.sh

* Use _venv_common.sh in test_sdists
2017-11-03 10:59:56 -07:00
Joona Hoikkala
68e37b03c8 Nginx IPv6 support (#5178)
* Nginx IPv6 support

* Test and lint fixes

* IPv6 tests to Nginx plugin

* Make ipv6_info() port aware

* Named tuple values for readability

* Lint fix

* Requested changes
2017-10-31 19:41:32 -05:00
yomna
2a13f00301 Merge pull request #5205 from mvi-x/master
[#5155] - replaces isinstance(x, str) with isinstance(x, six.string_types)
2017-10-31 14:42:14 -07:00
yomna
f962b5c83d Forcing pip to use https on older docker images (#5214) 2017-10-31 12:52:40 -07:00
mvi
19a4e6079e [#5155] - replaces instances of isinstance(x, str) with isinstance(x, six.string_types) 2017-10-26 19:13:25 +02:00
ohemorange
e2ab940ac0 Fix bug that stopped nginx from finding new server block for redirect (#5198)
* fix bug that stopped nginx from finding new server block for redirect

* add regression test
2017-10-20 16:46:36 -07:00
ohemorange
3c1dafa9e9 Correctly test for existing Certbot redirect when adding an Nginx redirect block (#5192)
* add test that should fail on completion of this PR

* fix double redirect problem

* update existing test to match new whitespace
2017-10-19 14:56:53 -07:00
Jacob Hoffman-Andrews
a6cecd784b [#4535] - Unwrap 'max retries exceeded' errors (#4733)
Fixes #4535 

Extracts the relevant fields using a regex. We considered catching
specific exception types, and referencing their fields, but the types
raised by `requests` are not well documented and may not be long
term stable. If the regex fails to match, for instance due to a change
in the exception message, the new exception message will just be
passed through.
2017-10-19 14:16:59 -07:00
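A small sketch of the regex-unwrapping idea; the pattern and the sample message are illustrative, since the exact text produced by `requests`/urllib3 can vary:

```python
import re

# Illustrative pattern: pull the host and the underlying cause out of a
# urllib3-style "Max retries exceeded" message. If it doesn't match, the
# original text is passed through untouched, exactly as described above.
_MAX_RETRIES = re.compile(
    r"host='(?P<host>[^']+)'.*?Max retries exceeded.*?"
    r"(?:Caused by )?(?P<cause>[A-Za-z]+Error\(?[^)]*\)?)",
    re.DOTALL)


def friendly_error(message):
    match = _MAX_RETRIES.search(message)
    if match is None:
        return message  # unrecognized format: don't lose information
    return "Failed to reach {host}: {cause}".format(**match.groupdict())


if __name__ == "__main__":
    raw = ("HTTPSConnectionPool(host='acme-v01.api.letsencrypt.org', port=443): "
           "Max retries exceeded with url: /directory (Caused by "
           "NewConnectionError('...: Failed to establish a new connection'))")
    print(friendly_error(raw))
```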
Felix Yan
5d2f6eb8ed Fix typos in certbot_apache/tests/configurator_test.py (#5193) 2017-10-19 11:23:07 -07:00
ohemorange
95a7d45856 Nginx creates a vhost block if no matching block is found (#5153)
* Allow authentication if there's no appropriate vhost

* Update test

* add flag to suppress raising error if no match is found

* Allow installation if there's no appropriate vhost

* remove traceback

* make new vhost ssl

* Fix existing bugs in nginxparser.py and obj.py

* Switch isinstance(x, str) to isinstance(x, six.string_types) in the Nginx plugin

* remove unused import

* remove unneeded custom copy from Addr

* Add docstring for create_new_vhost_from_default

* add test for create_new_vhost_from_default

* add configurator tests and leave finding the first server block for another PR

* don't assume order from a set

* address multiple default_server problem

* don't add vhosts twice

* update unit tests

* update docstring

* Add logger.info message for using default address in tlssni01 auth
2017-10-13 12:29:02 -07:00
Joona Hoikkala
99f00d21c4 Skip menu in webroot plugin when there's nothing to choose from (#5183)
* Skip menu in webroot, when there's nothing to choose from

* Added testcase
2017-10-13 12:25:33 -07:00
Brad Warren
7c11158810 Retry failures to start boulder (#5176)
Occasionally a network error prevents Docker from starting boulder causing
Travis tests to fail like it did at
https://travis-ci.org/certbot/certbot/jobs/282923098. This works around the
problem by using travis_retry to try to start boulder again if it fails.

This also moves the logic of waiting for boulder to start into
tests/boulder-fetch.sh so people running integration tests locally can benefit.
2017-10-12 17:00:13 -07:00
Joona Hoikkala
232f5a92d1 Fix naming in error message (#5181) 2017-10-11 08:18:41 -07:00
Brad Warren
1081a2501f integration test to prevent regressions of #5115 (#5172) 2017-10-11 08:18:17 -07:00
Brad Warren
03cbe9dd86 Document --no-directory-hooks (#5171) 2017-10-11 08:16:48 -07:00
Brad Warren
cacc40817b Update brew before installing packages (#5182)
* Update brew
2017-10-10 17:30:51 -07:00
r5d
d2c16fcb62 certbot: Flush output after write in IDisplay methods. (#5164)
- Update `notification`, `yesno`, `checklist`, `_print_menu`, and
  `_get_valid_int_ans` methods in `certbot.display.util.FileDisplay`.
- Update `notification` method in
  `certbot.display.util.NoninteractiveDisplay`.

Addresses issue #4879.
2017-10-04 18:06:57 -07:00
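A tiny sketch of the flush-after-write behaviour; the class and method names mirror the description but are not the exact `certbot.display.util` code:

```python
import sys


class FileDisplay(object):
    """Illustrative display helper that flushes after every message."""

    def __init__(self, outfile=sys.stdout):
        self.outfile = outfile

    def notification(self, message):
        self.outfile.write(message + "\n")
        # Flush immediately so buffered stdout/stderr don't interleave oddly
        # (the symptom reported in #4879).
        self.outfile.flush()


if __name__ == "__main__":
    FileDisplay().notification("Congratulations! Your certificate has been saved.")
```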
Brad Warren
a8051b58eb Update changelog to reflect 0.19.0. (#5170) 2017-10-04 17:58:10 -07:00
Brad Warren
2d4f36cc9f Merge pull request #5169 from certbot/candidate-0.19.0
Release 0.19.0
2017-10-04 16:58:07 -07:00
Brad Warren
13b4a4e1c2 Bump version to 0.20.0 2017-10-04 15:57:16 -07:00
Brad Warren
1f258449a4 Release 0.19.0 2017-10-04 12:11:20 -07:00
yomna
3087b436f3 Delete after revoke [#4109] (#4914)
*     Switching from old branch (issue-4109) and addressing changes requested
    in last iteration of review:
    80aa857fd2

    Requested changes that were addressed:
    - fixed outdated docstring for `cert_path_to_lineage`
    - removed `full_archive_dir_from_renewal_conf` and replaced with `full_archive_path` (and `_full_archive_path` -> `full_archive_path`)
    - matching on `cert` instead of `chain` in `cert_manager.cert_path_to_lineage`
    - fixed the two coding wrongs make a right issue

    Requested changes which were not addressed:
    - moving `cert_path_to_lineage` from `cert_manager` to `storage`,
      as it would introduce a hard to resolve circular dependency.

* Update integration tests to handle default deletion after revoke.

* Swapping test domains.

* Addressing PR feedback:
	- calling storage.full_archive_path with a ConfigObj instead of None
	- Removing lambda x: x.chain_path as an option to match against

* Addressing PR feedback: it's expected that len(pattern) is 0, so handle that case properly.

* Testing of conflicting values of --cert-name and --cert-path non-interactive mode.

* Silly test for when neither certname nor cert-path were specified.

* Changing archive_files to a private function, because mocking nested functions seems impossible.

* Tests for storage.cert_path_for_cert_name

* Splitting out _acceptable_matches

* Some tests for cert_manager.cert_path_to_lineage

* Offerings to the Lint God

* Cleaner way of dealing with files in archive dirs

* Handling the two different use cases of match_and_check_overlaps a bit better

* late night syntax errors

* Test for when multiple lineages share an archive dir

* Tests for certbot.cert_manager.match_and_check_overlaps

* Removing unneeded nesting

* Lint errors that Travis caught that didn't show up locally

* Adding two integration tests (matching & mismatched --cert-path, --cert-name)  based on feedback.

* Asking the user if they want to delete in interactive mode.
2017-10-03 16:36:26 -07:00
Brad Warren
356471cdf6 Add hook directories (#5151)
* Add hook dir constants

* Add hook dir properties to configuration

* test hook dir properties

* reuse certbot.util.is_exe

* Add certbot.hooks.list_hooks

* test list_hooks

* Run pre-hooks in directory

* Run deploy-hooks in directory

* Run post-hooks in directory

* Refactor and update certbot/tests/hook_test.py

* Add integration tests for hook directories

* Have Certbot create hook directories.

* document renewal hook directories

* Add --no-directory-hooks

* Make minor note about locale independent sorting
2017-10-03 13:52:02 -07:00
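A rough sketch of the directory-hook idea, assuming hypothetical helpers that mirror the names above (`list_hooks`, an executable check, bytewise sorting); not the actual `certbot.hooks` implementation:

```python
import os
import subprocess


def list_hooks(dir_path):
    """Return absolute paths of executable files in dir_path, sorted bytewise.

    Sorting raw names instead of using locale-aware collation keeps the
    execution order stable across systems.
    """
    if not os.path.isdir(dir_path):
        return []
    paths = (os.path.join(dir_path, name) for name in sorted(os.listdir(dir_path)))
    return [p for p in paths if os.path.isfile(p) and os.access(p, os.X_OK)]


def run_directory_hooks(config_dir="/etc/letsencrypt", kind="pre"):
    """Run every hook found in <config_dir>/renewal-hooks/<kind>."""
    hook_dir = os.path.join(config_dir, "renewal-hooks", kind)
    for hook in list_hooks(hook_dir):
        subprocess.call([hook])


if __name__ == "__main__":
    print(list_hooks("/etc/letsencrypt/renewal-hooks/deploy"))
```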
r5d
b9d129bd43 certbot: Stop using print in log module. (#5160)
* Update certbot.log.post_arg_parse_except_hook function.
* Update certbot.tests.log_test._test_common method.

See discussion #3720.
2017-10-03 12:52:41 -07:00
Giacomo Ghidini
b0e5809df2 [#5154] Enable certificate verification (incl. revocation) on Docker (#5159)
o Install `openssl` as part of `.certbot-deps`
o `certbot` on Docker container uses `openssl` to verify certificate
2017-10-02 18:34:59 -07:00
Joona Hoikkala
46052f826c Handle NoneType from Augeas better in Apache parser get_arg (#5135)
* Fix #4245

* Simpler, more accurate test

* Do not add empty values to parser modules

* Py26 fix
2017-10-02 16:18:37 -07:00
Brad Warren
5f6b1378ec Fixes #5115 (#5150) 2017-10-02 14:33:49 -07:00
Brad Warren
34d78ff626 Fix hook test. (#5152)
Up until now, this test was written incorrectly. In addition, when it failed,
it simply printed error messages rather than reporting that the test failed.
This fixes both of these problems.
2017-10-02 13:20:35 -07:00
Brad Warren
cad7d4c8ed Update master to reflect 0.18.2 (#5127)
* Release 0.18.2

(cherry picked from commit d031c42b98)

* Bump version to 0.19.0
2017-09-27 16:02:40 -07:00
Joona Hoikkala
ba84b7ab49 Add test to prevent regressions of #4183 (#5134) 2017-09-27 15:51:28 -07:00
ohemorange
7412099567 Allow multiple interactive certname selections in certbot delete (#5133) 2017-09-27 15:47:40 -07:00
r5d
85deca588f Stop using print in certbot.main module. (#5121)
* Stop using print in `certbot.main` module.

* Update certbot.main.plugins_cmd` function.

* Update test methods `test_plugins_no_args`,
`test_plugins_no_args_unprivileged`, `test_plugins_init` and
`test_plugins_prepare` in `cerbot.tests.MainTest` class.

Addresses #3720.

* certbot: Add `patch_get_utility_with_stdout` function.

* Add functions `certbot.tests.util.patch_get_utility_with_stdout`
  and `certbot.tests.util._create_get_utility_mock_with_stdout`.

* certbot: tests: Update tests in MainTest.

* Update methods `test_plugins_no_args`,
`test_plugins_no_args_unprivileged`, and `test_plugins_init`,
`test_plugins_prepare` to use `patch_get_utility_with_stdout`.

* certbot: tests: Update _create_get_utility_mock_with_stdout.

* Update certbot.tests.util._create_get_utility_mock_with_stdout
  function. The mock function for all IDisplay methods, except
  `notification` method, calls _write_msg and _assert_valid_call.

* certbot: tests: Update `patch_get_utility_with_stdout`

* Update doc string.
* Argument stdout's default value is None now.

* certbot: tests: Update util._create_get_utility_mock_with_stdout.
2017-09-25 18:42:31 -07:00
Brad Warren
8b7d6c4ea3 Update changelog for 0.18.2 (#5128) 2017-09-25 16:46:04 -07:00
Christian Becker
36d5221bac certbot-dns-google: enable automatic credential lookup on google cloud (#5117)
- when no credentials are passed it will try to get valid credentials
using the google metadata service
- this is a feature of the google SDK, so we don't need to handle that
explicitly
- previous behaviour with a credentials file is retained
2017-09-25 12:17:15 -07:00
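A hedged illustration of the credentials fallback, written against the `google-auth` package as a stand-in for whatever Google SDK the plugin actually uses:

```python
# Illustrative only: fall back to Application Default Credentials when no
# service-account file is supplied. On GCE the default flow transparently
# queries the instance metadata service, matching the behaviour described.
import google.auth
from google.oauth2 import service_account


def get_credentials(account_json=None):
    if account_json:
        # Explicit credentials file: the pre-existing behaviour is retained.
        return service_account.Credentials.from_service_account_file(account_json)
    # No file given: let the SDK discover credentials itself.
    credentials, _project = google.auth.default()
    return credentials
```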
Joona Hoikkala
1ce813c3cc Do not parse disabled configuration files from under sites-available on Debian / Ubuntu (#4104)
This changes the apache plugin behaviour to only parse enabled configuration files and to respect the --apache-vhost-root CLI parameter for new SSL vhost creation. If --apache-vhost-root isn't defined, or doesn't exist, the SSL vhost will be created in the originating non-SSL vhost's directory.

This PR also implements an actual check for vhost enabled state, and makes sure parser.parse_file() does not discard changes in the Augeas DOM by doing an autosave.

Also handles enabling the new SSL vhost, if it's on a path that's not parsed by Apache.
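As a loose, Debian-flavoured illustration of an "is this vhost enabled?" check (the paths and layout here are assumptions, not the plugin's real logic):

```python
import os


def is_vhost_enabled(vhost_path, server_root="/etc/apache2"):
    """Treat a vhost file as enabled if something in sites-enabled resolves to it."""
    target = os.path.realpath(vhost_path)
    enabled_dir = os.path.join(server_root, "sites-enabled")
    if not os.path.isdir(enabled_dir):
        # No sites-enabled layout (non-Debian): assume parsed files are live.
        return True
    for name in os.listdir(enabled_dir):
        if os.path.realpath(os.path.join(enabled_dir, name)) == target:
            return True
    return False


if __name__ == "__main__":
    print(is_vhost_enabled("/etc/apache2/sites-available/example.com.conf"))
```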

Fixes: #1328
Fixes: #3545
Fixes: #3791
Fixes: #4523
Fixes: #4837
Fixes: #4905

* First changes

* Handle rest of the errors

* Test fixes

* Final fixes

* Make parse_files accessible and fix linter problems

* Activate vhost at later time

* Cleanup

* Add a new test case, and fix old

* Enable site later in deploy_cert

* Make apache-conf-test default dummy configuration enabled

* Remove is_sites_available as obsolete

* Cleanup

* Brought back conditional vhost_path parsing

* Parenthesis

* Fix merge leftovers

* Fix to work with the recent changes to new file creation

* Added fix and tests for non-symlink vhost in sites-enabled

* Made vhostroot parameter for ApacheParser optional, and removed extra_path

* Respect vhost-root, and add Include statements to root configuration if needed

* Fixed site enabling order to prevent apache restart error while enabling mod_ssl

* Don't exclude Ubuntu / Debian vhost-root cli argument

* Changed the SSL vhost directory selection priority

* Requested fixes for paths and vhost discovery

* Make sure the Augeas DOM is written to disk before loading new files

* Actual checking for if the file is parsed within existing Apache configuration

* Fix the order of dummy SSL directives addition and enabling modules

* Restructured site_enabled checks

* Enabling vhost correctly for non-debian systems
2017-09-25 12:03:09 -07:00
Noah Swartz
ade01d618b add info about -d (#5097) 2017-09-21 08:52:01 -07:00
Michał Zegan
5a4028c763 fix dns-rfc2136 plugin not respecting cnames (#5101)
* fix dns-rfc2136 plugin not respecting cnames

The plugin does not work if the domain of a certificate is found to have a CNAME record in DNS.
That is because when the plugin tries to find the zone boundary, it searches from the domain upward for an SOA record, and each DNS response is checked for an empty answer: an empty answer is taken to mean that no SOA record is present and the higher-level domain has to be checked, while a non-empty answer section is taken to mean that this domain is a zone root.
However, if the initial domain, or any upper-level domain except the zone root, has a CNAME record pointing to the zone root, then the server will, instead of returning an empty answer, return one containing two records: first a CNAME pointing to the zone root, then the SOA record of the zone root. That makes the check fail and a wrong domain is used as the zone name during the update.
Fix that by replacing the check for an empty answer with an explicit search of the response's answer section for an SOA record matching the domain being checked.

* dns-rfc2136: fix lint errors
2017-09-20 11:29:48 -07:00
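A minimal sketch of the fixed zone-boundary search using dnspython; the nameserver address and domain are placeholders, and the real plugin works against the server configured for RFC 2136 updates:

```python
import dns.message
import dns.name
import dns.query
import dns.rdatatype


def find_zone(domain, nameserver="192.0.2.1"):
    """Walk up from `domain` and return the first name that owns an SOA record."""
    candidate = dns.name.from_text(domain)
    while True:
        request = dns.message.make_query(candidate, dns.rdatatype.SOA)
        response = dns.query.udp(request, nameserver, timeout=10)
        # Don't just test "is the answer section non-empty": a CNAME at this
        # name makes the server return CNAME + the target zone's SOA. Require
        # an SOA record owned by the exact name we asked about.
        for rrset in response.answer:
            if rrset.rdtype == dns.rdatatype.SOA and rrset.name == candidate:
                return candidate.to_text()
        try:
            candidate = candidate.parent()
        except dns.name.NoParent:
            raise RuntimeError("No zone apex found for %s" % domain)
```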
yomna
48fd7ee260 Updating the AWS letstest documentation (#5091)
* Better documentation for working w/ AWS.

* Addressing feedback.

* profile name -> key name
2017-09-19 10:25:36 -07:00
Brad Warren
6aabb31eb5 Merge pull request #5118 from erikrose/certbot-auto-timeout
Certbot auto timeout
2017-09-18 15:56:26 -07:00
Noah Swartz
3acde31ed3 Merge pull request #5096 from certbot/0.18.1-release-notes
Add 0.18.1 release notes
2017-09-18 13:45:01 -07:00
Erik Rose
e7884898ec Simplify and stop repeating knowledge by hard-coding timeout into HttpsGetter.get().
Also, switch timeout to 30 so it has every opportunity to actually work, even in bad network weather. (I posit that people are used to 30-second timeouts.)

Stop catching URLError explicitly, since it's a subclass of the already-caught IOError.
2017-09-18 09:55:16 -04:00
Chris J
9be4fedeec Add timeout to certbot-auto HTTPS fetches. Fix #4473. 2017-09-18 09:52:17 -04:00
Chris Julian
f0caf5b04f #4435. CLI Argument Default Organization (#5037)
* Enhancement #4435. Organizing defaults in prepare_and_parse_args()

* Playing fast and loose with tox.

Discovered screwy case involving flag_default returning empty list (domains)

* Setting defaults for more low-hanging fruit. Some caveats remain.

* key_path default to None

* Applying PR feedback: explicit defaults even where redundant

* Obsessive quote consistency

* Set testing config path arguments to a 'certonly' default

* Copy the default domains list rather than get reference

* Build a testing Config from CLI_DEFAULTS

* Update some email tests for use with defaults in config.

config.email and config.noninteractive_mode in these tests
used to be magic-mock'd, so were True-ish. The default
email is now None and default noninteractive_mode is
False, so update in tests accordingly.

* Lint...

* Copy anything retrieved using flag_defaults. Apply this to test_cli_ini_domains too.

* Put those quotes back. Backslashes are just the worst.

* Remove vestigial line

* A test to ensure no regressions around modifying CLI_DEFAULTS
2017-09-15 17:10:43 -07:00
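A tiny illustration of the "copy rather than hand out a reference" point above; `CLI_DEFAULTS` and `flag_default` here are simplified stand-ins:

```python
import copy

# Simplified stand-in for the defaults table.
CLI_DEFAULTS = {
    "domains": [],
    "email": None,
    "noninteractive_mode": False,
}


def flag_default(name):
    # Deep-copy mutable defaults so a caller that appends to, say, the domains
    # list cannot mutate the shared CLI_DEFAULTS table.
    return copy.deepcopy(CLI_DEFAULTS[name])


if __name__ == "__main__":
    domains = flag_default("domains")
    domains.append("example.com")
    print(CLI_DEFAULTS["domains"])  # still [] - the default was not mutated
```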
Seong-ho Cho
f6be07da74 fix #5111 AttributeError occured with >=pyOpenSSL-17.2.0 (#5112) 2017-09-15 16:57:10 -07:00
r5d
7c16e0da26 certbot: Let plugins_cmd be run as unprivileged user. (#5103)
* certbot: Let plugins_cmd be run as unprivileged user.

* certbot/main.py (main): Update function.

Addresses issue #4350.

* Add test certbot.tests.main_test.MainTest.test_plugins_no_args_unprivileged
2017-09-15 16:55:05 -07:00
Noah Swartz
03624fa9db add domain name when having issues in the warn output (#5105) 2017-09-15 16:51:06 -07:00
Noah Swartz
d3a00a97a3 fix NAME to CERTNAME (#5114) 2017-09-15 16:47:08 -07:00
Brad Warren
4bc0c83ca7 Add --no-self-upgrade to test farm test. (#5095) 2017-09-14 17:33:32 -07:00
Brad Warren
7d0a77ffcf Release 0.18.1 (#5093)
* Release 0.18.1

(cherry picked from commit 8010822a0b)

* Bump version to 0.19.0
2017-09-14 17:32:45 -07:00
Noah Swartz
837f691992 Merge pull request #5108 from certbot/issue_5107
add a help output for cert-name
2017-09-13 16:39:54 -07:00
Noah Swartz
174a006d9c add renew to existing doc 2017-09-13 11:37:07 -07:00
Noah Swartz
b529250535 add a help output for cert-name 2017-09-12 10:52:51 -07:00
Brad Warren
134d499b07 Add 0.18.1 release notes 2017-09-08 13:33:47 -07:00
Brad Warren
68283940cd Test farm improvements (#5088)
* prevent regressions of #5082

* Fix test_leauto_upgrades.sh

test_leauto_upgrades.sh has incorrectly been succeeding because, while peep
doesn't work with newer versions of pip and letsencrypt-auto would crash,
the output included the version number so we reported the test as passing.
This updates letsencrypt-auto to the oldest version that still works for the
purpose of the test and sets pipefail so errors are properly reported.

* Test symlink creation in test_leauto_upgrades.sh

* Pin dependencies in test_sdists.sh.

* Fix permissions errors in test_tests.sh
2017-09-07 17:54:40 -07:00
Brad Warren
82d0ff1df2 Fix permissions error when upgrading certbot-auto. (#5086)
Now we always check if we have root access if --cb-auto-has-root is not given
on the command line. This allows certbot-auto to properly acquire root when
upgrading from an older version. People upgrading from 0.18.0 to 0.18.1 may
check for root access twice, however, if root's user ID is 0, this check is
essentially a noop. If root's user ID is not 0, we'll request root access a 2nd
time during this upgrade.
2017-09-07 17:23:57 -07:00
Brad Warren
d4fe812508 Update changelog to reflect 0.18.0 (#5081) 2017-09-07 16:06:07 -07:00
Brad Warren
6988491b67 Merge pull request #5080 from certbot/candidate-0.18.0
Release 0.18.0
2017-09-07 05:57:12 -07:00
Brad Warren
1a79f82082 Also check new path when determining cli_command (#5082) 2017-09-06 20:22:27 -07:00
yomna
9fb132ba69 Merge pull request #5075 from certbot/specify-min-six-version
Specify the minimum six version in ACME
2017-09-05 17:49:42 -07:00
Brad Warren
a7267b0fcd Bump version to 0.19.0 2017-09-05 16:07:03 -07:00
Brad Warren
d710c441e2 Specify the minimum six version in acme 2017-09-05 10:07:32 -07:00
Bob Strecansky
a8e1df6e55 [#4535] - Unwrap max retries exceeded errors 2017-08-07 20:15:05 -04:00
Bob Strecansky
b3216727da [#4535] - Unwrap max retries exceeded errors 2017-08-07 19:34:20 -04:00
Bob Strecansky
8a78ef9675 [#4535] - Unwrap max retries exceeded errors 2017-08-07 15:33:01 -04:00
Bob Strecansky
d49d7c57ea Merge branch 'master' of github.com:certbot/certbot into max_retries_exceeded 2017-08-07 15:01:46 -04:00
Bob Strecansky
2e7ec00e8c [#4535] - Unwrap max retries exceeded errors 2017-08-07 08:55:43 -04:00
Bob Strecansky
f7dedae388 [#4535] - Unwrap max retries exceeded errors 2017-08-07 08:39:39 -04:00
Bob Strecansky
9ae987d72b [#4535] - Unwrap max retries exceeded errors 2017-08-07 08:35:00 -04:00
Bob Strecansky
0c14d9372d Merge branch 'master' of github.com:certbot/certbot into max_retries_exceeded 2017-08-07 08:14:54 -04:00
Bob Strecansky
9b8c8f103e [#4535] - Unwrap max retries exceeded errors 2017-08-06 23:44:28 -04:00
Bob Strecansky
8555f4a0bd [#4535] - Unwrap max retries exceeded errors 2017-08-06 23:30:11 -04:00
Bob Strecansky
521f783020 [#4535] - Unwrap max retries exceeded errors 2017-08-06 23:02:07 -04:00
Bob Strecansky
5fb1568b6e [#4535] - Unwrap max retries exceeded errors 2017-08-06 22:26:17 -04:00
Bob Strecansky
57e664077f [#4535] - Unwrap max retries exceeded errors - fixing spacing 2017-08-06 10:51:10 -04:00
Bob Strecansky
880c35f3e3 [#4535] - Unwrap max retries exceeded errors 2017-08-05 10:39:04 -04:00
Bob Strecansky
3cc94798b6 [#4535] - Unwrap max retries exceeded errors 2017-08-05 10:26:01 -04:00
Bob Strecansky
959d72feb0 [#4535] - Unwrap max retries exceeded errors 2017-08-05 10:16:47 -04:00
Bob Strecansky
1e71ff5377 [#4535] - Unwrap max retries exceeded errors 2017-08-02 22:59:27 -04:00
Bob Strecansky
18d3df78e8 [#4535] - Unwrap max retries exceeded errors 2017-07-26 20:37:13 -04:00
350 changed files with 18001 additions and 7167 deletions

View File

@@ -1,3 +1,2 @@
[report]
# show lines missing coverage in output
show_missing = True
omit = */setup.py

7
.gitignore vendored
View File

@@ -35,3 +35,10 @@ tests/letstest/*.pem
tests/letstest/venv/
.venv
# pytest cache
.cache
.mypy_cache/
# docker files
.docker

View File

@@ -41,7 +41,7 @@ load-plugins=linter_plugin
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=fixme,locally-disabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import
disable=fixme,locally-disabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import,duplicate-code
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used

View File

@@ -5,53 +5,46 @@ cache:
- $HOME/.cache/pip
before_install:
- '[ $TRAVIS_OS_NAME == linux ] && dpkg -s libaugeas0 || brew install augeas python3'
- '([ $TRAVIS_OS_NAME == linux ] && dpkg -s libaugeas0) || (brew update && brew install augeas python3 && brew upgrade python && brew link python)'
before_script:
- 'if [ $TRAVIS_OS_NAME = osx ] ; then ulimit -n 1024 ; fi'
# using separate envs with different TOXENVs creates 4x1 Travis build
# matrix, which allows us to clearly distinguish which component under
# test has failed
matrix:
include:
- python: "2.7"
env: TOXENV=cover
- python: "2.7"
env: TOXENV=lint
- python: "2.7"
env: TOXENV=py27-oldest BOULDER_INTEGRATION=1
env: TOXENV=py27_install BOULDER_INTEGRATION=v1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "2.6"
env: TOXENV=py26 BOULDER_INTEGRATION=1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "2.7"
env: TOXENV=py27_install BOULDER_INTEGRATION=1
env: TOXENV=py27_install BOULDER_INTEGRATION=v2
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- sudo: required
env: TOXENV=apache_compat
services: docker
before_install:
addons:
- python: "2.7"
env: TOXENV=cover FYI="this also tests py27"
- sudo: required
env: TOXENV=nginx_compat
services: docker
before_install:
addons:
- python: "2.7"
env: TOXENV=lint
- python: "3.5"
env: TOXENV=mypy
- python: "2.7"
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
sudo: required
services: docker
- python: "3.4"
env: TOXENV=py34
sudo: required
services: docker
- python: "3.6"
env: TOXENV=py36
sudo: required
services: docker
- sudo: required
env: TOXENV=le_auto_precise
env: TOXENV=apache_compat
services: docker
before_install:
addons:
@@ -60,60 +53,11 @@ matrix:
services: docker
before_install:
addons:
- sudo: required
env: TOXENV=le_auto_wheezy
services: docker
before_install:
addons:
- sudo: required
env: TOXENV=le_auto_centos6
services: docker
before_install:
addons:
- sudo: required
env: TOXENV=docker_dev
services: docker
before_install:
addons:
- python: "2.7"
env: TOXENV=apacheconftest
sudo: required
- python: "3.3"
env: TOXENV=py33 BOULDER_INTEGRATION=1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "3.4"
env: TOXENV=py34 BOULDER_INTEGRATION=1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "3.5"
env: TOXENV=py35 BOULDER_INTEGRATION=1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "3.6"
env: TOXENV=py36 BOULDER_INTEGRATION=1
sudo: required
after_failure:
- sudo cat /var/log/mysql/error.log
- ps aux | grep mysql
services: docker
- python: "2.7"
env: TOXENV=nginxroundtrip
- language: generic
env: TOXENV=py27
os: osx
- language: generic
env: TOXENV=py36
os: osx
# Only build pushes to the master branch, PRs, and branches beginning with
@@ -130,17 +74,6 @@ branches:
sudo: false
addons:
# Custom /etc/hosts required for simple verification of http-01
# and tls-sni-01, and for certbot_test_nginx
hosts:
- le.wtf
- le1.wtf
- le2.wtf
- le3.wtf
- nginx.wtf
- boulder
- boulder-mysql
- boulder-rabbitmq
apt:
sources:
- augeas
@@ -160,8 +93,10 @@ addons:
- libapache2-mod-wsgi
- libapache2-mod-macro
install: "travis_retry pip install tox coveralls"
script: 'travis_retry tox && ([ "xxx$BOULDER_INTEGRATION" = "xxx" ] || ./tests/travis-integration.sh)'
install: "travis_retry $(command -v pip || command -v pip3) install tox coveralls"
script:
- travis_retry tox
- '[ -z "${BOULDER_INTEGRATION+x}" ] || (travis_retry tests/boulder-fetch.sh && tests/tox-boulder-integration.sh)'
after_success: '[ "$TOXENV" == "cover" ] && coveralls'

View File

@@ -2,6 +2,360 @@
Certbot adheres to [Semantic Versioning](http://semver.org/).
## 0.23.0 - 2018-04-04
### Added
* Support for OpenResty was added to the Nginx plugin.
### Changed
* The timestamps in Certbot's logfiles now use the system's local time zone
rather than UTC.
* Certbot's DNS plugins that use Lexicon now rely on Lexicon>=2.2.1 to be able
to create and delete multiple TXT records on a single domain.
* certbot-dns-google's test suite now works without an internet connection.
### Fixed
* Removed a small window during which, if an error occurred, Certbot wouldn't
clean up performed challenges.
* The parameters `default` and `ipv6only` are now removed from `listen`
directives when creating a new server block in the Nginx plugin.
* `server_name` directives enclosed in quotation marks in Nginx are now properly
supported.
* Resolved an issue preventing the Apache plugin from starting Apache when it's
not currently running on RHEL and Gentoo based systems.
Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being, however, the only
packages with changes other than their version number were:
* certbot
* certbot-apache
* certbot-dns-cloudxns
* certbot-dns-dnsimple
* certbot-dns-dnsmadeeasy
* certbot-dns-google
* certbot-dns-luadns
* certbot-dns-nsone
* certbot-dns-rfc2136
* certbot-nginx
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/50?closed=1
## 0.22.2 - 2018-03-19
### Fixed
* A type error introduced in 0.22.1 that would occur during challenge cleanup
when a Certbot plugin raises an exception while trying to complete the
challenge was fixed.
Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being, however, the only
packages with changes other than their version number were:
* certbot
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/53?closed=1
## 0.22.1 - 2018-03-19
### Changed
* The ACME server used with Certbot's --dry-run and --staging flags is now
Let's Encrypt's ACMEv2 staging server which allows people to also test ACMEv2
features with these flags.
### Fixed
* The HTTP Content-Type header is now set to the correct value during
certificate revocation with new versions of the ACME protocol.
* When using Certbot with Let's Encrypt's ACMEv2 server, it would add a blank
line to the top of chain.pem and between the certificates in fullchain.pem
for each lineage. These blank lines have been removed.
* Resolved a bug that caused Certbot's --allow-subset-of-names flag not to
work.
* Fixed a regression in acme.client.Client that caused the class to not work
when it was initialized without a ClientNetwork which is done by some of the
other projects using our ACME library.
Despite us having broken lockstep, we are continuing to release new versions of
all Certbot components during releases for the time being, however, the only
packages with changes other than their version number were:
* acme
* certbot
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/51?closed=1
## 0.22.0 - 2018-03-07
### Added
* Support for obtaining wildcard certificates and a newer version of the ACME
protocol such as the one implemented by Let's Encrypt's upcoming ACMEv2
endpoint was added to Certbot and its ACME library. Certbot still works with
older ACME versions and will automatically change the version of the protocol
used based on the version the ACME CA implements.
* The Apache and Nginx plugins are now able to automatically install a wildcard
certificate to multiple virtual hosts that you select from your server
configuration.
* The `certbot install` command now accepts the `--cert-name` flag for
selecting a certificate.
* `acme.client.BackwardsCompatibleClientV2` was added to Certbot's ACME library
which automatically handles most of the differences between new and old ACME
versions. `acme.client.ClientV2` is also available for people who only want
to support one version of the protocol or want to handle the differences
between versions themselves.
* certbot-auto now supports the flag --install-only which has the script
install Certbot and its dependencies and exit without invoking Certbot.
* Support for issuing a single certificate for a wildcard and base domain was
added to our Google Cloud DNS plugin. To do this, we now require your API
credentials have additional permissions, however, your credentials will
already have these permissions unless you defined a custom role with fewer
permissions than the standard DNS administrator role provided by Google.
These permissions are also only needed for the case described above so it
will continue to work for existing users. For more information about the
permissions changes, see the documentation in the plugin.
### Changed
* We have broken lockstep between our ACME library, Certbot, and its plugins.
This means that the different components do not need to be the same version
to work together like they did previously. This makes packaging easier
because not every piece of Certbot needs to be repackaged to ship a change to
a subset of its components.
* Support for Python 2.6 and Python 3.3 has been removed from ACME, Certbot,
Certbot's plugins, and certbot-auto. If you are using certbot-auto on a RHEL
6 based system, it will walk you through the process of installing Certbot
with Python 3 and refuse to upgrade to a newer version of Certbot until you
have done so.
* Certbot's components now work with older versions of setuptools to simplify
packaging for EPEL 7.
### Fixed
* Issues caused by Certbot's Nginx plugin adding multiple ipv6only directives
have been resolved.
* A problem where Certbot's Apache plugin would add redundant include
directives for the TLS configuration managed by Certbot has been fixed.
* Certbot's webroot plugin now properly deletes any directories it creates.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/48?closed=1
## 0.21.1 - 2018-01-25
### Fixed
* When creating an HTTP to HTTPS redirect in Nginx, we now ensure the Host
header of the request is set to an expected value before redirecting users to
the domain found in the header. The previous way Certbot configured Nginx
redirects was a potential security issue which you can read more about at
https://community.letsencrypt.org/t/security-issue-with-redirects-added-by-certbots-nginx-plugin/51493.
* Fixed a problem where Certbot's Apache plugin could fail HTTP-01 challenges
if basic authentication is configured for the domain you request a
certificate for.
* certbot-auto --no-bootstrap now properly tries to use Python 3.4 on RHEL 6
based systems rather than Python 2.6.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/49?closed=1
## 0.21.0 - 2018-01-17
### Added
* Support for the HTTP-01 challenge type was added to our Apache and Nginx
plugins. For those not aware, Let's Encrypt disabled the TLS-SNI-01 challenge
type which was what was previously being used by our Apache and Nginx plugins
last week due to a security issue. For more information about Let's Encrypt's
change, click
[here](https://community.letsencrypt.org/t/2018-01-11-update-regarding-acme-tls-sni-and-shared-hosting-infrastructure/50188).
Our Apache and Nginx plugins will automatically switch to use HTTP-01 so no
changes need to be made to your Certbot configuration, however, you should
make sure your server is accessible on port 80 and isn't behind an external
proxy doing things like redirecting all traffic from HTTP to HTTPS. HTTP to
HTTPS redirects inside Apache and Nginx are fine.
* IPv6 support was added to the Nginx plugin.
* Support for automatically creating server blocks based on the default server
block was added to the Nginx plugin.
* The flags --delete-after-revoke and --no-delete-after-revoke were added
allowing users to control whether the revoke subcommand also deletes the
certificates it is revoking.
### Changed
* We deprecated support for Python 2.6 and Python 3.3 in Certbot and its ACME
library. Support for these versions of Python will be removed in the next
major release of Certbot. If you are using certbot-auto on a RHEL 6 based
system, it will guide you through the process of installing Python 3.
* We split our implementation of JOSE (Javascript Object Signing and
Encryption) out of our ACME library and into a separate package named josepy.
This package is available on [PyPI](https://pypi.python.org/pypi/josepy) and
on [GitHub](https://github.com/certbot/josepy).
* We updated the ciphersuites used in Apache to the new [values recommended by
Mozilla](https://wiki.mozilla.org/Security/Server_Side_TLS#Intermediate_compatibility_.28default.29).
The major change here is adding ChaCha20 to the list of supported
ciphersuites.
### Fixed
* An issue with our Apache plugin on Gentoo due to differences in their
apache2ctl command has been resolved.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/47?closed=1
## 0.20.0 - 2017-12-06
### Added
* Certbot's ACME library now recognizes URL fields in challenge objects in
preparation for Let's Encrypt's new ACME endpoint. The value is still
accessible in our ACME library through the name "uri".
### Changed
* The Apache plugin now parses some distro specific Apache configuration files
on non-Debian systems allowing it to get a clearer picture on the running
configuration. Internally, these changes were structured so that external
contributors can easily write patches to make the plugin work in new Apache
configurations.
* Certbot better reports network failures by removing information about
connection retries from the error output.
* An unnecessary question when using Certbot's webroot plugin interactively has
been removed.
### Fixed
* Certbot's NGINX plugin no longer sometimes incorrectly reports that it was
unable to deploy a HTTP->HTTPS redirect when requesting Certbot to enable a
redirect for multiple domains.
* Problems where the Apache plugin was failing to find directives and
duplicating existing directives on openSUSE have been resolved.
* An issue running the tests shipped with Certbot and some of our DNS plugins with
older versions of mock has been resolved.
* On some systems, users reported strangely interleaved output depending on
when stdout and stderr were flushed. This problem was resolved by having
Certbot regularly flush these streams.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/44?closed=1
## 0.19.0 - 2017-10-04
### Added
* Certbot now has renewal hook directories where executable files can be placed
for Certbot to run with the renew subcommand. Pre-hooks, deploy-hooks, and
post-hooks can be specified in the renewal-hooks/pre, renewal-hooks/deploy,
and renewal-hooks/post directories respectively in Certbot's configuration
directory (which is /etc/letsencrypt by default). Certbot will automatically
create these directories when it is run if they do not already exist.
* After revoking a certificate with the revoke subcommand, Certbot will offer
to delete the lineage associated with the certificate. When Certbot is run
with --non-interactive, it will automatically try to delete the associated
lineage.
* When using Certbot's Google Cloud DNS plugin on Google Compute Engine, you no
longer have to provide a credential file to Certbot if you have configured
sufficient permissions for the instance which Certbot can automatically
obtain using Google's metadata service.
### Changed
* When deleting certificates interactively using the delete subcommand, Certbot
will now allow you to select multiple lineages to be deleted at once.
* Certbot's Apache plugin no longer always parses Apache's sites-available on
Debian based systems and instead only parses virtual hosts included in your
Apache configuration. You can provide an additional directory for Certbot to
parse using the command line flag --apache-vhost-root.
### Fixed
* The plugins subcommand can now be run without root access.
* certbot-auto now includes a timeout when updating itself so it no longer
hangs indefinitely when it is unable to connect to the external server.
* An issue where Certbot's Apache plugin would sometimes fail to deploy a
certificate on Debian based systems if mod_ssl wasn't already enabled has
been resolved.
* A bug in our Docker image where the certificates subcommand could not report
if certificates maintained by Certbot had been revoked has been fixed.
* Certbot's RFC 2136 DNS plugin (for use with software like BIND) now properly
performs DNS challenges when the domain being verified contains a CNAME
record.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/43?closed=1
## 0.18.2 - 2017-09-20
### Fixed
* An issue where Certbot's ACME module would raise an AttributeError trying to
create self-signed certificates when used with pyOpenSSL 17.3.0 has been
resolved. For Certbot users with this version of pyOpenSSL, this caused
Certbot to crash when performing a TLS SNI challenge or when the Nginx plugin
tried to create an SSL server block.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/46?closed=1
## 0.18.1 - 2017-09-08
### Fixed
* If certbot-auto was running as an unprivileged user and it upgraded from
0.17.0 to 0.18.0, it would crash with a permissions error and would need to
be run again to successfully complete the upgrade. This has been fixed and
certbot-auto should upgrade cleanly to 0.18.1.
* Certbot usually uses "certbot-auto" or "letsencrypt-auto" in error messages
and the User-Agent string instead of "certbot" when you are using one of
these wrapper scripts. Proper detection of this was broken with Certbot's new
installation path in /opt in 0.18.0 but this problem has been resolved.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/45?closed=1
## 0.18.0 - 2017-09-06
### Added
* The Nginx plugin now configures Nginx to use 2048-bit Diffie-Hellman
parameters. Java 6 clients do not support Diffie-Hellman parameters larger
than 1024 bits, so if you need to support these clients you will need to
manually modify your Nginx configuration after using the Nginx installer.
### Changed
* certbot-auto now installs Certbot in directories under `/opt/eff.org`. If you
had an existing installation from certbot-auto, a symlink is created to the
new directory. You can configure certbot-auto to use a different path by
setting the environment variable VENV_PATH.
* The Nginx plugin can now be selected in Certbot's interactive output.
* Output verbosity of renewal failures when running with `--quiet` has been
reduced.
* The default revocation reason shown in Certbot help output now is a human
readable string instead of a numerical code.
* Plugin selection is now included in normal terminal output.
### Fixed
* A newer version of ConfigArgParse is now installed when using certbot-auto
causing values set to false in a Certbot INI configuration file to be handled
intuitively. Setting a boolean command line flag to false is equivalent to
not including it in the configuration file at all.
* New naming conventions preventing certbot-auto from installing OS
dependencies on Fedora 26 have been resolved.
More details about these changes can be found on our GitHub repo:
https://github.com/certbot/certbot/milestone/42?closed=1
## 0.17.0 - 2017-08-02
### Added

View File

@@ -1,4 +1,4 @@
FROM python:2-alpine
FROM python:2-alpine3.7
ENTRYPOINT [ "certbot" ]
EXPOSE 80 443
@@ -12,6 +12,7 @@ COPY certbot src/certbot
RUN apk add --no-cache --virtual .certbot-deps \
libffi \
libssl1.0 \
openssl \
ca-certificates \
binutils
RUN apk add --no-cache --virtual .build-deps \

View File

@@ -1,70 +1,21 @@
# This Dockerfile builds an image for development.
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
MAINTAINER Yan <yan@eff.org>
FROM ubuntu:xenial
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt
# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
WORKDIR /opt/certbot/src
# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.
# TODO: Install Apache/Nginx for plugin development.
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get install python3-dev git -y && \
COPY . .
RUN apt-get update && \
apt-get install apache2 git nginx-light -y && \
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible
COPY setup.py README.rst CHANGES.rst MANIFEST.in linter_plugin.py tox.cover.sh tox.ini .pylintrc /opt/certbot/src/
# all above files are necessary for setup.py, however, package source
# code directory has to be copied separately to a subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters, three files are far
# more likely to be cached than the whole project directory
COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
COPY letshelp-certbot /opt/certbot/src/letshelp-certbot/
COPY certbot-compatibility-test /opt/certbot/src/certbot-compatibility-test/
COPY tests /opt/certbot/src/tests/
RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv && \
/opt/certbot/venv/bin/pip install -U pip && \
/opt/certbot/venv/bin/pip install -U setuptools && \
/opt/certbot/venv/bin/pip install \
-e /opt/certbot/src/acme \
-e /opt/certbot/src \
-e /opt/certbot/src/certbot-apache \
-e /opt/certbot/src/certbot-nginx \
-e /opt/certbot/src/letshelp-certbot \
-e /opt/certbot/src/certbot-compatibility-test \
-e /opt/certbot/src[dev,docs]
# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it's stays in the underlaying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.
RUN VENV_NAME="../venv" tools/venv.sh
ENV PATH /opt/certbot/venv/bin:$PATH

View File

@@ -1,3 +1,9 @@
If you're having trouble using Certbot and aren't sure you've found a bug or
request for a new feature, please first try asking for help at
https://community.letsencrypt.org/. There is a much larger community there of
people familiar with the project who will be able to more quickly answer your
questions.
## My operating system is (include version):

View File

@@ -6,13 +6,13 @@ import logging
import socket
from cryptography.hazmat.primitives import hashes # type: ignore
import josepy as jose
import OpenSSL
import requests
from acme import errors
from acme import crypto_util
from acme import fields
from acme import jose
logger = logging.getLogger(__name__)

View File

@@ -1,6 +1,7 @@
"""Tests for acme.challenges."""
import unittest
import josepy as jose
import mock
import OpenSSL
import requests
@@ -8,7 +9,6 @@ import requests
from six.moves.urllib import parse as urllib_parse # pylint: disable=import-error
from acme import errors
from acme import jose
from acme import test_util
CERT = test_util.load_comparable_cert('cert.pem')

View File

@@ -10,12 +10,14 @@ import time
import six
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import OpenSSL
import re
import requests
import sys
from acme import crypto_util
from acme import errors
from acme import jose
from acme import jws
from acme import messages
@@ -38,39 +40,24 @@ DEFAULT_NETWORK_TIMEOUT = 45
DER_CONTENT_TYPE = 'application/pkix-cert'
class Client(object): # pylint: disable=too-many-instance-attributes
"""ACME client.
.. todo::
Clean up raised error types hierarchy, document, and handle (wrap)
instances of `.DeserializationError` raised in `from_json()`.
class ClientBase(object): # pylint: disable=too-many-instance-attributes
"""ACME client base object.
:ivar messages.Directory directory:
:ivar key: `.JWK` (private)
:ivar alg: `.JWASignature`
:ivar bool verify_ssl: Verify SSL certificates?
:ivar .ClientNetwork net: Client network. Useful for testing. If not
supplied, it will be initialized using `key`, `alg` and
`verify_ssl`.
:ivar .ClientNetwork net: Client network.
:ivar int acme_version: ACME protocol version. 1 or 2.
"""
def __init__(self, directory, key, alg=jose.RS256, verify_ssl=True,
net=None):
def __init__(self, directory, net, acme_version):
"""Initialize.
:param directory: Directory Resource (`.messages.Directory`) or
URI from which the resource will be downloaded.
:param .messages.Directory directory: Directory Resource
:param .ClientNetwork net: Client network.
:param int acme_version: ACME protocol version. 1 or 2.
"""
self.key = key
self.net = ClientNetwork(key, alg, verify_ssl) if net is None else net
if isinstance(directory, six.string_types):
self.directory = messages.Directory.from_json(
self.net.get(directory).json())
else:
self.directory = directory
self.directory = directory
self.net = net
self.acme_version = acme_version
@classmethod
def _regr_from_response(cls, response, uri=None, terms_of_service=None):
@@ -82,28 +69,8 @@ class Client(object): # pylint: disable=too-many-instance-attributes
uri=response.headers.get('Location', uri),
terms_of_service=terms_of_service)
def register(self, new_reg=None):
"""Register.
:param .NewRegistration new_reg:
:returns: Registration Resource.
:rtype: `.RegistrationResource`
"""
new_reg = messages.NewRegistration() if new_reg is None else new_reg
assert isinstance(new_reg, messages.NewRegistration)
response = self.net.post(self.directory[new_reg], new_reg)
# TODO: handle errors
assert response.status_code == http_client.CREATED
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
return self._regr_from_response(response)
def _send_recv_regr(self, regr, body):
response = self.net.post(regr.uri, body)
response = self._post(regr.uri, body)
# TODO: Boulder returns httplib.ACCEPTED
#assert response.status_code == httplib.OK
@@ -115,6 +82,13 @@ class Client(object): # pylint: disable=too-many-instance-attributes
response, uri=regr.uri,
terms_of_service=regr.terms_of_service)
def _post(self, *args, **kwargs):
"""Wrapper around self.net.post that adds the acme_version.
"""
kwargs.setdefault('acme_version', self.acme_version)
return self.net.post(*args, **kwargs)
def update_registration(self, regr, update=None):
"""Update registration.
@@ -129,6 +103,7 @@ class Client(object): # pylint: disable=too-many-instance-attributes
update = regr.body if update is None else update
body = messages.UpdateRegistration(**dict(update))
updated_regr = self._send_recv_regr(regr, body=body)
self.net.account = updated_regr
return updated_regr
def deactivate_registration(self, regr):
@@ -152,65 +127,14 @@ class Client(object): # pylint: disable=too-many-instance-attributes
"""
return self._send_recv_regr(regr, messages.UpdateRegistration())
def agree_to_tos(self, regr):
"""Agree to the terms-of-service.
Agree to the terms-of-service in a Registration Resource.
:param regr: Registration Resource.
:type regr: `.RegistrationResource`
:returns: Updated Registration Resource.
:rtype: `.RegistrationResource`
"""
return self.update_registration(
regr.update(body=regr.body.update(agreement=regr.terms_of_service)))
def _authzr_from_response(self, response, identifier, uri=None):
def _authzr_from_response(self, response, identifier=None, uri=None):
authzr = messages.AuthorizationResource(
body=messages.Authorization.from_json(response.json()),
uri=response.headers.get('Location', uri))
if authzr.body.identifier != identifier:
if identifier is not None and authzr.body.identifier != identifier:
raise errors.UnexpectedUpdate(authzr)
return authzr
def request_challenges(self, identifier, new_authzr_uri=None):
"""Request challenges.
:param .messages.Identifier identifier: Identifier to be challenged.
:param str new_authzr_uri: Deprecated. Do not use.
:returns: Authorization Resource.
:rtype: `.AuthorizationResource`
"""
if new_authzr_uri is not None:
logger.debug("request_challenges with new_authzr_uri deprecated.")
new_authz = messages.NewAuthorization(identifier=identifier)
response = self.net.post(self.directory.new_authz, new_authz)
# TODO: handle errors
assert response.status_code == http_client.CREATED
return self._authzr_from_response(response, identifier)
def request_domain_challenges(self, domain, new_authzr_uri=None):
"""Request challenges for domain names.
This is simply a convenience function that wraps around
`request_challenges`, but works with domain names instead of
generic identifiers. See ``request_challenges`` for more
documentation.
:param str domain: Domain name to be challenged.
:param str new_authzr_uri: Deprecated. Do not use.
:returns: Authorization Resource.
:rtype: `.AuthorizationResource`
"""
return self.request_challenges(messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value=domain), new_authzr_uri)
def answer_challenge(self, challb, response):
"""Answer challenge.
@@ -226,7 +150,7 @@ class Client(object): # pylint: disable=too-many-instance-attributes
:raises .UnexpectedUpdate:
"""
response = self.net.post(challb.uri, response)
response = self._post(challb.uri, response)
try:
authzr_uri = response.links['up']['url']
except KeyError:
@@ -287,6 +211,142 @@ class Client(object): # pylint: disable=too-many-instance-attributes
response, authzr.body.identifier, authzr.uri)
return updated_authzr, response
def _revoke(self, cert, rsn, url):
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
`.ComparableX509`
:param int rsn: Reason code for certificate revocation.
:param str url: ACME URL to post to
:raises .ClientError: If revocation is unsuccessful.
"""
response = self._post(url,
messages.Revocation(
certificate=cert,
reason=rsn))
if response.status_code != http_client.OK:
raise errors.ClientError(
'Successful revocation must return HTTP OK status')
class Client(ClientBase):
"""ACME client for a v1 API.
.. todo::
Clean up raised error types hierarchy, document, and handle (wrap)
instances of `.DeserializationError` raised in `from_json()`.
:ivar messages.Directory directory:
:ivar key: `josepy.JWK` (private)
:ivar alg: `josepy.JWASignature`
:ivar bool verify_ssl: Verify SSL certificates?
:ivar .ClientNetwork net: Client network. Useful for testing. If not
supplied, it will be initialized using `key`, `alg` and
`verify_ssl`.
"""
def __init__(self, directory, key, alg=jose.RS256, verify_ssl=True,
net=None):
"""Initialize.
:param directory: Directory Resource (`.messages.Directory`) or
URI from which the resource will be downloaded.
"""
# pylint: disable=too-many-arguments
self.key = key
if net is None:
net = ClientNetwork(key, alg=alg, verify_ssl=verify_ssl)
if isinstance(directory, six.string_types):
directory = messages.Directory.from_json(
net.get(directory).json())
super(Client, self).__init__(directory=directory,
net=net, acme_version=1)
def register(self, new_reg=None):
"""Register.
:param .NewRegistration new_reg:
:returns: Registration Resource.
:rtype: `.RegistrationResource`
"""
new_reg = messages.NewRegistration() if new_reg is None else new_reg
response = self._post(self.directory[new_reg], new_reg)
# TODO: handle errors
assert response.status_code == http_client.CREATED
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
return self._regr_from_response(response)
def agree_to_tos(self, regr):
"""Agree to the terms-of-service.
Agree to the terms-of-service in a Registration Resource.
:param regr: Registration Resource.
:type regr: `.RegistrationResource`
:returns: Updated Registration Resource.
:rtype: `.RegistrationResource`
"""
return self.update_registration(
regr.update(body=regr.body.update(agreement=regr.terms_of_service)))
def request_challenges(self, identifier, new_authzr_uri=None):
"""Request challenges.
:param .messages.Identifier identifier: Identifier to be challenged.
:param str new_authzr_uri: Deprecated. Do not use.
:returns: Authorization Resource.
:rtype: `.AuthorizationResource`
:raises errors.WildcardUnsupportedError: if a wildcard is requested
"""
if new_authzr_uri is not None:
logger.debug("request_challenges with new_authzr_uri deprecated.")
if identifier.value.startswith("*"):
raise errors.WildcardUnsupportedError(
"Requesting an authorization for a wildcard name is"
" forbidden by this version of the ACME protocol.")
new_authz = messages.NewAuthorization(identifier=identifier)
response = self._post(self.directory.new_authz, new_authz)
# TODO: handle errors
assert response.status_code == http_client.CREATED
return self._authzr_from_response(response, identifier)
def request_domain_challenges(self, domain, new_authzr_uri=None):
"""Request challenges for domain names.
This is simply a convenience function that wraps around
`request_challenges`, but works with domain names instead of
generic identifiers. See ``request_challenges`` for more
documentation.
:param str domain: Domain name to be challenged.
:param str new_authzr_uri: Deprecated. Do not use.
:returns: Authorization Resource.
:rtype: `.AuthorizationResource`
:raises errors.WildcardUnsupportedError: if a wildcard is requested
"""
return self.request_challenges(messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value=domain), new_authzr_uri)
def request_issuance(self, csr, authzrs):
"""Request issuance.
@@ -306,7 +366,7 @@ class Client(object): # pylint: disable=too-many-instance-attributes
req = messages.CertificateRequest(csr=csr)
content_type = DER_CONTENT_TYPE  # TODO: add 'cert_type' argument
response = self.net.post(
response = self._post(
self.directory.new_cert,
req,
content_type=content_type,
@@ -407,7 +467,7 @@ class Client(object): # pylint: disable=too-many-instance-attributes
:param str uri: URI of certificate
:returns: tuple of the form
(response, :class:`acme.jose.ComparableX509`)
(response, :class:`josepy.util.ComparableX509`)
:rtype: tuple
"""
@@ -491,26 +551,317 @@ class Client(object): # pylint: disable=too-many-instance-attributes
:raises .ClientError: If revocation is unsuccessful.
"""
response = self.net.post(self.directory[messages.Revocation],
messages.Revocation(
certificate=cert,
reason=rsn),
content_type=None)
if response.status_code != http_client.OK:
raise errors.ClientError(
'Successful revocation must return HTTP OK status')
return self._revoke(cert, rsn, self.directory[messages.Revocation])
class ClientV2(ClientBase):
"""ACME client for a v2 API.
:ivar messages.Directory directory:
:ivar .ClientNetwork net: Client network.
"""
def __init__(self, directory, net):
"""Initialize.
:param .messages.Directory directory: Directory Resource
:param .ClientNetwork net: Client network.
"""
super(ClientV2, self).__init__(directory=directory,
net=net, acme_version=2)
def new_account(self, new_account):
"""Register.
:param .NewRegistration new_account:
:returns: Registration Resource.
:rtype: `.RegistrationResource`
"""
response = self._post(self.directory['newAccount'], new_account)
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
regr = self._regr_from_response(response)
self.net.account = regr
return regr
def new_order(self, csr_pem):
"""Request a new Order object from the server.
:param str csr_pem: A CSR in PEM format.
:returns: The newly created order.
:rtype: OrderResource
"""
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# pylint: disable=protected-access
dnsNames = crypto_util._pyopenssl_cert_or_req_all_names(csr)
identifiers = []
for name in dnsNames:
identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_FQDN,
value=name))
order = messages.NewOrder(identifiers=identifiers)
response = self._post(self.directory['newOrder'], order)
body = messages.Order.from_json(response.json())
authorizations = []
for url in body.authorizations:
authorizations.append(self._authzr_from_response(self.net.get(url), uri=url))
return messages.OrderResource(
body=body,
uri=response.headers.get('Location'),
authorizations=authorizations,
csr_pem=csr_pem)
def poll_and_finalize(self, orderr, deadline=None):
"""Poll authorizations and finalize the order.
If no deadline is provided, this method will time out after 90
seconds.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:returns: finalized order
:rtype: messages.OrderResource
"""
if deadline is None:
deadline = datetime.datetime.now() + datetime.timedelta(seconds=90)
orderr = self.poll_authorizations(orderr, deadline)
return self.finalize_order(orderr, deadline)
def poll_authorizations(self, orderr, deadline):
"""Poll Order Resource for status."""
responses = []
for url in orderr.body.authorizations:
while datetime.datetime.now() < deadline:
authzr = self._authzr_from_response(self.net.get(url), uri=url)
if authzr.body.status != messages.STATUS_PENDING:
responses.append(authzr)
break
time.sleep(1)
# If we didn't get a response for every authorization, we fell through
# the bottom of the loop due to hitting the deadline.
if len(responses) < len(orderr.body.authorizations):
raise errors.TimeoutError()
failed = []
for authzr in responses:
if authzr.body.status != messages.STATUS_VALID:
for chall in authzr.body.challenges:
if chall.error is not None:
failed.append(authzr)
if len(failed) > 0:
raise errors.ValidationError(failed)
return orderr.update(authorizations=responses)
def finalize_order(self, orderr, deadline):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:returns: finalized order
:rtype: messages.OrderResource
"""
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, orderr.csr_pem)
wrapped_csr = messages.CertificateRequest(csr=jose.ComparableX509(csr))
self._post(orderr.body.finalize, wrapped_csr)
while datetime.datetime.now() < deadline:
time.sleep(1)
response = self.net.get(orderr.uri)
body = messages.Order.from_json(response.json())
if body.error is not None:
raise errors.IssuanceError(body.error)
if body.certificate is not None:
certificate_response = self.net.get(body.certificate,
content_type=DER_CONTENT_TYPE).text
return orderr.update(body=body, fullchain_pem=certificate_response)
raise errors.TimeoutError()
def revoke(self, cert, rsn):
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
`.ComparableX509`
:param int rsn: Reason code for certificate revocation.
:raises .ClientError: If revocation is unsuccessful.
"""
return self._revoke(cert, rsn, self.directory['revokeCert'])
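Taken together, new_account, new_order, poll_and_finalize and revoke give ClientV2 a complete v2 issuance flow. A minimal sketch of that flow follows; the directory URL, key files and domain are placeholders rather than values from this changeset, and the challenges are assumed to be satisfied out of band:

    import josepy as jose
    from acme import client, crypto_util, messages

    # Placeholder account key; any JWK loadable by josepy works here.
    key = jose.JWKRSA.load(open('account_key.pem', 'rb').read())
    net = client.ClientNetwork(key, user_agent='acme-python-example')
    directory = messages.Directory.from_json(
        net.get('https://acme.example.org/directory').json())  # placeholder URL
    acme = client.ClientV2(directory, net)

    # new_account stores the returned registration on net.account, so later
    # POSTs are signed with the account URI as the JWS "kid".
    acme.new_account(messages.NewRegistration(terms_of_service_agreed=True))

    csr_pem = crypto_util.make_csr(
        open('domain_key.pem', 'rb').read(), ['example.com'])  # placeholder key/domain
    orderr = acme.new_order(csr_pem)
    # ... fulfil the challenges referenced by orderr.authorizations here ...
    finalized = acme.poll_and_finalize(orderr)
    print(finalized.fullchain_pem)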
class BackwardsCompatibleClientV2(object):
"""ACME client wrapper that tends towards V2-style calls, but
supports V1 servers.
.. note:: While this class handles the majority of the differences
between versions of the ACME protocol, if you need to support an
ACME server based on version 3 or older of the IETF ACME draft
that uses combinations in authorizations (or lack thereof) to
signal that the client needs to complete something other than
any single challenge in the authorization to make it valid, the
user of this class needs to understand and handle these
differences themselves. This does not apply to either of Let's
Encrypt's endpoints where successfully completing any challenge
in an authorization will make it valid.
:ivar int acme_version: 1 or 2, corresponding to the Let's Encrypt endpoint
:ivar .ClientBase client: either Client or ClientV2
"""
def __init__(self, net, key, server):
directory = messages.Directory.from_json(net.get(server).json())
self.acme_version = self._acme_version_from_directory(directory)
if self.acme_version == 1:
self.client = Client(directory, key=key, net=net)
else:
self.client = ClientV2(directory, net=net)
def __getattr__(self, name):
if name in vars(self.client):
return getattr(self.client, name)
elif name in dir(ClientBase):
return getattr(self.client, name)
else:
raise AttributeError()
def new_account_and_tos(self, regr, check_tos_cb=None):
"""Combined register and agree_tos for V1, new_account for V2
:param .NewRegistration regr:
:param callable check_tos_cb: callback that raises an error if
the check does not work
"""
def _assess_tos(tos):
if check_tos_cb is not None:
check_tos_cb(tos)
if self.acme_version == 1:
regr = self.client.register(regr)
if regr.terms_of_service is not None:
_assess_tos(regr.terms_of_service)
return self.client.agree_to_tos(regr)
return regr
else:
if "terms_of_service" in self.client.directory.meta:
_assess_tos(self.client.directory.meta.terms_of_service)
regr = regr.update(terms_of_service_agreed=True)
return self.client.new_account(regr)
def new_order(self, csr_pem):
"""Request a new Order object from the server.
If using ACMEv1, returns a dummy OrderResource with only
the authorizations field filled in.
:param str csr_pem: A CSR in PEM format.
:returns: The newly created order.
:rtype: OrderResource
:raises errors.WildcardUnsupportedError: if a wildcard domain is
requested but unsupported by the ACME version
"""
if self.acme_version == 1:
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# pylint: disable=protected-access
dnsNames = crypto_util._pyopenssl_cert_or_req_all_names(csr)
authorizations = []
for domain in dnsNames:
authorizations.append(self.client.request_domain_challenges(domain))
return messages.OrderResource(authorizations=authorizations, csr_pem=csr_pem)
else:
return self.client.new_order(csr_pem)
def finalize_order(self, orderr, deadline):
"""Finalize an order and obtain a certificate.
:param messages.OrderResource orderr: order to finalize
:param datetime.datetime deadline: when to stop polling and timeout
:returns: finalized order
:rtype: messages.OrderResource
"""
if self.acme_version == 1:
csr_pem = orderr.csr_pem
certr = self.client.request_issuance(
jose.ComparableX509(
OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)),
orderr.authorizations)
chain = None
while datetime.datetime.now() < deadline:
try:
chain = self.client.fetch_chain(certr)
break
except errors.Error:
time.sleep(1)
if chain is None:
raise errors.TimeoutError(
'Failed to fetch chain. You should not deploy the generated '
'certificate, please rerun the command for a new one.')
cert = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, certr.body.wrapped).decode()
chain = crypto_util.dump_pyopenssl_chain(chain).decode()
return orderr.update(fullchain_pem=(cert + chain))
else:
return self.client.finalize_order(orderr, deadline)
def revoke(self, cert, rsn):
"""Revoke certificate.
:param .ComparableX509 cert: `OpenSSL.crypto.X509` wrapped in
`.ComparableX509`
:param int rsn: Reason code for certificate revocation.
:raises .ClientError: If revocation is unsuccessful.
"""
return self.client.revoke(cert, rsn)
def _acme_version_from_directory(self, directory):
if hasattr(directory, 'newNonce'):
return 2
else:
return 1
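Because the wrapper inspects the downloaded directory (a newNonce entry implies v2) and instantiates either Client or ClientV2, calling code can stay version-agnostic: new_account_and_tos covers both the v1 register-then-agree dance and the v2 newAccount call. A rough sketch under the same placeholder assumptions as the previous example:

    import datetime
    import josepy as jose
    from acme import client, messages

    key = jose.JWKRSA.load(open('account_key.pem', 'rb').read())  # placeholder
    net = client.ClientNetwork(key)
    acme = client.BackwardsCompatibleClientV2(
        net, key, 'https://acme.example.org/directory')  # placeholder URL

    def check_tos(tos):
        pass  # raise here if the terms of service at `tos` are unacceptable

    acme.new_account_and_tos(messages.NewRegistration(), check_tos)
    orderr = acme.new_order(csr_pem)  # csr_pem built as in the previous sketch
    # ... answer the challenges in orderr.authorizations ...
    deadline = datetime.datetime.now() + datetime.timedelta(seconds=90)
    finalized = acme.finalize_order(orderr, deadline)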
class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
"""Client network."""
"""Wrapper around requests that signs POSTs for authentication.
Also adds user agent, and handles Content-Type.
"""
JSON_CONTENT_TYPE = 'application/json'
JOSE_CONTENT_TYPE = 'application/jose+json'
JSON_ERROR_CONTENT_TYPE = 'application/problem+json'
REPLAY_NONCE_HEADER = 'Replay-Nonce'
def __init__(self, key, alg=jose.RS256, verify_ssl=True,
"""Initialize.
:param josepy.JWK key: Account private key
:param messages.RegistrationResource account: Account object. Required if you are
planning to use .post() with acme_version=2 for anything other than
creating a new account; may be set later after registering.
:param josepy.JWASignature alg: Algorithm to use in signing JWS.
:param bool verify_ssl: Whether to verify certificates on SSL connections.
:param str user_agent: String to send as User-Agent header.
:param float timeout: Timeout for requests.
"""
def __init__(self, key, account=None, alg=jose.RS256, verify_ssl=True,
user_agent='acme-python', timeout=DEFAULT_NETWORK_TIMEOUT):
# pylint: disable=too-many-arguments
self.key = key
self.account = account
self.alg = alg
self.verify_ssl = verify_ssl
self._nonces = set()
@@ -526,21 +877,31 @@ class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
except Exception: # pylint: disable=broad-except
pass
def _wrap_in_jws(self, obj, nonce):
def _wrap_in_jws(self, obj, nonce, url, acme_version):
"""Wrap `JSONDeSerializable` object in JWS.
.. todo:: Implement ``acmePath``.
:param .JSONDeSerializable obj:
:param josepy.JSONDeSerializable obj:
:param str url: The URL to which this object will be POSTed
:param bytes nonce:
:rtype: `.JWS`
:rtype: `josepy.JWS`
"""
jobj = obj.json_dumps(indent=2).encode()
logger.debug('JWS payload:\n%s', jobj)
return jws.JWS.sign(
payload=jobj, key=self.key, alg=self.alg,
nonce=nonce).json_dumps(indent=2)
kwargs = {
"alg": self.alg,
"nonce": nonce
}
if acme_version == 2:
kwargs["url"] = url
# newAccount and revokeCert work without the kid
if self.account is not None:
kwargs["kid"] = self.account["uri"]
kwargs["key"] = self.key
# pylint: disable=star-args
return jws.JWS.sign(jobj, **kwargs).json_dumps(indent=2)
@classmethod
def _check_response(cls, response, content_type=None):
@@ -599,6 +960,7 @@ class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
return response
def _send_request(self, method, url, *args, **kwargs):
# pylint: disable=too-many-locals
"""Send HTTP request.
Makes sure that `verify_ssl` is respected. Logs request and
@@ -624,7 +986,32 @@ class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
kwargs.setdefault('headers', {})
kwargs['headers'].setdefault('User-Agent', self.user_agent)
kwargs.setdefault('timeout', self._default_timeout)
response = self.session.request(method, url, *args, **kwargs)
try:
response = self.session.request(method, url, *args, **kwargs)
except requests.exceptions.RequestException as e:
# pylint: disable=pointless-string-statement
"""Requests response parsing
The requests library emits exceptions with a lot of extra text.
We parse them with a regexp to raise a more readable exception.
Example:
HTTPSConnectionPool(host='acme-v01.api.letsencrypt.org',
port=443): Max retries exceeded with url: /directory
(Caused by NewConnectionError('
<requests.packages.urllib3.connection.VerifiedHTTPSConnection
object at 0x108356c50>: Failed to establish a new connection:
[Errno 65] No route to host',))"""
# pylint: disable=line-too-long
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
m = re.match(err_regex, str(e))
if m is None:
raise # pragma: no cover
else:
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
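To see what this handler produces, the regular expression above can be run against the sample message quoted in the comment; the captured groups become the much shorter ValueError text, matching the "Requesting host/path: message" shape the tests check later:

    import re

    err = ("HTTPSConnectionPool(host='acme-v01.api.letsencrypt.org', port=443): "
           "Max retries exceeded with url: /directory (Caused by "
           "NewConnectionError('<requests.packages.urllib3.connection."
           "VerifiedHTTPSConnection object at 0x108356c50>: Failed to establish "
           "a new connection: [Errno 65] No route to host',))")
    err_regex = (r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*)"
                 r".*(\[Errno \d+\])([A-Za-z ]*)")
    host, path, _err_no, err_msg = re.match(err_regex, err).groups()
    print("Requesting {0}{1}:{2}".format(host, path, err_msg))
    # Requesting acme-v01.api.letsencrypt.org/directory: No route to host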
# If content is DER, log the base64 of it instead of raw bytes, to keep
# binary data out of the logs.
if response.headers.get("Content-Type") == DER_CONTENT_TYPE:
@@ -687,8 +1074,9 @@ class ClientNetwork(object): # pylint: disable=too-many-instance-attributes
else:
raise
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE, **kwargs):
data = self._wrap_in_jws(obj, self._get_nonce(url))
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
acme_version=1, **kwargs):
data = self._wrap_in_jws(obj, self._get_nonce(url), url, acme_version)
kwargs.setdefault('headers', {'Content-Type': content_type})
response = self._send_request('POST', url, data=data, **kwargs)
self._add_nonce(response)

View File

@@ -1,16 +1,18 @@
"""Tests for acme.client."""
import copy
import datetime
import json
import unittest
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
import OpenSSL
import requests
from acme import challenges
from acme import errors
from acme import jose
from acme import jws as acme_jws
from acme import messages
from acme import messages_test
@@ -18,13 +20,32 @@ from acme import test_util
CERT_DER = test_util.load_vector('cert.der')
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
CSR_SAN_PEM = test_util.load_vector('csr-san.pem')
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
KEY2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
DIRECTORY_V1 = messages.Directory({
messages.NewRegistration:
'https://www.letsencrypt-demo.org/acme/new-reg',
messages.Revocation:
'https://www.letsencrypt-demo.org/acme/revoke-cert',
messages.NewAuthorization:
'https://www.letsencrypt-demo.org/acme/new-authz',
messages.CertificateRequest:
'https://www.letsencrypt-demo.org/acme/new-cert',
})
class ClientTest(unittest.TestCase):
"""Tests for acme.client.Client."""
# pylint: disable=too-many-instance-attributes,too-many-public-methods
DIRECTORY_V2 = messages.Directory({
'newAccount': 'https://www.letsencrypt-demo.org/acme/new-account',
'newNonce': 'https://www.letsencrypt-demo.org/acme/new-nonce',
'newOrder': 'https://www.letsencrypt-demo.org/acme/new-order',
'revokeCert': 'https://www.letsencrypt-demo.org/acme/revoke-cert',
})
class ClientTestBase(unittest.TestCase):
"""Base for tests in acme.client."""
def setUp(self):
self.response = mock.MagicMock(
@@ -33,21 +54,6 @@ class ClientTest(unittest.TestCase):
self.net.post.return_value = self.response
self.net.get.return_value = self.response
self.directory = messages.Directory({
messages.NewRegistration:
'https://www.letsencrypt-demo.org/acme/new-reg',
messages.Revocation:
'https://www.letsencrypt-demo.org/acme/revoke-cert',
messages.NewAuthorization:
'https://www.letsencrypt-demo.org/acme/new-authz',
messages.CertificateRequest:
'https://www.letsencrypt-demo.org/acme/new-cert',
})
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)
self.identifier = messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='example.com')
@@ -57,8 +63,7 @@ class ClientTest(unittest.TestCase):
contact=self.contact, key=KEY.public_key())
self.new_reg = messages.NewRegistration(**dict(reg))
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1',
terms_of_service='https://www.letsencrypt-demo.org/tos')
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
# Authorization
authzr_uri = 'https://www.letsencrypt-demo.org/acme/authz/1'
@@ -75,14 +80,217 @@ class ClientTest(unittest.TestCase):
self.authzr = messages.AuthorizationResource(
body=self.authz, uri=authzr_uri)
# Reason code for revocation
self.rsn = 1
class BackwardsCompatibleClientV2Test(ClientTestBase):
"""Tests for acme.client.BackwardsCompatibleClientV2."""
def setUp(self):
super(BackwardsCompatibleClientV2Test, self).setUp()
# contains a loaded cert
self.certr = messages.CertificateResource(
body=messages_test.CERT)
loaded = OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_PEM, CERT_SAN_PEM)
wrapped = jose.ComparableX509(loaded)
self.chain = [wrapped, wrapped]
self.cert_pem = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, messages_test.CERT.wrapped).decode()
single_chain = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_PEM, loaded).decode()
self.chain_pem = single_chain + single_chain
self.fullchain_pem = self.cert_pem + self.chain_pem
self.orderr = messages.OrderResource(
csr_pem=CSR_SAN_PEM)
def _init(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
return BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
from acme.client import BackwardsCompatibleClientV2
BackwardsCompatibleClientV2(net=self.net,
key=KEY, server=uri)
self.net.get.assert_called_once_with(uri)
def test_init_acme_version(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
client = self._init()
self.assertEqual(client.acme_version, 1)
self.response.json.return_value = DIRECTORY_V2.to_json()
client = self._init()
self.assertEqual(client.acme_version, 2)
def test_forwarding(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
client = self._init()
self.assertEqual(client.directory, client.client.directory)
self.assertEqual(client.key, KEY)
self.assertEqual(client.update_registration, client.client.update_registration)
self.assertRaises(AttributeError, client.__getattr__, 'nonexistent')
self.assertRaises(AttributeError, client.__getattr__, 'new_account_and_tos')
self.assertRaises(AttributeError, client.__getattr__, 'new_account')
def test_new_account_and_tos(self):
# v2 no tos
self.response.json.return_value = DIRECTORY_V2.to_json()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().new_account.assert_called_with(self.new_reg)
# v2 tos good
with mock.patch('acme.client.ClientV2') as mock_client:
mock_client().directory.meta.__contains__.return_value = True
client = self._init()
client.new_account_and_tos(self.new_reg, lambda x: True)
mock_client().new_account.assert_called_with(
self.new_reg.update(terms_of_service_agreed=True))
# v2 tos bad
with mock.patch('acme.client.ClientV2') as mock_client:
mock_client().directory.meta.__contains__.return_value = True
client = self._init()
def _tos_cb(tos):
raise errors.Error
self.assertRaises(errors.Error, client.new_account_and_tos,
self.new_reg, _tos_cb)
mock_client().new_account.assert_not_called()
# v1 yes tos
self.response.json.return_value = DIRECTORY_V1.to_json()
with mock.patch('acme.client.Client') as mock_client:
regr = mock.MagicMock(terms_of_service="TOS")
mock_client().register.return_value = regr
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().register.assert_called_once_with(self.new_reg)
mock_client().agree_to_tos.assert_called_once_with(regr)
# v1 no tos
with mock.patch('acme.client.Client') as mock_client:
regr = mock.MagicMock(terms_of_service=None)
mock_client().register.return_value = regr
client = self._init()
client.new_account_and_tos(self.new_reg)
mock_client().register.assert_called_once_with(self.new_reg)
mock_client().agree_to_tos.assert_not_called()
@mock.patch('OpenSSL.crypto.load_certificate_request')
@mock.patch('acme.crypto_util._pyopenssl_cert_or_req_all_names')
def test_new_order_v1(self, mock__pyopenssl_cert_or_req_all_names,
unused_mock_load_certificate_request):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock__pyopenssl_cert_or_req_all_names.return_value = ['example.com', 'www.example.com']
mock_csr_pem = mock.MagicMock()
with mock.patch('acme.client.Client') as mock_client:
mock_client().request_domain_challenges.return_value = mock.sentinel.auth
client = self._init()
orderr = client.new_order(mock_csr_pem)
self.assertEqual(orderr.authorizations, [mock.sentinel.auth, mock.sentinel.auth])
def test_new_order_v2(self):
self.response.json.return_value = DIRECTORY_V2.to_json()
mock_csr_pem = mock.MagicMock()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.new_order(mock_csr_pem)
mock_client().new_order.assert_called_once_with(mock_csr_pem)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_success(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
mock_client().fetch_chain.return_value = self.chain
deadline = datetime.datetime(9999, 9, 9)
client = self._init()
result = client.finalize_order(self.orderr, deadline)
self.assertEqual(result.fullchain_pem, self.fullchain_pem)
mock_client().fetch_chain.assert_called_once_with(self.certr)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_fetch_chain_error(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
mock_client().fetch_chain.return_value = self.chain
mock_client().fetch_chain.side_effect = [errors.Error, self.chain]
deadline = datetime.datetime(9999, 9, 9)
client = self._init()
result = client.finalize_order(self.orderr, deadline)
self.assertEqual(result.fullchain_pem, self.fullchain_pem)
self.assertEqual(mock_client().fetch_chain.call_count, 2)
@mock.patch('acme.client.Client')
def test_finalize_order_v1_timeout(self, mock_client):
self.response.json.return_value = DIRECTORY_V1.to_json()
mock_client().request_issuance.return_value = self.certr
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
client = self._init()
self.assertRaises(errors.TimeoutError, client.finalize_order,
self.orderr, deadline)
def test_finalize_order_v2(self):
self.response.json.return_value = DIRECTORY_V2.to_json()
mock_orderr = mock.MagicMock()
mock_deadline = mock.MagicMock()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.finalize_order(mock_orderr, mock_deadline)
mock_client().finalize_order.assert_called_once_with(mock_orderr, mock_deadline)
def test_revoke(self):
self.response.json.return_value = DIRECTORY_V1.to_json()
with mock.patch('acme.client.Client') as mock_client:
client = self._init()
client.revoke(messages_test.CERT, self.rsn)
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
self.response.json.return_value = DIRECTORY_V2.to_json()
with mock.patch('acme.client.ClientV2') as mock_client:
client = self._init()
client.revoke(messages_test.CERT, self.rsn)
mock_client().revoke.assert_called_once_with(messages_test.CERT, self.rsn)
class ClientTest(ClientTestBase):
"""Tests for acme.client.Client."""
# pylint: disable=too-many-instance-attributes,too-many-public-methods
def setUp(self):
super(ClientTest, self).setUp()
self.directory = DIRECTORY_V1
# Registration
self.regr = self.regr.update(
terms_of_service='https://www.letsencrypt-demo.org/tos')
# Request issuance
self.certr = messages.CertificateResource(
body=messages_test.CERT, authzrs=(self.authzr,),
uri='https://www.letsencrypt-demo.org/acme/cert/1',
cert_chain_uri='https://www.letsencrypt-demo.org/ca')
# Reason code for revocation
self.rsn = 1
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)
def test_init_downloads_directory(self):
uri = 'http://www.letsencrypt-demo.org/directory'
@@ -91,6 +299,16 @@ class ClientTest(unittest.TestCase):
directory=uri, key=KEY, alg=jose.RS256, net=self.net)
self.net.get.assert_called_once_with(uri)
@mock.patch('acme.client.ClientNetwork')
def test_init_without_net(self, mock_net):
mock_net.return_value = mock.sentinel.net
alg = jose.RS256
from acme.client import Client
self.client = Client(
directory=self.directory, key=KEY, alg=alg)
mock_net.assert_called_once_with(KEY, alg=alg, verify_ssl=True)
self.assertEqual(self.client.net, mock.sentinel.net)
def test_register(self):
# "Instance of 'Field' has no to_json/update member" bug:
# pylint: disable=no-member
@@ -142,20 +360,23 @@ class ClientTest(unittest.TestCase):
self.client.request_challenges(self.identifier)
self.net.post.assert_called_once_with(
self.directory.new_authz,
messages.NewAuthorization(identifier=self.identifier))
messages.NewAuthorization(identifier=self.identifier),
acme_version=1)
def test_request_challenges_deprecated_arg(self):
self._prepare_response_for_request_challenges()
self.client.request_challenges(self.identifier, new_authzr_uri="hi")
self.net.post.assert_called_once_with(
self.directory.new_authz,
messages.NewAuthorization(identifier=self.identifier))
messages.NewAuthorization(identifier=self.identifier),
acme_version=1)
def test_request_challenges_custom_uri(self):
self._prepare_response_for_request_challenges()
self.client.request_challenges(self.identifier)
self.net.post.assert_called_once_with(
'https://www.letsencrypt-demo.org/acme/new-authz', mock.ANY)
'https://www.letsencrypt-demo.org/acme/new-authz', mock.ANY,
acme_version=1)
def test_request_challenges_unexpected_update(self):
self._prepare_response_for_request_challenges()
@@ -165,6 +386,13 @@ class ClientTest(unittest.TestCase):
errors.UnexpectedUpdate, self.client.request_challenges,
self.identifier)
def test_request_challenges_wildcard(self):
wildcard_identifier = messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='*.example.org')
self.assertRaises(
errors.WildcardUnsupportedError, self.client.request_challenges,
wildcard_identifier)
def test_request_domain_challenges(self):
self.client.request_challenges = mock.MagicMock()
self.assertEqual(
@@ -417,7 +645,7 @@ class ClientTest(unittest.TestCase):
def test_revoke(self):
self.client.revoke(self.certr.body, self.rsn)
self.net.post.assert_called_once_with(
self.directory[messages.Revocation], mock.ANY, content_type=None)
self.directory[messages.Revocation], mock.ANY, acme_version=1)
def test_revocation_payload(self):
obj = messages.Revocation(certificate=self.certr.body, reason=self.rsn)
@@ -432,9 +660,150 @@ class ClientTest(unittest.TestCase):
self.certr,
self.rsn)
class ClientV2Test(ClientTestBase):
"""Tests for acme.client.ClientV2."""
def setUp(self):
super(ClientV2Test, self).setUp()
self.directory = DIRECTORY_V2
from acme.client import ClientV2
self.client = ClientV2(self.directory, self.net)
self.new_reg = self.new_reg.update(terms_of_service_agreed=True)
self.authzr_uri2 = 'https://www.letsencrypt-demo.org/acme/authz/2'
self.authz2 = self.authz.update(identifier=messages.Identifier(
typ=messages.IDENTIFIER_FQDN, value='www.example.com'),
status=messages.STATUS_PENDING)
self.authzr2 = messages.AuthorizationResource(
body=self.authz2, uri=self.authzr_uri2)
self.order = messages.Order(
identifiers=(self.authz.identifier, self.authz2.identifier),
status=messages.STATUS_PENDING,
authorizations=(self.authzr.uri, self.authzr_uri2),
finalize='https://www.letsencrypt-demo.org/acme/acct/1/order/1/finalize')
self.orderr = messages.OrderResource(
body=self.order,
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_SAN_PEM)
def test_new_account(self):
self.response.status_code = http_client.CREATED
self.response.json.return_value = self.regr.body.to_json()
self.response.headers['Location'] = self.regr.uri
self.assertEqual(self.regr, self.client.new_account(self.new_reg))
def test_new_order(self):
order_response = copy.deepcopy(self.response)
order_response.status_code = http_client.CREATED
order_response.json.return_value = self.order.to_json()
order_response.headers['Location'] = self.orderr.uri
self.net.post.return_value = order_response
authz_response = copy.deepcopy(self.response)
authz_response.json.return_value = self.authz.to_json()
authz_response.headers['Location'] = self.authzr.uri
authz_response2 = self.response
authz_response2.json.return_value = self.authz2.to_json()
authz_response2.headers['Location'] = self.authzr2.uri
self.net.get.side_effect = (authz_response, authz_response2)
self.assertEqual(self.client.new_order(CSR_SAN_PEM), self.orderr)
@mock.patch('acme.client.datetime')
def test_poll_and_finalize(self, mock_datetime):
mock_datetime.datetime.now.return_value = datetime.datetime(2018, 2, 15)
mock_datetime.timedelta = datetime.timedelta
expected_deadline = mock_datetime.datetime.now() + datetime.timedelta(seconds=90)
self.client.poll_authorizations = mock.Mock(return_value=self.orderr)
self.client.finalize_order = mock.Mock(return_value=self.orderr)
self.assertEqual(self.client.poll_and_finalize(self.orderr), self.orderr)
self.client.poll_authorizations.assert_called_once_with(self.orderr, expected_deadline)
self.client.finalize_order.assert_called_once_with(self.orderr, expected_deadline)
@mock.patch('acme.client.datetime')
def test_poll_authorizations_timeout(self, mock_datetime):
now_side_effect = [datetime.datetime(2018, 2, 15),
datetime.datetime(2018, 2, 16),
datetime.datetime(2018, 2, 17)]
mock_datetime.datetime.now.side_effect = now_side_effect
self.response.json.side_effect = [
self.authz.to_json(), self.authz2.to_json(), self.authz2.to_json()]
self.assertRaises(
errors.TimeoutError, self.client.poll_authorizations, self.orderr, now_side_effect[1])
def test_poll_authorizations_failure(self):
deadline = datetime.datetime(9999, 9, 9)
challb = self.challr.body.update(status=messages.STATUS_INVALID,
error=messages.Error.with_code('unauthorized'))
authz = self.authz.update(status=messages.STATUS_INVALID, challenges=(challb,))
self.response.json.return_value = authz.to_json()
self.assertRaises(
errors.ValidationError, self.client.poll_authorizations, self.orderr, deadline)
def test_poll_authorizations_success(self):
deadline = datetime.datetime(9999, 9, 9)
updated_authz2 = self.authz2.update(status=messages.STATUS_VALID)
updated_authzr2 = messages.AuthorizationResource(
body=updated_authz2, uri=self.authzr_uri2)
updated_orderr = self.orderr.update(authorizations=[self.authzr, updated_authzr2])
self.response.json.side_effect = (
self.authz.to_json(), self.authz2.to_json(), updated_authz2.to_json())
self.assertEqual(self.client.poll_authorizations(self.orderr, deadline), updated_orderr)
def test_finalize_order_success(self):
updated_order = self.order.update(
certificate='https://www.letsencrypt-demo.org/acme/cert/')
updated_orderr = self.orderr.update(body=updated_order, fullchain_pem=CERT_SAN_PEM)
self.response.json.return_value = updated_order.to_json()
self.response.text = CERT_SAN_PEM
deadline = datetime.datetime(9999, 9, 9)
self.assertEqual(self.client.finalize_order(self.orderr, deadline), updated_orderr)
def test_finalize_order_error(self):
updated_order = self.order.update(error=messages.Error.with_code('unauthorized'))
self.response.json.return_value = updated_order.to_json()
deadline = datetime.datetime(9999, 9, 9)
self.assertRaises(errors.IssuanceError, self.client.finalize_order, self.orderr, deadline)
def test_finalize_order_timeout(self):
deadline = datetime.datetime.now() - datetime.timedelta(seconds=60)
self.assertRaises(errors.TimeoutError, self.client.finalize_order, self.orderr, deadline)
def test_revoke(self):
self.client.revoke(messages_test.CERT, self.rsn)
self.net.post.assert_called_once_with(
self.directory["revokeCert"], mock.ANY, acme_version=2)
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
def to_partial_json(self):
return {'foo': self.value}
@classmethod
def from_json(cls, value):
pass # pragma: no cover
class ClientNetworkTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork."""
# pylint: disable=too-many-public-methods
def setUp(self):
self.verify_ssl = mock.MagicMock()
@@ -453,25 +822,27 @@ class ClientNetworkTest(unittest.TestCase):
self.assertTrue(self.net.verify_ssl is self.verify_ssl)
def test_wrap_in_jws(self):
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
def __init__(self, value):
self.value = value
def to_partial_json(self):
return {'foo': self.value}
@classmethod
def from_json(cls, value):
pass # pragma: no cover
# pylint: disable=protected-access
jws_dump = self.net._wrap_in_jws(
MockJSONDeSerializable('foo'), nonce=b'Tg')
MockJSONDeSerializable('foo'), nonce=b'Tg', url="url",
acme_version=1)
jws = acme_jws.JWS.json_loads(jws_dump)
self.assertEqual(json.loads(jws.payload.decode()), {'foo': 'foo'})
self.assertEqual(jws.signature.combined.nonce, b'Tg')
def test_wrap_in_jws_v2(self):
self.net.account = {'uri': 'acct-uri'}
# pylint: disable=protected-access
jws_dump = self.net._wrap_in_jws(
MockJSONDeSerializable('foo'), nonce=b'Tg', url="url",
acme_version=2)
jws = acme_jws.JWS.json_loads(jws_dump)
self.assertEqual(json.loads(jws.payload.decode()), {'foo': 'foo'})
self.assertEqual(jws.signature.combined.nonce, b'Tg')
self.assertEqual(jws.signature.combined.kid, u'acct-uri')
self.assertEqual(jws.signature.combined.url, u'url')
def test_check_response_not_ok_jobj_no_error(self):
self.response.ok = False
self.response.json.return_value = {}
@@ -621,6 +992,21 @@ class ClientNetworkTest(unittest.TestCase):
self.assertRaises(requests.exceptions.RequestException,
self.net._send_request, 'GET', 'uri')
def test_urllib_error(self):
# Using a connection error to test a properly formatted error message
try:
# pylint: disable=protected-access
self.net._send_request('GET', "http://localhost:19123/nonexistent.txt")
# ValueError raised by our own error-message parsing
except ValueError as y:
self.assertEqual("Requesting localhost/nonexistent: "
"Connection refused", str(y))
# Requests Library Exceptions
except requests.exceptions.ConnectionError as z: #pragma: no cover
self.assertEqual("('Connection aborted.', "
"error(111, 'Connection refused'))", str(z))
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
"""Tests for acme.client.ClientNetwork which mock out response."""
@@ -686,13 +1072,13 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.assertEqual(self.checked_response, self.net.post(
'uri', self.obj, content_type=self.content_type))
self.net._wrap_in_jws.assert_called_once_with(
self.obj, jose.b64decode(self.all_nonces.pop()))
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
self.available_nonces = []
self.assertRaises(errors.MissingNonce, self.net.post,
'uri', self.obj, content_type=self.content_type)
self.net._wrap_in_jws.assert_called_with(
self.obj, jose.b64decode(self.all_nonces.pop()))
self.obj, jose.b64decode(self.all_nonces.pop()), "uri", 1)
def test_post_wrong_initial_nonce(self): # HEAD
self.available_nonces = [b'f', jose.b64encode(b'good')]

View File

@@ -2,11 +2,13 @@
import binascii
import contextlib
import logging
import os
import re
import socket
import sys
import OpenSSL
import josepy as jose
from acme import errors
@@ -129,8 +131,7 @@ def probe_sni(name, host, port=443, timeout=300,
context = OpenSSL.SSL.Context(method)
context.set_timeout(timeout)
socket_kwargs = {} if sys.version_info < (2, 7) else {
'source_address': source_address}
socket_kwargs = {'source_address': source_address}
host_protocol_agnostic = None if host == '::' or host == '0' else host
@@ -185,6 +186,15 @@ def make_csr(private_key_pem, domains, must_staple=False):
return OpenSSL.crypto.dump_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr)
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
common_name = loaded_cert_or_req.get_subject().CN
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
if common_name is None:
return sans
else:
return [common_name] + [d for d in sans if d != common_name]
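The new helper puts the subject common name first (when present) and appends any remaining SANs, which is what ClientV2.new_order and the backwards-compatible wrapper rely on to build their identifier lists. For the cert-san.pem test vector used later in this changeset, the expected result looks like this (a sketch; the path is simply wherever that vector lives locally):

    import OpenSSL
    from acme import crypto_util

    with open('cert-san.pem', 'rb') as f:  # CN example.com, SANs example.com and www.example.com
        cert = OpenSSL.crypto.load_certificate(OpenSSL.crypto.FILETYPE_PEM, f.read())
    # pylint: disable=protected-access
    print(crypto_util._pyopenssl_cert_or_req_all_names(cert))
    # ['example.com', 'www.example.com']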
def _pyopenssl_cert_or_req_san(cert_or_req):
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
@@ -243,7 +253,7 @@ def gen_ss_cert(key, domains, not_before=None,
"""
assert domains, "Must provide one or more hostnames for the cert."
cert = OpenSSL.crypto.X509()
cert.set_serial_number(int(binascii.hexlify(OpenSSL.rand.bytes(16)), 16))
cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
cert.set_version(2)
extensions = [
@@ -270,3 +280,26 @@ def gen_ss_cert(key, domains, not_before=None,
cert.set_pubkey(key)
cert.sign(key, "sha256")
return cert
def dump_pyopenssl_chain(chain, filetype=OpenSSL.crypto.FILETYPE_PEM):
"""Dump certificate chain into a bundle.
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
:class:`josepy.util.ComparableX509`).
:returns: certificate chain bundle
:rtype: bytes
"""
# XXX: returns empty string when no chain is available, which
# shuts up RenewableCert, but might not be the best solution...
def _dump_cert(cert):
if isinstance(cert, jose.ComparableX509):
# pylint: disable=protected-access
cert = cert.wrapped
return OpenSSL.crypto.dump_certificate(filetype, cert)
# assumes that OpenSSL.crypto.dump_certificate includes ending
# newline character
return b"".join(_dump_cert(cert) for cert in chain)

View File

@@ -8,17 +8,16 @@ import unittest
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
from acme import errors
from acme import jose
from acme import test_util
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
_multiprocess_can_split_ = True
def setUp(self):
self.cert = test_util.load_comparable_cert('rsa2048_cert.pem')
@@ -66,10 +65,33 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
# self.assertRaises(errors.Error, self._probe, b'bar')
class PyOpenSSLCertOrReqAllNamesTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_all_names."""
@classmethod
def _call(cls, loader, name):
# pylint: disable=protected-access
from acme.crypto_util import _pyopenssl_cert_or_req_all_names
return _pyopenssl_cert_or_req_all_names(loader(name))
def _call_cert(self, name):
return self._call(test_util.load_cert, name)
def test_cert_one_san_no_common(self):
self.assertEqual(self._call_cert('cert-nocn.der'),
['no-common-name.badssl.com'])
def test_cert_no_sans_yes_common(self):
self.assertEqual(self._call_cert('cert.pem'), ['example.com'])
def test_cert_two_sans_yes_common(self):
self.assertEqual(self._call_cert('cert-san.pem'),
['example.com', 'www.example.com'])
class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san."""
_multiprocess_can_split_ = True
@classmethod
def _call(cls, loader, name):
@@ -140,7 +162,6 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
class RandomSnTest(unittest.TestCase):
"""Test for random certificate serial numbers."""
_multiprocess_can_split_ = True
def setUp(self):
self.cert_count = 5
@@ -173,9 +194,9 @@ class MakeCSRTest(unittest.TestCase):
self.assertTrue(b'--END CERTIFICATE REQUEST--' in csr_pem)
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py26-oldest and py27-oldest), csr
# objects don't have a get_extensions() method, so we skip this test if
# the method isn't available.
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEquals(len(csr.get_extensions()), 1)
self.assertEquals(csr.get_extensions()[0].get_data(),
@@ -191,9 +212,9 @@ class MakeCSRTest(unittest.TestCase):
csr = OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
# In pyopenssl 0.13 (used with TOXENV=py26-oldest and py27-oldest), csr
# objects don't have a get_extensions() method, so we skip this test if
# the method isn't available.
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
# have a get_extensions() method, so we skip this test if the method
# isn't available.
if hasattr(csr, 'get_extensions'):
self.assertEquals(len(csr.get_extensions()), 2)
# NOTE: Ideally we would filter by the TLS Feature OID, but
@@ -204,5 +225,33 @@ class MakeCSRTest(unittest.TestCase):
self.assertEqual(len(must_staple_exts), 1,
"Expected exactly one Must Staple extension")
class DumpPyopensslChainTest(unittest.TestCase):
"""Test for dump_pyopenssl_chain."""
@classmethod
def _call(cls, loaded):
# pylint: disable=protected-access
from acme.crypto_util import dump_pyopenssl_chain
return dump_pyopenssl_chain(loaded)
def test_dump_pyopenssl_chain(self):
names = ['cert.pem', 'cert-san.pem', 'cert-idnsans.pem']
loaded = [test_util.load_cert(name) for name in names]
length = sum(
len(OpenSSL.crypto.dump_certificate(OpenSSL.crypto.FILETYPE_PEM, cert))
for cert in loaded)
self.assertEqual(len(self._call(loaded)), length)
def test_dump_pyopenssl_chain_wrapped(self):
names = ['cert.pem', 'cert-san.pem', 'cert-idnsans.pem']
loaded = [test_util.load_cert(name) for name in names]
wrap_func = jose.ComparableX509
wrapped = [wrap_func(cert) for cert in loaded]
dump_func = OpenSSL.crypto.dump_certificate
length = sum(len(dump_func(OpenSSL.crypto.FILETYPE_PEM, cert)) for cert in loaded)
self.assertEqual(len(self._call(wrapped)), length)
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,5 +1,5 @@
"""ACME errors."""
from acme.jose import errors as jose_errors
from josepy import errors as jose_errors
class Error(Exception):
@@ -83,6 +83,28 @@ class PollError(ClientError):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
"""
def __init__(self, failed_authzrs):
self.failed_authzrs = failed_authzrs
super(ValidationError, self).__init__()
class TimeoutError(Error):
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
def __init__(self, error):
"""Initialize.
:param messages.Error error: The error provided by the server.
"""
self.error = error
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.
@@ -93,3 +115,6 @@ class ConflictError(ClientError):
self.location = location
super(ConflictError, self).__init__()
class WildcardUnsupportedError(Error):
"""Error for when a wildcard is requested but is unsupported by ACME CA."""

View File

@@ -1,10 +1,9 @@
"""ACME JSON fields."""
import logging
import josepy as jose
import pyrfc3339
from acme import jose
logger = logging.getLogger(__name__)

View File

@@ -2,10 +2,9 @@
import datetime
import unittest
import josepy as jose
import pytz
from acme import jose
class FixedTest(unittest.TestCase):
"""Tests for acme.fields.Fixed."""

View File

@@ -1,82 +0,0 @@
"""Javascript Object Signing and Encryption (jose).
This package is a Python implementation of the standards developed by
IETF `Javascript Object Signing and Encryption (Active WG)`_, in
particular the following RFCs:
- `JSON Web Algorithms (JWA)`_
- `JSON Web Key (JWK)`_
- `JSON Web Signature (JWS)`_
.. _`Javascript Object Signing and Encryption (Active WG)`:
https://tools.ietf.org/wg/jose/
.. _`JSON Web Algorithms (JWA)`:
https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-algorithms/
.. _`JSON Web Key (JWK)`:
https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-key/
.. _`JSON Web Signature (JWS)`:
https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-signature/
"""
from acme.jose.b64 import (
b64decode,
b64encode,
)
from acme.jose.errors import (
DeserializationError,
SerializationError,
Error,
UnrecognizedTypeError,
)
from acme.jose.interfaces import JSONDeSerializable
from acme.jose.json_util import (
Field,
JSONObjectWithFields,
TypedJSONObjectWithFields,
decode_b64jose,
decode_cert,
decode_csr,
decode_hex16,
encode_b64jose,
encode_cert,
encode_csr,
encode_hex16,
)
from acme.jose.jwa import (
HS256,
HS384,
HS512,
JWASignature,
PS256,
PS384,
PS512,
RS256,
RS384,
RS512,
)
from acme.jose.jwk import (
JWK,
JWKRSA,
)
from acme.jose.jws import (
Header,
JWS,
Signature,
)
from acme.jose.util import (
ComparableX509,
ComparableKey,
ComparableRSAKey,
ImmutableMap,
)

View File

@@ -1,61 +0,0 @@
"""JOSE Base64.
`JOSE Base64`_ is defined as:
- URL-safe Base64
- padding stripped
.. _`JOSE Base64`:
https://tools.ietf.org/html/draft-ietf-jose-json-web-signature-37#appendix-C
.. Do NOT try to call this module "base64", as it will "shadow" the
standard library.
"""
import base64
import six
def b64encode(data):
"""JOSE Base64 encode.
:param data: Data to be encoded.
:type data: `bytes`
:returns: JOSE Base64 string.
:rtype: bytes
:raises TypeError: if `data` is of incorrect type
"""
if not isinstance(data, six.binary_type):
raise TypeError('argument should be {0}'.format(six.binary_type))
return base64.urlsafe_b64encode(data).rstrip(b'=')
def b64decode(data):
"""JOSE Base64 decode.
:param data: Base64 string to be decoded. If it's unicode, then
only ASCII characters are allowed.
:type data: `bytes` or `unicode`
:returns: Decoded data.
:rtype: bytes
:raises TypeError: if input is of incorrect type
:raises ValueError: if input is unicode with non-ASCII characters
"""
if isinstance(data, six.string_types):
try:
data = data.encode('ascii')
except UnicodeEncodeError:
raise ValueError(
'unicode argument should contain only ASCII characters')
elif not isinstance(data, six.binary_type):
raise TypeError('argument should be a str or unicode')
return base64.urlsafe_b64decode(data + b'=' * (4 - (len(data) % 4)))
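Although this module is removed from acme, the same helpers remain available from josepy, which the rest of this changeset imports as jose. A quick check of the padding-stripping behaviour, using one of the Wikipedia examples from the old test file:

    import josepy as jose

    encoded = jose.b64encode(b'any carnal pleasure.')
    assert encoded == b'YW55IGNhcm5hbCBwbGVhc3VyZS4'           # trailing '=' stripped
    assert jose.b64decode(encoded) == b'any carnal pleasure.'  # decode re-pads as needed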

View File

@@ -1,77 +0,0 @@
"""Tests for acme.jose.b64."""
import unittest
import six
# https://en.wikipedia.org/wiki/Base64#Examples
B64_PADDING_EXAMPLES = {
b'any carnal pleasure.': (b'YW55IGNhcm5hbCBwbGVhc3VyZS4', b'='),
b'any carnal pleasure': (b'YW55IGNhcm5hbCBwbGVhc3VyZQ', b'=='),
b'any carnal pleasur': (b'YW55IGNhcm5hbCBwbGVhc3Vy', b''),
b'any carnal pleasu': (b'YW55IGNhcm5hbCBwbGVhc3U', b'='),
b'any carnal pleas': (b'YW55IGNhcm5hbCBwbGVhcw', b'=='),
}
B64_URL_UNSAFE_EXAMPLES = {
six.int2byte(251) + six.int2byte(239): b'--8',
six.int2byte(255) * 2: b'__8',
}
class B64EncodeTest(unittest.TestCase):
"""Tests for acme.jose.b64.b64encode."""
@classmethod
def _call(cls, data):
from acme.jose.b64 import b64encode
return b64encode(data)
def test_empty(self):
self.assertEqual(self._call(b''), b'')
def test_unsafe_url(self):
for text, b64 in six.iteritems(B64_URL_UNSAFE_EXAMPLES):
self.assertEqual(self._call(text), b64)
def test_different_paddings(self):
for text, (b64, _) in six.iteritems(B64_PADDING_EXAMPLES):
self.assertEqual(self._call(text), b64)
def test_unicode_fails_with_type_error(self):
self.assertRaises(TypeError, self._call, u'some unicode')
class B64DecodeTest(unittest.TestCase):
"""Tests for acme.jose.b64.b64decode."""
@classmethod
def _call(cls, data):
from acme.jose.b64 import b64decode
return b64decode(data)
def test_unsafe_url(self):
for text, b64 in six.iteritems(B64_URL_UNSAFE_EXAMPLES):
self.assertEqual(self._call(b64), text)
def test_input_without_padding(self):
for text, (b64, _) in six.iteritems(B64_PADDING_EXAMPLES):
self.assertEqual(self._call(b64), text)
def test_input_with_padding(self):
for text, (b64, pad) in six.iteritems(B64_PADDING_EXAMPLES):
self.assertEqual(self._call(b64 + pad), text)
def test_unicode_with_ascii(self):
self.assertEqual(self._call(u'YQ'), b'a')
def test_non_ascii_unicode_fails(self):
self.assertRaises(ValueError, self._call, u'\u0105')
def test_type_error_no_unicode_or_bytes(self):
self.assertRaises(TypeError, self._call, object())
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,35 +0,0 @@
"""JOSE errors."""
class Error(Exception):
"""Generic JOSE Error."""
class DeserializationError(Error):
"""JSON deserialization error."""
def __str__(self):
return "Deserialization error: {0}".format(
super(DeserializationError, self).__str__())
class SerializationError(Error):
"""JSON serialization error."""
class UnrecognizedTypeError(DeserializationError):
"""Unrecognized type error.
:ivar str typ: The unrecognized type of the JSON object.
:ivar jobj: Full JSON object.
"""
def __init__(self, typ, jobj):
self.typ = typ
self.jobj = jobj
super(UnrecognizedTypeError, self).__init__(str(self))
def __str__(self):
return '{0} was not recognized, full message: {1}'.format(
self.typ, self.jobj)
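
The hierarchy is intentionally shallow: callers can catch the concrete error or any of its ancestors. A quick sketch (the 'dns-01' value is only an illustrative placeholder):

from acme.jose.errors import DeserializationError, Error, UnrecognizedTypeError

try:
    raise UnrecognizedTypeError('dns-01', {'type': 'dns-01'})
except DeserializationError as err:   # UnrecognizedTypeError subclasses DeserializationError
    assert isinstance(err, Error)     # ...which in turn subclasses the generic Error
    print(err)                        # dns-01 was not recognized, full message: {'type': 'dns-01'}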

View File

@@ -1,17 +0,0 @@
"""Tests for acme.jose.errors."""
import unittest
class UnrecognizedTypeErrorTest(unittest.TestCase):
def setUp(self):
from acme.jose.errors import UnrecognizedTypeError
self.error = UnrecognizedTypeError('foo', {'type': 'foo'})
def test_str(self):
self.assertEqual(
"foo was not recognized, full message: {'type': 'foo'}",
str(self.error))
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,216 +0,0 @@
"""JOSE interfaces."""
import abc
import collections
import json
import six
from acme.jose import errors
from acme.jose import util
# pylint: disable=no-self-argument,no-method-argument,no-init,inherit-non-class
# pylint: disable=too-few-public-methods
@six.add_metaclass(abc.ABCMeta)
class JSONDeSerializable(object):
# pylint: disable=too-few-public-methods
"""Interface for (de)serializable JSON objects.
Please recall that the standard Python library implements
:class:`json.JSONEncoder` and :class:`json.JSONDecoder` that perform
translations based on respective :ref:`conversion tables
<conversion-table>` that look pretty much like the one below (for
complete tables see relevant Python documentation):
.. _conversion-table:
====== ======
JSON Python
====== ======
object dict
... ...
====== ======
While the above **conversion table** is about translation of JSON
documents to/from the basic Python types only,
:class:`JSONDeSerializable` introduces the following two concepts:
serialization
Turning an arbitrary Python object into Python object that can
be encoded into a JSON document. **Full serialization** produces
a Python object composed of only basic types as required by the
:ref:`conversion table <conversion-table>`. **Partial
serialization** (accomplished by :meth:`to_partial_json`)
produces a Python object that might also be built from other
:class:`JSONDeSerializable` objects.
deserialization
Turning a decoded Python object (necessarily one of the basic
types as required by the :ref:`conversion table
<conversion-table>`) into an arbitrary Python object.
Serialization produces **serialized object** ("partially serialized
object" or "fully serialized object" for partial and full
serialization respectively) and deserialization produces
**deserialized object**, both usually denoted in the source code as
``jobj``.
Wording in the official Python documentation might be confusing
after reading the above, but in the light of those definitions, one
can view :meth:`json.JSONDecoder.decode` as decoder and
deserializer of basic types, :meth:`json.JSONEncoder.default` as
serializer of basic types, :meth:`json.JSONEncoder.encode` as
serializer and encoder of basic types.
One could extend :mod:`json` to support arbitrary object
(de)serialization either by:
- overriding :meth:`json.JSONDecoder.decode` and
:meth:`json.JSONEncoder.default` in subclasses
- or passing ``object_hook`` argument (or ``object_pairs_hook``)
to :func:`json.load`/:func:`json.loads` or ``default`` argument
for :func:`json.dump`/:func:`json.dumps`.
Interestingly, ``default`` is required to perform only partial
serialization, as :func:`json.dumps` applies ``default``
recursively. This is the idea behind making :meth:`to_partial_json`
produce only partial serialization, while providing custom
:meth:`json_dumps` that dumps with ``default`` set to
:meth:`json_dump_default`.
To make further documentation a bit more concrete, please consider
the following imaginary implementation example::
class Foo(JSONDeSerializable):
def to_partial_json(self):
return 'foo'
@classmethod
def from_json(cls, jobj):
return Foo()
class Bar(JSONDeSerializable):
def to_partial_json(self):
return [Foo(), Foo()]
@classmethod
def from_json(cls, jobj):
return Bar()
"""
@abc.abstractmethod
def to_partial_json(self): # pragma: no cover
"""Partially serialize.
Following the example, **partial serialization** means the following::
assert isinstance(Bar().to_partial_json()[0], Foo)
assert isinstance(Bar().to_partial_json()[1], Foo)
# in particular...
assert Bar().to_partial_json() != ['foo', 'foo']
:raises acme.jose.errors.SerializationError:
in case of any serialization error.
:returns: Partially serializable object.
"""
raise NotImplementedError()
def to_json(self):
"""Fully serialize.
Again, following the example from before, **full serialization**
means the following::
assert Bar().to_json() == ['foo', 'foo']
:raises acme.jose.errors.SerializationError:
in case of any serialization error.
:returns: Fully serialized object.
"""
def _serialize(obj):
if isinstance(obj, JSONDeSerializable):
return _serialize(obj.to_partial_json())
if isinstance(obj, six.string_types): # strings are Sequence
return obj
elif isinstance(obj, list):
return [_serialize(subobj) for subobj in obj]
elif isinstance(obj, collections.Sequence):
# default to tuple, otherwise Mapping could get
# unhashable list
return tuple(_serialize(subobj) for subobj in obj)
elif isinstance(obj, collections.Mapping):
return dict((_serialize(key), _serialize(value))
for key, value in six.iteritems(obj))
else:
return obj
return _serialize(self)
@util.abstractclassmethod
def from_json(cls, jobj): # pylint: disable=unused-argument
"""Deserialize a decoded JSON document.
:param jobj: Python object, composed of only other basic data
types, as decoded from JSON document. Not necessarily
:class:`dict` (as decoded from "JSON object" document).
:raises acme.jose.errors.DeserializationError:
if decoding was unsuccessful, e.g. in case of unparseable
X509 certificate, or wrong padding in JOSE base64 encoded
string, etc.
"""
# TypeError: Can't instantiate abstract class <cls> with
# abstract methods from_json, to_partial_json
return cls() # pylint: disable=abstract-class-instantiated
@classmethod
def json_loads(cls, json_string):
"""Deserialize from JSON document string."""
try:
loads = json.loads(json_string)
except ValueError as error:
raise errors.DeserializationError(error)
return cls.from_json(loads)
def json_dumps(self, **kwargs):
"""Dump to JSON string using proper serializer.
:returns: JSON document string.
:rtype: str
"""
return json.dumps(self, default=self.json_dump_default, **kwargs)
def json_dumps_pretty(self):
"""Dump the object to pretty JSON document string.
:rtype: str
"""
return self.json_dumps(sort_keys=True, indent=4, separators=(',', ': '))
@classmethod
def json_dump_default(cls, python_object):
"""Serialize Python object.
This function is meant to be passed as ``default`` to
:func:`json.dump` or :func:`json.dumps`. They call
``default(python_object)`` only for non-basic Python types, so
this function necessarily raises :class:`TypeError` if
``python_object`` is not an instance of
:class:`JSONDeSerializable`.
Please read the class docstring for more information.
"""
if isinstance(python_object, JSONDeSerializable):
return python_object.to_partial_json()
else: # this branch is necessary, cannot just "return"
raise TypeError(repr(python_object) + ' is not JSON serializable')
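
To make the partial/full serialization distinction concrete, here is a minimal, self-contained sketch; ``Token`` and ``TokenPair`` are hypothetical names invented for this example and do not appear in the module:

from acme.jose.interfaces import JSONDeSerializable

class Token(JSONDeSerializable):
    """Hypothetical wrapper around a single string value."""
    def __init__(self, value):
        self.value = value

    def to_partial_json(self):
        return self.value

    @classmethod
    def from_json(cls, jobj):
        return cls(jobj)

class TokenPair(JSONDeSerializable):
    """Hypothetical container holding two Token objects."""
    def __init__(self, first, second):
        self.first, self.second = first, second

    def to_partial_json(self):
        # Partial serialization: the result still contains JSONDeSerializable objects.
        return [self.first, self.second]

    @classmethod
    def from_json(cls, jobj):
        return cls(Token.from_json(jobj[0]), Token.from_json(jobj[1]))

pair = TokenPair(Token('a'), Token('b'))
assert pair.to_json() == ['a', 'b']           # full serialization recurses into the Tokens
assert pair.json_dumps() == '["a", "b"]'      # json.dumps with json_dump_default as ``default``
assert isinstance(TokenPair.json_loads('["a", "b"]'), TokenPair)

Note how ``json_dumps`` only needs the partial form: :func:`json.dumps` applies the ``default`` hook recursively.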

View File

@@ -1,114 +0,0 @@
"""Tests for acme.jose.interfaces."""
import unittest
class JSONDeSerializableTest(unittest.TestCase):
# pylint: disable=too-many-instance-attributes
def setUp(self):
from acme.jose.interfaces import JSONDeSerializable
# pylint: disable=missing-docstring,invalid-name
class Basic(JSONDeSerializable):
def __init__(self, v):
self.v = v
def to_partial_json(self):
return self.v
@classmethod
def from_json(cls, jobj):
return cls(jobj)
class Sequence(JSONDeSerializable):
def __init__(self, x, y):
self.x = x
self.y = y
def to_partial_json(self):
return [self.x, self.y]
@classmethod
def from_json(cls, jobj):
return cls(
Basic.from_json(jobj[0]), Basic.from_json(jobj[1]))
class Mapping(JSONDeSerializable):
def __init__(self, x, y):
self.x = x
self.y = y
def to_partial_json(self):
return {self.x: self.y}
@classmethod
def from_json(cls, jobj):
pass # pragma: no cover
self.basic1 = Basic('foo1')
self.basic2 = Basic('foo2')
self.seq = Sequence(self.basic1, self.basic2)
self.mapping = Mapping(self.basic1, self.basic2)
self.nested = Basic([[self.basic1]])
self.tuple = Basic(('foo',))
# pylint: disable=invalid-name
self.Basic = Basic
self.Sequence = Sequence
self.Mapping = Mapping
def test_to_json_sequence(self):
self.assertEqual(self.seq.to_json(), ['foo1', 'foo2'])
def test_to_json_mapping(self):
self.assertEqual(self.mapping.to_json(), {'foo1': 'foo2'})
def test_to_json_other(self):
mock_value = object()
self.assertTrue(self.Basic(mock_value).to_json() is mock_value)
def test_to_json_nested(self):
self.assertEqual(self.nested.to_json(), [['foo1']])
def test_to_json(self):
self.assertEqual(self.tuple.to_json(), (('foo', )))
def test_from_json_not_implemented(self):
from acme.jose.interfaces import JSONDeSerializable
self.assertRaises(TypeError, JSONDeSerializable.from_json, 'xxx')
def test_json_loads(self):
seq = self.Sequence.json_loads('["foo1", "foo2"]')
self.assertTrue(isinstance(seq, self.Sequence))
self.assertTrue(isinstance(seq.x, self.Basic))
self.assertTrue(isinstance(seq.y, self.Basic))
self.assertEqual(seq.x.v, 'foo1')
self.assertEqual(seq.y.v, 'foo2')
def test_json_dumps(self):
self.assertEqual('["foo1", "foo2"]', self.seq.json_dumps())
def test_json_dumps_pretty(self):
self.assertEqual(self.seq.json_dumps_pretty(),
'[\n "foo1",\n "foo2"\n]')
def test_json_dump_default(self):
from acme.jose.interfaces import JSONDeSerializable
self.assertEqual(
'foo1', JSONDeSerializable.json_dump_default(self.basic1))
jobj = JSONDeSerializable.json_dump_default(self.seq)
self.assertEqual(len(jobj), 2)
self.assertTrue(jobj[0] is self.basic1)
self.assertTrue(jobj[1] is self.basic2)
def test_json_dump_default_type_error(self):
from acme.jose.interfaces import JSONDeSerializable
self.assertRaises(
TypeError, JSONDeSerializable.json_dump_default, object())
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,485 +0,0 @@
"""JSON (de)serialization framework.
The framework presented here is somewhat based on `Go's "json" package`_
(especially the ``omitempty`` functionality).
.. _`Go's "json" package`: http://golang.org/pkg/encoding/json/
"""
import abc
import binascii
import logging
import OpenSSL
import six
from acme.jose import b64
from acme.jose import errors
from acme.jose import interfaces
from acme.jose import util
logger = logging.getLogger(__name__)
class Field(object):
"""JSON object field.
:class:`Field` is meant to be used together with
:class:`JSONObjectWithFields`.
``encoder`` (``decoder``) is a callable that accepts a single
parameter, i.e. a value to be encoded (decoded), and returns the
serialized (deserialized) value. In case of errors it should raise
:class:`~acme.jose.errors.SerializationError`
(:class:`~acme.jose.errors.DeserializationError`).
Note that ``decoder`` should perform partial serialization only.
:ivar str json_name: Name of the field when encoded to JSON.
:ivar default: Default value (used when not present in JSON object).
:ivar bool omitempty: If ``True`` and the field value is empty, then
it will not be included in the serialized JSON object, and
``default`` will be used for deserialization. Otherwise, if ``False``,
the field is considered required, its value will always be included in the
serialized JSON object, and it must also be present when
deserializing.
"""
__slots__ = ('json_name', 'default', 'omitempty', 'fdec', 'fenc')
def __init__(self, json_name, default=None, omitempty=False,
decoder=None, encoder=None):
# pylint: disable=too-many-arguments
self.json_name = json_name
self.default = default
self.omitempty = omitempty
self.fdec = self.default_decoder if decoder is None else decoder
self.fenc = self.default_encoder if encoder is None else encoder
@classmethod
def _empty(cls, value):
"""Is the provided value considered "empty" for this field?
This is useful for subclasses that might want to override the
definition of being empty, e.g. for some more exotic data types.
"""
return not isinstance(value, bool) and not value
def omit(self, value):
"""Omit the value in output?"""
return self._empty(value) and self.omitempty
def _update_params(self, **kwargs):
current = dict(json_name=self.json_name, default=self.default,
omitempty=self.omitempty,
decoder=self.fdec, encoder=self.fenc)
current.update(kwargs)
return type(self)(**current) # pylint: disable=star-args
def decoder(self, fdec):
"""Descriptor to change the decoder on JSON object field."""
return self._update_params(decoder=fdec)
def encoder(self, fenc):
"""Descriptor to change the encoder on JSON object field."""
return self._update_params(encoder=fenc)
def decode(self, value):
"""Decode a value, optionally with context JSON object."""
return self.fdec(value)
def encode(self, value):
"""Encode a value, optionally with context JSON object."""
return self.fenc(value)
@classmethod
def default_decoder(cls, value):
"""Default decoder.
Recursively deserialize into immutable types (
:class:`acme.jose.util.frozendict` instead of
:func:`dict`, :func:`tuple` instead of :func:`list`).
"""
# base cases for different types returned by json.loads
if isinstance(value, list):
return tuple(cls.default_decoder(subvalue) for subvalue in value)
elif isinstance(value, dict):
return util.frozendict(
dict((cls.default_decoder(key), cls.default_decoder(value))
for key, value in six.iteritems(value)))
else: # integer or string
return value
@classmethod
def default_encoder(cls, value):
"""Default (passthrough) encoder."""
# field.to_partial_json() is no good as encoder has to do partial
# serialization only
return value
class JSONObjectWithFieldsMeta(abc.ABCMeta):
"""Metaclass for :class:`JSONObjectWithFields` and its subclasses.
It makes sure that, for any class ``cls`` with ``__metaclass__``
set to ``JSONObjectWithFieldsMeta``:
1. All fields (attributes of type :class:`Field`) in the class
definition are moved to the ``cls._fields`` dictionary, where
keys are field attribute names and values are fields themselves.
2. ``cls.__slots__`` is extended by all field attribute names
(i.e. not :attr:`Field.json_name`). Original ``cls.__slots__``
are stored in ``cls._orig_slots``.
As a consequence, for a field attribute name ``some_field``,
``cls.some_field`` will be a slot descriptor and not an instance
of :class:`Field`. For example::
some_field = Field('someField', default=())
class Foo(object):
__metaclass__ = JSONObjectWithFieldsMeta
__slots__ = ('baz',)
some_field = some_field
assert Foo.__slots__ == ('some_field', 'baz')
assert Foo._orig_slots == ()
assert Foo.some_field is not Field
assert Foo._fields.keys() == ['some_field']
assert Foo._fields['some_field'] is some_field
As an implementation note, this metaclass inherits from
:class:`abc.ABCMeta` (and not the usual :class:`type`) to mitigate
the metaclass conflict (:class:`ImmutableMap` and
:class:`JSONDeSerializable`, parents of :class:`JSONObjectWithFields`,
use :class:`abc.ABCMeta` as their metaclass).
"""
def __new__(mcs, name, bases, dikt):
fields = {}
for base in bases:
fields.update(getattr(base, '_fields', {}))
# Do not reorder, this class might override fields from base classes!
for key, value in tuple(six.iteritems(dikt)):
# not six.iterkeys() (in-place edit!)
if isinstance(value, Field):
fields[key] = dikt.pop(key)
dikt['_orig_slots'] = dikt.get('__slots__', ())
dikt['__slots__'] = tuple(
list(dikt['_orig_slots']) + list(six.iterkeys(fields)))
dikt['_fields'] = fields
return abc.ABCMeta.__new__(mcs, name, bases, dikt)
@six.add_metaclass(JSONObjectWithFieldsMeta)
class JSONObjectWithFields(util.ImmutableMap, interfaces.JSONDeSerializable):
# pylint: disable=too-few-public-methods
"""JSON object with fields.
Example::
class Foo(JSONObjectWithFields):
bar = Field('Bar')
empty = Field('Empty', omitempty=True)
@bar.encoder
def bar(value):
return value + 'bar'
@bar.decoder
def bar(value):
if not value.endswith('bar'):
raise errors.DeserializationError('No bar suffix!')
return value[:-3]
assert Foo(bar='baz').to_partial_json() == {'Bar': 'bazbar'}
assert Foo.from_json({'Bar': 'bazbar'}) == Foo(bar='baz')
assert (Foo.from_json({'Bar': 'bazbar', 'Empty': '!'})
== Foo(bar='baz', empty='!'))
assert Foo(bar='baz').bar == 'baz'
"""
@classmethod
def _defaults(cls):
"""Get default fields values."""
return dict([(slot, field.default) for slot, field
in six.iteritems(cls._fields)])
def __init__(self, **kwargs):
# pylint: disable=star-args
super(JSONObjectWithFields, self).__init__(
**(dict(self._defaults(), **kwargs)))
def encode(self, name):
"""Encode a single field.
:param str name: Name of the field to be encoded.
:raises errors.SerializationError: if field cannot be serialized
:raises errors.Error: if field could not be found
"""
try:
field = self._fields[name]
except KeyError:
raise errors.Error("Field not found: {0}".format(name))
return field.encode(getattr(self, name))
def fields_to_partial_json(self):
"""Serialize fields to JSON."""
jobj = {}
omitted = set()
for slot, field in six.iteritems(self._fields):
value = getattr(self, slot)
if field.omit(value):
omitted.add((slot, value))
else:
try:
jobj[field.json_name] = field.encode(value)
except errors.SerializationError as error:
raise errors.SerializationError(
'Could not encode {0} ({1}): {2}'.format(
slot, value, error))
return jobj
def to_partial_json(self):
return self.fields_to_partial_json()
@classmethod
def _check_required(cls, jobj):
missing = set()
for _, field in six.iteritems(cls._fields):
if not field.omitempty and field.json_name not in jobj:
missing.add(field.json_name)
if missing:
raise errors.DeserializationError(
'The following fields are required: {0}'.format(
','.join(missing)))
@classmethod
def fields_from_json(cls, jobj):
"""Deserialize fields from JSON."""
cls._check_required(jobj)
fields = {}
for slot, field in six.iteritems(cls._fields):
if field.json_name not in jobj and field.omitempty:
fields[slot] = field.default
else:
value = jobj[field.json_name]
try:
fields[slot] = field.decode(value)
except errors.DeserializationError as error:
raise errors.DeserializationError(
'Could not decode {0!r} ({1!r}): {2}'.format(
slot, value, error))
return fields
@classmethod
def from_json(cls, jobj):
return cls(**cls.fields_from_json(jobj))
def encode_b64jose(data):
"""Encode JOSE Base-64 field.
:param bytes data:
:rtype: `unicode`
"""
# b64encode produces ASCII characters only
return b64.b64encode(data).decode('ascii')
def decode_b64jose(data, size=None, minimum=False):
"""Decode JOSE Base-64 field.
:param unicode data:
:param int size: Required length (after decoding).
:param bool minimum: If ``True``, then `size` will be treated as
minimum required length, as opposed to exact equality.
:rtype: bytes
"""
error_cls = TypeError if six.PY2 else binascii.Error
try:
decoded = b64.b64decode(data.encode())
except error_cls as error:
raise errors.DeserializationError(error)
if size is not None and ((not minimum and len(decoded) != size) or
(minimum and len(decoded) < size)):
raise errors.DeserializationError(
"Expected at least or exactly {0} bytes".format(size))
return decoded
def encode_hex16(value):
"""Hexlify.
:param bytes value:
:rtype: unicode
"""
return binascii.hexlify(value).decode()
def decode_hex16(value, size=None, minimum=False):
"""Decode hexlified field.
:param unicode value:
:param int size: Required length (after decoding).
:param bool minimum: If ``True``, then `size` will be treated as
minimum required length, as opposed to exact equality.
:rtype: bytes
"""
value = value.encode()
if size is not None and ((not minimum and len(value) != size * 2) or
(minimum and len(value) < size * 2)):
raise errors.DeserializationError()
error_cls = TypeError if six.PY2 else binascii.Error
try:
return binascii.unhexlify(value)
except error_cls as error:
raise errors.DeserializationError(error)
def encode_cert(cert):
"""Encode certificate as JOSE Base-64 DER.
:type cert: `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
:rtype: unicode
"""
return encode_b64jose(OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_ASN1, cert.wrapped))
def decode_cert(b64der):
"""Decode JOSE Base-64 DER-encoded certificate.
:param unicode b64der:
:rtype: `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
"""
try:
return util.ComparableX509(OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_ASN1, decode_b64jose(b64der)))
except OpenSSL.crypto.Error as error:
raise errors.DeserializationError(error)
def encode_csr(csr):
"""Encode CSR as JOSE Base-64 DER.
:type csr: `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
:rtype: unicode
"""
return encode_b64jose(OpenSSL.crypto.dump_certificate_request(
OpenSSL.crypto.FILETYPE_ASN1, csr.wrapped))
def decode_csr(b64der):
"""Decode JOSE Base-64 DER-encoded CSR.
:param unicode b64der:
:rtype: `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
"""
try:
return util.ComparableX509(OpenSSL.crypto.load_certificate_request(
OpenSSL.crypto.FILETYPE_ASN1, decode_b64jose(b64der)))
except OpenSSL.crypto.Error as error:
raise errors.DeserializationError(error)
class TypedJSONObjectWithFields(JSONObjectWithFields):
"""JSON object with type."""
typ = NotImplemented
"""Type of the object. Subclasses must override."""
type_field_name = "type"
"""Field name used to distinguish different object types.
Subclasses will probably have to override this.
"""
TYPES = NotImplemented
"""Types registered for JSON deserialization"""
@classmethod
def register(cls, type_cls, typ=None):
"""Register class for JSON deserialization."""
typ = type_cls.typ if typ is None else typ
cls.TYPES[typ] = type_cls
return type_cls
@classmethod
def get_type_cls(cls, jobj):
"""Get the registered class for ``jobj``."""
if cls in six.itervalues(cls.TYPES):
if cls.type_field_name not in jobj:
raise errors.DeserializationError(
"Missing type field ({0})".format(cls.type_field_name))
# cls is already registered type_cls, force to use it
# so that, e.g Revocation.from_json(jobj) fails if
# jobj["type"] != "revocation".
return cls
if not isinstance(jobj, dict):
raise errors.DeserializationError(
"{0} is not a dictionary object".format(jobj))
try:
typ = jobj[cls.type_field_name]
except KeyError:
raise errors.DeserializationError("missing type field")
try:
return cls.TYPES[typ]
except KeyError:
raise errors.UnrecognizedTypeError(typ, jobj)
def to_partial_json(self):
"""Get JSON serializable object.
:returns: Serializable JSON object representing ACME typed object.
:meth:`validate` will almost certainly not work, due to reasons
explained in :class:`acme.interfaces.IJSONSerializable`.
:rtype: dict
"""
jobj = self.fields_to_partial_json()
jobj[self.type_field_name] = self.typ
return jobj
@classmethod
def from_json(cls, jobj):
"""Deserialize ACME object from valid JSON object.
:raises acme.errors.UnrecognizedTypeError: if type
of the ACME object has not been registered.
"""
# make sure subclasses don't cause infinite recursive from_json calls
type_cls = cls.get_type_cls(jobj)
return type_cls(**type_cls.fields_from_json(jobj))
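
As a rough usage sketch of how ``Field`` and ``JSONObjectWithFields`` combine (the ``Ticket`` class and its field names are invented for this illustration):

from acme.jose import errors
from acme.jose.json_util import Field, JSONObjectWithFields

class Ticket(JSONObjectWithFields):
    """Hypothetical object with one required and one omittable field."""
    serial = Field('serial')                             # required: must be present in the JSON
    note = Field('note', omitempty=True, default=None)

    @serial.encoder
    def serial(value):                                   # runs during (partial) serialization
        return 'S-{0}'.format(value)

    @serial.decoder
    def serial(value):                                   # runs during deserialization
        if not value.startswith('S-'):
            raise errors.DeserializationError('Bad serial prefix')
        return value[2:]

ticket = Ticket(serial='42')
assert ticket.to_partial_json() == {'serial': 'S-42'}    # empty "note" is omitted
assert Ticket.from_json({'serial': 'S-42'}) == Ticket(serial='42')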

View File

@@ -1,381 +0,0 @@
"""Tests for acme.jose.json_util."""
import itertools
import unittest
import mock
import six
from acme import test_util
from acme.jose import errors
from acme.jose import interfaces
from acme.jose import util
CERT = test_util.load_comparable_cert('cert.pem')
CSR = test_util.load_comparable_csr('csr.pem')
class FieldTest(unittest.TestCase):
"""Tests for acme.jose.json_util.Field."""
def test_no_omit_boolean(self):
from acme.jose.json_util import Field
for default, omitempty, value in itertools.product(
[True, False], [True, False], [True, False]):
self.assertFalse(
Field("foo", default=default, omitempty=omitempty).omit(value))
def test_descriptors(self):
mock_value = mock.MagicMock()
# pylint: disable=missing-docstring
def decoder(unused_value):
return 'd'
def encoder(unused_value):
return 'e'
from acme.jose.json_util import Field
field = Field('foo')
field = field.encoder(encoder)
self.assertEqual('e', field.encode(mock_value))
field = field.decoder(decoder)
self.assertEqual('e', field.encode(mock_value))
self.assertEqual('d', field.decode(mock_value))
def test_default_encoder_is_partial(self):
class MockField(interfaces.JSONDeSerializable):
# pylint: disable=missing-docstring
def to_partial_json(self):
return 'foo' # pragma: no cover
@classmethod
def from_json(cls, jobj):
pass # pragma: no cover
mock_field = MockField()
from acme.jose.json_util import Field
self.assertTrue(Field.default_encoder(mock_field) is mock_field)
# in particular...
self.assertNotEqual('foo', Field.default_encoder(mock_field))
def test_default_encoder_passthrough(self):
mock_value = mock.MagicMock()
from acme.jose.json_util import Field
self.assertTrue(Field.default_encoder(mock_value) is mock_value)
def test_default_decoder_list_to_tuple(self):
from acme.jose.json_util import Field
self.assertEqual((1, 2, 3), Field.default_decoder([1, 2, 3]))
def test_default_decoder_dict_to_frozendict(self):
from acme.jose.json_util import Field
obj = Field.default_decoder({'x': 2})
self.assertTrue(isinstance(obj, util.frozendict))
self.assertEqual(obj, util.frozendict(x=2))
def test_default_decoder_passthrough(self):
mock_value = mock.MagicMock()
from acme.jose.json_util import Field
self.assertTrue(Field.default_decoder(mock_value) is mock_value)
class JSONObjectWithFieldsMetaTest(unittest.TestCase):
"""Tests for acme.jose.json_util.JSONObjectWithFieldsMeta."""
def setUp(self):
from acme.jose.json_util import Field
from acme.jose.json_util import JSONObjectWithFieldsMeta
self.field = Field('Baz')
self.field2 = Field('Baz2')
# pylint: disable=invalid-name,missing-docstring,too-few-public-methods
# pylint: disable=blacklisted-name
@six.add_metaclass(JSONObjectWithFieldsMeta)
class A(object):
__slots__ = ('bar',)
baz = self.field
class B(A):
pass
class C(A):
baz = self.field2
self.a_cls = A
self.b_cls = B
self.c_cls = C
def test_fields(self):
# pylint: disable=protected-access,no-member
self.assertEqual({'baz': self.field}, self.a_cls._fields)
self.assertEqual({'baz': self.field}, self.b_cls._fields)
def test_fields_inheritance(self):
# pylint: disable=protected-access,no-member
self.assertEqual({'baz': self.field2}, self.c_cls._fields)
def test_slots(self):
self.assertEqual(('bar', 'baz'), self.a_cls.__slots__)
self.assertEqual(('baz',), self.b_cls.__slots__)
def test_orig_slots(self):
# pylint: disable=protected-access,no-member
self.assertEqual(('bar',), self.a_cls._orig_slots)
self.assertEqual((), self.b_cls._orig_slots)
class JSONObjectWithFieldsTest(unittest.TestCase):
"""Tests for acme.jose.json_util.JSONObjectWithFields."""
# pylint: disable=protected-access
def setUp(self):
from acme.jose.json_util import JSONObjectWithFields
from acme.jose.json_util import Field
class MockJSONObjectWithFields(JSONObjectWithFields):
# pylint: disable=invalid-name,missing-docstring,no-self-argument
# pylint: disable=too-few-public-methods
x = Field('x', omitempty=True,
encoder=(lambda x: x * 2),
decoder=(lambda x: x / 2))
y = Field('y')
z = Field('Z') # on purpose uppercase
@y.encoder
def y(value):
if value == 500:
raise errors.SerializationError()
return value
@y.decoder
def y(value):
if value == 500:
raise errors.DeserializationError()
return value
# pylint: disable=invalid-name
self.MockJSONObjectWithFields = MockJSONObjectWithFields
self.mock = MockJSONObjectWithFields(x=None, y=2, z=3)
def test_init_defaults(self):
self.assertEqual(self.mock, self.MockJSONObjectWithFields(y=2, z=3))
def test_encode(self):
self.assertEqual(10, self.MockJSONObjectWithFields(
x=5, y=0, z=0).encode("x"))
def test_encode_wrong_field(self):
self.assertRaises(errors.Error, self.mock.encode, 'foo')
def test_encode_serialization_error_passthrough(self):
self.assertRaises(
errors.SerializationError,
self.MockJSONObjectWithFields(y=500, z=None).encode, "y")
def test_fields_to_partial_json_omits_empty(self):
self.assertEqual(self.mock.fields_to_partial_json(), {'y': 2, 'Z': 3})
def test_fields_from_json_fills_default_for_empty(self):
self.assertEqual(
{'x': None, 'y': 2, 'z': 3},
self.MockJSONObjectWithFields.fields_from_json({'y': 2, 'Z': 3}))
def test_fields_from_json_fails_on_missing(self):
self.assertRaises(
errors.DeserializationError,
self.MockJSONObjectWithFields.fields_from_json, {'y': 0})
self.assertRaises(
errors.DeserializationError,
self.MockJSONObjectWithFields.fields_from_json, {'Z': 0})
self.assertRaises(
errors.DeserializationError,
self.MockJSONObjectWithFields.fields_from_json, {'x': 0, 'y': 0})
self.assertRaises(
errors.DeserializationError,
self.MockJSONObjectWithFields.fields_from_json, {'x': 0, 'Z': 0})
def test_fields_to_partial_json_encoder(self):
self.assertEqual(
self.MockJSONObjectWithFields(x=1, y=2, z=3).to_partial_json(),
{'x': 2, 'y': 2, 'Z': 3})
def test_fields_from_json_decoder(self):
self.assertEqual(
{'x': 2, 'y': 2, 'z': 3},
self.MockJSONObjectWithFields.fields_from_json(
{'x': 4, 'y': 2, 'Z': 3}))
def test_fields_to_partial_json_error_passthrough(self):
self.assertRaises(
errors.SerializationError, self.MockJSONObjectWithFields(
x=1, y=500, z=3).to_partial_json)
def test_fields_from_json_error_passthrough(self):
self.assertRaises(
errors.DeserializationError,
self.MockJSONObjectWithFields.from_json,
{'x': 4, 'y': 500, 'Z': 3})
class DeEncodersTest(unittest.TestCase):
def setUp(self):
self.b64_cert = (
u'MIIB3jCCAYigAwIBAgICBTkwDQYJKoZIhvcNAQELBQAwdzELMAkGA1UEBhM'
u'CVVMxETAPBgNVBAgMCE1pY2hpZ2FuMRIwEAYDVQQHDAlBbm4gQXJib3IxKz'
u'ApBgNVBAoMIlVuaXZlcnNpdHkgb2YgTWljaGlnYW4gYW5kIHRoZSBFRkYxF'
u'DASBgNVBAMMC2V4YW1wbGUuY29tMB4XDTE0MTIxMTIyMzQ0NVoXDTE0MTIx'
u'ODIyMzQ0NVowdzELMAkGA1UEBhMCVVMxETAPBgNVBAgMCE1pY2hpZ2FuMRI'
u'wEAYDVQQHDAlBbm4gQXJib3IxKzApBgNVBAoMIlVuaXZlcnNpdHkgb2YgTW'
u'ljaGlnYW4gYW5kIHRoZSBFRkYxFDASBgNVBAMMC2V4YW1wbGUuY29tMFwwD'
u'QYJKoZIhvcNAQEBBQADSwAwSAJBAKx1c7RR7R_drnBSQ_zfx1vQLHUbFLh1'
u'AQQQ5R8DZUXd36efNK79vukFhN9HFoHZiUvOjm0c-pVE6K-EdE_twuUCAwE'
u'AATANBgkqhkiG9w0BAQsFAANBAC24z0IdwIVKSlntksllvr6zJepBH5fMnd'
u'fk3XJp10jT6VE-14KNtjh02a56GoraAvJAT5_H67E8GvJ_ocNnB_o'
)
self.b64_csr = (
u'MIIBXTCCAQcCAQAweTELMAkGA1UEBhMCVVMxETAPBgNVBAgMCE1pY2hpZ2F'
u'uMRIwEAYDVQQHDAlBbm4gQXJib3IxDDAKBgNVBAoMA0VGRjEfMB0GA1UECw'
u'wWVW5pdmVyc2l0eSBvZiBNaWNoaWdhbjEUMBIGA1UEAwwLZXhhbXBsZS5jb'
u'20wXDANBgkqhkiG9w0BAQEFAANLADBIAkEArHVztFHtH92ucFJD_N_HW9As'
u'dRsUuHUBBBDlHwNlRd3fp580rv2-6QWE30cWgdmJS86ObRz6lUTor4R0T-3'
u'C5QIDAQABoCkwJwYJKoZIhvcNAQkOMRowGDAWBgNVHREEDzANggtleGFtcG'
u'xlLmNvbTANBgkqhkiG9w0BAQsFAANBAHJH_O6BtC9aGzEVCMGOZ7z9iIRHW'
u'Szr9x_bOzn7hLwsbXPAgO1QxEwL-X-4g20Gn9XBE1N9W6HCIEut2d8wACg'
)
def test_encode_b64jose(self):
from acme.jose.json_util import encode_b64jose
encoded = encode_b64jose(b'x')
self.assertTrue(isinstance(encoded, six.string_types))
self.assertEqual(u'eA', encoded)
def test_decode_b64jose(self):
from acme.jose.json_util import decode_b64jose
decoded = decode_b64jose(u'eA')
self.assertTrue(isinstance(decoded, six.binary_type))
self.assertEqual(b'x', decoded)
def test_decode_b64jose_padding_error(self):
from acme.jose.json_util import decode_b64jose
self.assertRaises(errors.DeserializationError, decode_b64jose, u'x')
def test_decode_b64jose_size(self):
from acme.jose.json_util import decode_b64jose
self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=3))
self.assertRaises(
errors.DeserializationError, decode_b64jose, u'Zm9v', size=2)
self.assertRaises(
errors.DeserializationError, decode_b64jose, u'Zm9v', size=4)
def test_decode_b64jose_minimum_size(self):
from acme.jose.json_util import decode_b64jose
self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=3, minimum=True))
self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=2, minimum=True))
self.assertRaises(errors.DeserializationError, decode_b64jose,
u'Zm9v', size=4, minimum=True)
def test_encode_hex16(self):
from acme.jose.json_util import encode_hex16
encoded = encode_hex16(b'foo')
self.assertEqual(u'666f6f', encoded)
self.assertTrue(isinstance(encoded, six.string_types))
def test_decode_hex16(self):
from acme.jose.json_util import decode_hex16
decoded = decode_hex16(u'666f6f')
self.assertEqual(b'foo', decoded)
self.assertTrue(isinstance(decoded, six.binary_type))
def test_decode_hex16_minimum_size(self):
from acme.jose.json_util import decode_hex16
self.assertEqual(b'foo', decode_hex16(u'666f6f', size=3, minimum=True))
self.assertEqual(b'foo', decode_hex16(u'666f6f', size=2, minimum=True))
self.assertRaises(errors.DeserializationError, decode_hex16,
u'666f6f', size=4, minimum=True)
def test_decode_hex16_odd_length(self):
from acme.jose.json_util import decode_hex16
self.assertRaises(errors.DeserializationError, decode_hex16, u'x')
def test_encode_cert(self):
from acme.jose.json_util import encode_cert
self.assertEqual(self.b64_cert, encode_cert(CERT))
def test_decode_cert(self):
from acme.jose.json_util import decode_cert
cert = decode_cert(self.b64_cert)
self.assertTrue(isinstance(cert, util.ComparableX509))
self.assertEqual(cert, CERT)
self.assertRaises(errors.DeserializationError, decode_cert, u'')
def test_encode_csr(self):
from acme.jose.json_util import encode_csr
self.assertEqual(self.b64_csr, encode_csr(CSR))
def test_decode_csr(self):
from acme.jose.json_util import decode_csr
csr = decode_csr(self.b64_csr)
self.assertTrue(isinstance(csr, util.ComparableX509))
self.assertEqual(csr, CSR)
self.assertRaises(errors.DeserializationError, decode_csr, u'')
class TypedJSONObjectWithFieldsTest(unittest.TestCase):
def setUp(self):
from acme.jose.json_util import TypedJSONObjectWithFields
# pylint: disable=missing-docstring,abstract-method
# pylint: disable=too-few-public-methods
class MockParentTypedJSONObjectWithFields(TypedJSONObjectWithFields):
TYPES = {}
type_field_name = 'type'
@MockParentTypedJSONObjectWithFields.register
class MockTypedJSONObjectWithFields(
MockParentTypedJSONObjectWithFields):
typ = 'test'
__slots__ = ('foo',)
@classmethod
def fields_from_json(cls, jobj):
return {'foo': jobj['foo']}
def fields_to_partial_json(self):
return {'foo': self.foo}
self.parent_cls = MockParentTypedJSONObjectWithFields
self.msg = MockTypedJSONObjectWithFields(foo='bar')
def test_to_partial_json(self):
self.assertEqual(self.msg.to_partial_json(), {
'type': 'test',
'foo': 'bar',
})
def test_from_json_non_dict_fails(self):
for value in [[], (), 5, "asd"]: # all possible input types
self.assertRaises(
errors.DeserializationError, self.parent_cls.from_json, value)
def test_from_json_dict_no_type_fails(self):
self.assertRaises(
errors.DeserializationError, self.parent_cls.from_json, {})
def test_from_json_unknown_type_fails(self):
self.assertRaises(errors.UnrecognizedTypeError,
self.parent_cls.from_json, {'type': 'bar'})
def test_from_json_returns_obj(self):
self.assertEqual({'foo': 'bar'}, self.parent_cls.from_json(
{'type': 'test', 'foo': 'bar'}))
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,180 +0,0 @@
"""JSON Web Algorithm.
https://tools.ietf.org/html/draft-ietf-jose-json-web-algorithms-40
"""
import abc
import collections
import logging
import cryptography.exceptions
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes # type: ignore
from cryptography.hazmat.primitives import hmac # type: ignore
from cryptography.hazmat.primitives.asymmetric import padding # type: ignore
from acme.jose import errors
from acme.jose import interfaces
from acme.jose import jwk
logger = logging.getLogger(__name__)
class JWA(interfaces.JSONDeSerializable): # pylint: disable=abstract-method
# pylint: disable=too-few-public-methods
# for some reason disable=abstract-method has to be on the line
# above...
"""JSON Web Algorithm."""
class JWASignature(JWA, collections.Hashable): # type: ignore
"""JSON Web Signature Algorithm."""
SIGNATURES = {} # type: dict
def __init__(self, name):
self.name = name
def __eq__(self, other):
if not isinstance(other, JWASignature):
return NotImplemented
return self.name == other.name
def __hash__(self):
return hash((self.__class__, self.name))
def __ne__(self, other):
return not self == other
@classmethod
def register(cls, signature_cls):
"""Register class for JSON deserialization."""
cls.SIGNATURES[signature_cls.name] = signature_cls
return signature_cls
def to_partial_json(self):
return self.name
@classmethod
def from_json(cls, jobj):
return cls.SIGNATURES[jobj]
@abc.abstractmethod
def sign(self, key, msg): # pragma: no cover
"""Sign the ``msg`` using ``key``."""
raise NotImplementedError()
@abc.abstractmethod
def verify(self, key, msg, sig): # pragma: no cover
"""Verify the ``msg` and ``sig`` using ``key``."""
raise NotImplementedError()
def __repr__(self):
return self.name
class _JWAHS(JWASignature):
kty = jwk.JWKOct
def __init__(self, name, hash_):
super(_JWAHS, self).__init__(name)
self.hash = hash_()
def sign(self, key, msg):
signer = hmac.HMAC(key, self.hash, backend=default_backend())
signer.update(msg)
return signer.finalize()
def verify(self, key, msg, sig):
verifier = hmac.HMAC(key, self.hash, backend=default_backend())
verifier.update(msg)
try:
verifier.verify(sig)
except cryptography.exceptions.InvalidSignature as error:
logger.debug(error, exc_info=True)
return False
else:
return True
class _JWARSA(object):
kty = jwk.JWKRSA
padding = NotImplemented
hash = NotImplemented
def sign(self, key, msg):
"""Sign the ``msg`` using ``key``."""
try:
signer = key.signer(self.padding, self.hash)
except AttributeError as error:
logger.debug(error, exc_info=True)
raise errors.Error("Public key cannot be used for signing")
except ValueError as error: # digest too large
logger.debug(error, exc_info=True)
raise errors.Error(str(error))
signer.update(msg)
try:
return signer.finalize()
except ValueError as error:
logger.debug(error, exc_info=True)
raise errors.Error(str(error))
def verify(self, key, msg, sig):
"""Verify the ``msg` and ``sig`` using ``key``."""
verifier = key.verifier(sig, self.padding, self.hash)
verifier.update(msg)
try:
verifier.verify()
except cryptography.exceptions.InvalidSignature as error:
logger.debug(error, exc_info=True)
return False
else:
return True
class _JWARS(_JWARSA, JWASignature):
def __init__(self, name, hash_):
super(_JWARS, self).__init__(name)
self.padding = padding.PKCS1v15()
self.hash = hash_()
class _JWAPS(_JWARSA, JWASignature):
def __init__(self, name, hash_):
super(_JWAPS, self).__init__(name)
self.padding = padding.PSS(
mgf=padding.MGF1(hash_()),
salt_length=padding.PSS.MAX_LENGTH)
self.hash = hash_()
class _JWAES(JWASignature): # pylint: disable=abstract-class-not-used
# TODO: implement ES signatures
def sign(self, key, msg): # pragma: no cover
raise NotImplementedError()
def verify(self, key, msg, sig): # pragma: no cover
raise NotImplementedError()
HS256 = JWASignature.register(_JWAHS('HS256', hashes.SHA256))
HS384 = JWASignature.register(_JWAHS('HS384', hashes.SHA384))
HS512 = JWASignature.register(_JWAHS('HS512', hashes.SHA512))
RS256 = JWASignature.register(_JWARS('RS256', hashes.SHA256))
RS384 = JWASignature.register(_JWARS('RS384', hashes.SHA384))
RS512 = JWASignature.register(_JWARS('RS512', hashes.SHA512))
PS256 = JWASignature.register(_JWAPS('PS256', hashes.SHA256))
PS384 = JWASignature.register(_JWAPS('PS384', hashes.SHA384))
PS512 = JWASignature.register(_JWAPS('PS512', hashes.SHA512))
ES256 = JWASignature.register(_JWAES('ES256'))
ES384 = JWASignature.register(_JWAES('ES384'))
ES512 = JWASignature.register(_JWAES('ES512'))
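
A short sketch of signing and verifying with the registered singletons; the key and message bytes are illustrative only:

from acme.jose.jwa import HS256, JWASignature

sig = HS256.sign(b'secret key', b'message')               # HMAC-SHA256 over the message
assert HS256.verify(b'secret key', b'message', sig)       # True for a matching signature
assert not HS256.verify(b'secret key', b'tampered', sig)  # False, never an exception

# Algorithm names round-trip through JSON as plain strings, resolving back to the
# registered singleton instances.
assert HS256.to_partial_json() == 'HS256'
assert JWASignature.from_json('HS256') is HS256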

View File

@@ -1,104 +0,0 @@
"""Tests for acme.jose.jwa."""
import unittest
from acme import test_util
from acme.jose import errors
RSA256_KEY = test_util.load_rsa_private_key('rsa256_key.pem')
RSA512_KEY = test_util.load_rsa_private_key('rsa512_key.pem')
RSA1024_KEY = test_util.load_rsa_private_key('rsa1024_key.pem')
class JWASignatureTest(unittest.TestCase):
"""Tests for acme.jose.jwa.JWASignature."""
def setUp(self):
from acme.jose.jwa import JWASignature
class MockSig(JWASignature):
# pylint: disable=missing-docstring,too-few-public-methods
# pylint: disable=abstract-class-not-used
def sign(self, key, msg):
raise NotImplementedError() # pragma: no cover
def verify(self, key, msg, sig):
raise NotImplementedError() # pragma: no cover
# pylint: disable=invalid-name
self.Sig1 = MockSig('Sig1')
self.Sig2 = MockSig('Sig2')
def test_eq(self):
self.assertEqual(self.Sig1, self.Sig1)
def test_ne(self):
self.assertNotEqual(self.Sig1, self.Sig2)
def test_ne_other_type(self):
self.assertNotEqual(self.Sig1, 5)
def test_repr(self):
self.assertEqual('Sig1', repr(self.Sig1))
self.assertEqual('Sig2', repr(self.Sig2))
def test_to_partial_json(self):
self.assertEqual(self.Sig1.to_partial_json(), 'Sig1')
self.assertEqual(self.Sig2.to_partial_json(), 'Sig2')
def test_from_json(self):
from acme.jose.jwa import JWASignature
from acme.jose.jwa import RS256
self.assertTrue(JWASignature.from_json('RS256') is RS256)
class JWAHSTest(unittest.TestCase): # pylint: disable=too-few-public-methods
def test_it(self):
from acme.jose.jwa import HS256
sig = (
b"\xceR\xea\xcd\x94\xab\xcf\xfb\xe0\xacA.:\x1a'\x08i\xe2\xc4"
b"\r\x85+\x0e\x85\xaeUZ\xd4\xb3\x97zO"
)
self.assertEqual(HS256.sign(b'some key', b'foo'), sig)
self.assertTrue(HS256.verify(b'some key', b'foo', sig) is True)
self.assertTrue(HS256.verify(b'some key', b'foo', sig + b'!') is False)
class JWARSTest(unittest.TestCase):
def test_sign_no_private_part(self):
from acme.jose.jwa import RS256
self.assertRaises(
errors.Error, RS256.sign, RSA512_KEY.public_key(), b'foo')
def test_sign_key_too_small(self):
from acme.jose.jwa import RS256
from acme.jose.jwa import PS256
self.assertRaises(errors.Error, RS256.sign, RSA256_KEY, b'foo')
self.assertRaises(errors.Error, PS256.sign, RSA256_KEY, b'foo')
def test_rs(self):
from acme.jose.jwa import RS256
sig = (
b'|\xc6\xb2\xa4\xab(\x87\x99\xfa*:\xea\xf8\xa0N&}\x9f\x0f\xc0O'
b'\xc6t\xa3\xe6\xfa\xbb"\x15Y\x80Y\xe0\x81\xb8\x88)\xba\x0c\x9c'
b'\xa4\x99\x1e\x19&\xd8\xc7\x99S\x97\xfc\x85\x0cOV\xe6\x07\x99'
b'\xd2\xb9.>}\xfd'
)
self.assertEqual(RS256.sign(RSA512_KEY, b'foo'), sig)
self.assertTrue(RS256.verify(RSA512_KEY.public_key(), b'foo', sig))
self.assertFalse(RS256.verify(
RSA512_KEY.public_key(), b'foo', sig + b'!'))
def test_ps(self):
from acme.jose.jwa import PS256
sig = PS256.sign(RSA1024_KEY, b'foo')
self.assertTrue(PS256.verify(RSA1024_KEY.public_key(), b'foo', sig))
self.assertFalse(PS256.verify(
RSA1024_KEY.public_key(), b'foo', sig + b'!'))
if __name__ == '__main__':
unittest.main() # pragma: no cover

View File

@@ -1,281 +0,0 @@
"""JSON Web Key."""
import abc
import binascii
import json
import logging
import cryptography.exceptions
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes # type: ignore
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec # type: ignore
from cryptography.hazmat.primitives.asymmetric import rsa
import six
from acme.jose import errors
from acme.jose import json_util
from acme.jose import util
logger = logging.getLogger(__name__)
class JWK(json_util.TypedJSONObjectWithFields):
# pylint: disable=too-few-public-methods
"""JSON Web Key."""
type_field_name = 'kty'
TYPES = {} # type: dict
cryptography_key_types = () # type: tuple
"""Subclasses should override."""
required = NotImplemented
"""Required members of public key's representation as defined by JWK/JWA."""
_thumbprint_json_dumps_params = {
# "no whitespace or line breaks before or after any syntactic
# elements"
'indent': None,
'separators': (',', ':'),
# "members ordered lexicographically by the Unicode [UNICODE]
# code points of the member names"
'sort_keys': True,
}
def thumbprint(self, hash_function=hashes.SHA256):
"""Compute JWK Thumbprint.
https://tools.ietf.org/html/rfc7638
:returns bytes:
"""
digest = hashes.Hash(hash_function(), backend=default_backend())
digest.update(json.dumps(
dict((k, v) for k, v in six.iteritems(self.to_json())
if k in self.required),
**self._thumbprint_json_dumps_params).encode())
return digest.finalize()
@abc.abstractmethod
def public_key(self): # pragma: no cover
"""Generate JWK with public key.
For symmetric cryptosystems, this would return ``self``.
"""
raise NotImplementedError()
@classmethod
def _load_cryptography_key(cls, data, password=None, backend=None):
backend = default_backend() if backend is None else backend
exceptions = {}
# private key?
for loader in (serialization.load_pem_private_key,
serialization.load_der_private_key):
try:
return loader(data, password, backend)
except (ValueError, TypeError,
cryptography.exceptions.UnsupportedAlgorithm) as error:
exceptions[loader] = error
# public key?
for loader in (serialization.load_pem_public_key,
serialization.load_der_public_key):
try:
return loader(data, backend)
except (ValueError,
cryptography.exceptions.UnsupportedAlgorithm) as error:
exceptions[loader] = error
# no luck
raise errors.Error('Unable to deserialize key: {0}'.format(exceptions))
@classmethod
def load(cls, data, password=None, backend=None):
"""Load serialized key as JWK.
:param str data: Public or private key serialized as PEM or DER.
:param str password: Optional password.
:param backend: A `.PEMSerializationBackend` and
`.DERSerializationBackend` provider.
:raises errors.Error: if unable to deserialize, or unsupported
JWK algorithm
:returns: JWK of an appropriate type.
:rtype: `JWK`
"""
try:
key = cls._load_cryptography_key(data, password, backend)
except errors.Error as error:
logger.debug('Loading symmetric key, asymmetric failed: %s', error)
return JWKOct(key=data)
if cls.typ is not NotImplemented and not isinstance(
key, cls.cryptography_key_types):
raise errors.Error('Unable to deserialize {0} into {1}'.format(
key.__class__, cls.__class__))
for jwk_cls in six.itervalues(cls.TYPES):
if isinstance(key, jwk_cls.cryptography_key_types):
return jwk_cls(key=key)
raise errors.Error('Unsupported algorithm: {0}'.format(key.__class__))
@JWK.register
class JWKES(JWK): # pragma: no cover
# pylint: disable=abstract-class-not-used
"""ES JWK.
.. warning:: This is not yet implemented!
"""
typ = 'ES'
cryptography_key_types = (
ec.EllipticCurvePublicKey, ec.EllipticCurvePrivateKey)
required = ('crv', JWK.type_field_name, 'x', 'y')
def fields_to_partial_json(self):
raise NotImplementedError()
@classmethod
def fields_from_json(cls, jobj):
raise NotImplementedError()
def public_key(self):
raise NotImplementedError()
@JWK.register
class JWKOct(JWK):
"""Symmetric JWK."""
typ = 'oct'
__slots__ = ('key',)
required = ('k', JWK.type_field_name)
def fields_to_partial_json(self):
# TODO: An "alg" member SHOULD also be present to identify the
# algorithm intended to be used with the key, unless the
# application uses another means or convention to determine
# the algorithm used.
return {'k': json_util.encode_b64jose(self.key)}
@classmethod
def fields_from_json(cls, jobj):
return cls(key=json_util.decode_b64jose(jobj['k']))
def public_key(self):
return self
@JWK.register
class JWKRSA(JWK):
"""RSA JWK.
:ivar key: `cryptography.hazmat.primitives.rsa.RSAPrivateKey`
or `cryptography.hazmat.primitives.rsa.RSAPublicKey` wrapped
in `.ComparableRSAKey`
"""
typ = 'RSA'
cryptography_key_types = (rsa.RSAPublicKey, rsa.RSAPrivateKey)
__slots__ = ('key',)
required = ('e', JWK.type_field_name, 'n')
def __init__(self, *args, **kwargs):
if 'key' in kwargs and not isinstance(
kwargs['key'], util.ComparableRSAKey):
kwargs['key'] = util.ComparableRSAKey(kwargs['key'])
super(JWKRSA, self).__init__(*args, **kwargs)
@classmethod
def _encode_param(cls, data):
"""Encode Base64urlUInt.
:type data: long
:rtype: unicode
"""
def _leading_zeros(arg):
if len(arg) % 2:
return '0' + arg
return arg
return json_util.encode_b64jose(binascii.unhexlify(
_leading_zeros(hex(data)[2:].rstrip('L'))))
@classmethod
def _decode_param(cls, data):
"""Decode Base64urlUInt."""
try:
return int(binascii.hexlify(json_util.decode_b64jose(data)), 16)
except ValueError: # invalid literal for long() with base 16
raise errors.DeserializationError()
def public_key(self):
return type(self)(key=self.key.public_key())
@classmethod
def fields_from_json(cls, jobj):
# pylint: disable=invalid-name
n, e = (cls._decode_param(jobj[x]) for x in ('n', 'e'))
public_numbers = rsa.RSAPublicNumbers(e=e, n=n)
if 'd' not in jobj: # public key
key = public_numbers.public_key(default_backend())
else: # private key
d = cls._decode_param(jobj['d'])
if ('p' in jobj or 'q' in jobj or 'dp' in jobj or
'dq' in jobj or 'qi' in jobj or 'oth' in jobj):
# "If the producer includes any of the other private
# key parameters, then all of the others MUST be
# present, with the exception of "oth", which MUST
# only be present when more than two prime factors
# were used."
p, q, dp, dq, qi, = all_params = tuple(
jobj.get(x) for x in ('p', 'q', 'dp', 'dq', 'qi'))
if tuple(param for param in all_params if param is None):
raise errors.Error(
'Some private parameters are missing: {0}'.format(
all_params))
p, q, dp, dq, qi = tuple(
cls._decode_param(x) for x in all_params)
# TODO: check for oth
else:
# cryptography>=0.8
p, q = rsa.rsa_recover_prime_factors(n, e, d)
dp = rsa.rsa_crt_dmp1(d, p)
dq = rsa.rsa_crt_dmq1(d, q)
qi = rsa.rsa_crt_iqmp(p, q)
key = rsa.RSAPrivateNumbers(
p, q, d, dp, dq, qi, public_numbers).private_key(
default_backend())
return cls(key=key)
def fields_to_partial_json(self):
# pylint: disable=protected-access
if isinstance(self.key._wrapped, rsa.RSAPublicKey):
numbers = self.key.public_numbers()
params = {
'n': numbers.n,
'e': numbers.e,
}
else: # rsa.RSAPrivateKey
private = self.key.private_numbers()
public = self.key.public_key().public_numbers()
params = {
'n': public.n,
'e': public.e,
'd': private.d,
'p': private.p,
'q': private.q,
'dp': private.dmp1,
'dq': private.dmq1,
'qi': private.iqmp,
}
return dict((key, self._encode_param(value))
for key, value in six.iteritems(params))
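
A hedged end-to-end sketch: load a freshly generated public key as a JWK and compute its RFC 7638 thumbprint (the key is generated on the fly purely for demonstration):

import binascii

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

from acme.jose.jwk import JWK, JWKRSA

# Throwaway key, generated only for this example.
private_key = rsa.generate_private_key(
    public_exponent=65537, key_size=2048, backend=default_backend())
pem = private_key.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo)

jwk = JWK.load(pem)                           # dispatches to JWKRSA based on the key type
assert isinstance(jwk, JWKRSA)
assert jwk.to_partial_json()['kty'] == 'RSA'
print(binascii.hexlify(jwk.thumbprint()))     # SHA-256 digest of the canonical public JWK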

View File

@@ -1,191 +0,0 @@
"""Tests for acme.jose.jwk."""
import binascii
import unittest
from acme import test_util
from acme.jose import errors
from acme.jose import json_util
from acme.jose import util
DSA_PEM = test_util.load_vector('dsa512_key.pem')
RSA256_KEY = test_util.load_rsa_private_key('rsa256_key.pem')
RSA512_KEY = test_util.load_rsa_private_key('rsa512_key.pem')
class JWKTest(unittest.TestCase):
"""Tests for acme.jose.jwk.JWK."""
def test_load(self):
from acme.jose.jwk import JWK
self.assertRaises(errors.Error, JWK.load, DSA_PEM)
def test_load_subclass_wrong_type(self):
from acme.jose.jwk import JWKRSA
self.assertRaises(errors.Error, JWKRSA.load, DSA_PEM)
class JWKTestBaseMixin(object):
"""Mixin test for JWK subclass tests."""
thumbprint = NotImplemented
def test_thumbprint_private(self):
self.assertEqual(self.thumbprint, self.jwk.thumbprint())
def test_thumbprint_public(self):
self.assertEqual(self.thumbprint, self.jwk.public_key().thumbprint())
class JWKOctTest(unittest.TestCase, JWKTestBaseMixin):
"""Tests for acme.jose.jwk.JWKOct."""
thumbprint = (b"\xf3\xe7\xbe\xa8`\xd2\xdap\xe9}\x9c\xce>"
b"\xd0\xfcI\xbe\xcd\x92'\xd4o\x0e\xf41\xea"
b"\x8e(\x8a\xb2i\x1c")
def setUp(self):
from acme.jose.jwk import JWKOct
self.jwk = JWKOct(key=b'foo')
self.jobj = {'kty': 'oct', 'k': json_util.encode_b64jose(b'foo')}
def test_to_partial_json(self):
self.assertEqual(self.jwk.to_partial_json(), self.jobj)
def test_from_json(self):
from acme.jose.jwk import JWKOct
self.assertEqual(self.jwk, JWKOct.from_json(self.jobj))
def test_from_json_hashable(self):
from acme.jose.jwk import JWKOct
hash(JWKOct.from_json(self.jobj))
def test_load(self):
from acme.jose.jwk import JWKOct
self.assertEqual(self.jwk, JWKOct.load(b'foo'))
def test_public_key(self):
self.assertTrue(self.jwk.public_key() is self.jwk)
class JWKRSATest(unittest.TestCase, JWKTestBaseMixin):
"""Tests for acme.jose.jwk.JWKRSA."""
# pylint: disable=too-many-instance-attributes
thumbprint = (b'\x83K\xdc#3\x98\xca\x98\xed\xcb\x80\x80<\x0c'
b'\xf0\x95\xb9H\xb2*l\xbd$\xe5&|O\x91\xd4 \xb0Y')
def setUp(self):
from acme.jose.jwk import JWKRSA
self.jwk256 = JWKRSA(key=RSA256_KEY.public_key())
self.jwk256json = {
'kty': 'RSA',
'e': 'AQAB',
'n': 'm2Fylv-Uz7trgTW8EBHP3FQSMeZs2GNQ6VRo1sIVJEk',
}
# pylint: disable=protected-access
self.jwk256_not_comparable = JWKRSA(
key=RSA256_KEY.public_key()._wrapped)
self.jwk512 = JWKRSA(key=RSA512_KEY.public_key())
self.jwk512json = {
'kty': 'RSA',
'e': 'AQAB',
'n': 'rHVztFHtH92ucFJD_N_HW9AsdRsUuHUBBBDlHwNlRd3fp5'
'80rv2-6QWE30cWgdmJS86ObRz6lUTor4R0T-3C5Q',
}
self.private = JWKRSA(key=RSA256_KEY)
self.private_json_small = self.jwk256json.copy()
self.private_json_small['d'] = (
'lPQED_EPTV0UIBfNI3KP2d9Jlrc2mrMllmf946bu-CE')
self.private_json = self.jwk256json.copy()
self.private_json.update({
'd': 'lPQED_EPTV0UIBfNI3KP2d9Jlrc2mrMllmf946bu-CE',
'p': 'zUVNZn4lLLBD1R6NE8TKNQ',
'q': 'wcfKfc7kl5jfqXArCRSURQ',
'dp': 'CWJFq43QvT5Bm5iN8n1okQ',
'dq': 'bHh2u7etM8LKKCF2pY2UdQ',
'qi': 'oi45cEkbVoJjAbnQpFY87Q',
})
self.jwk = self.private
def test_init_auto_comparable(self):
self.assertTrue(isinstance(
self.jwk256_not_comparable.key, util.ComparableRSAKey))
self.assertEqual(self.jwk256, self.jwk256_not_comparable)
def test_encode_param_zero(self):
from acme.jose.jwk import JWKRSA
# pylint: disable=protected-access
# TODO: move encode/decode _param to separate class
self.assertEqual('AA', JWKRSA._encode_param(0))
def test_equals(self):
self.assertEqual(self.jwk256, self.jwk256)
self.assertEqual(self.jwk512, self.jwk512)
def test_not_equals(self):
self.assertNotEqual(self.jwk256, self.jwk512)
self.assertNotEqual(self.jwk512, self.jwk256)
def test_load(self):
from acme.jose.jwk import JWKRSA
self.assertEqual(self.private, JWKRSA.load(
test_util.load_vector('rsa256_key.pem')))
def test_public_key(self):
self.assertEqual(self.jwk256, self.private.public_key())
def test_to_partial_json(self):
self.assertEqual(self.jwk256.to_partial_json(), self.jwk256json)
self.assertEqual(self.jwk512.to_partial_json(), self.jwk512json)
self.assertEqual(self.private.to_partial_json(), self.private_json)
def test_from_json(self):
from acme.jose.jwk import JWK
self.assertEqual(
self.jwk256, JWK.from_json(self.jwk256json))
self.assertEqual(
self.jwk512, JWK.from_json(self.jwk512json))
self.assertEqual(self.private, JWK.from_json(self.private_json))
def test_from_json_private_small(self):
from acme.jose.jwk import JWK
self.assertEqual(self.private, JWK.from_json(self.private_json_small))
def test_from_json_missing_one_additional(self):
from acme.jose.jwk import JWK
del self.private_json['q']
self.assertRaises(errors.Error, JWK.from_json, self.private_json)
def test_from_json_hashable(self):
from acme.jose.jwk import JWK
hash(JWK.from_json(self.jwk256json))
def test_from_json_non_schema_errors(self):
# valid against schema, but still failing
from acme.jose.jwk import JWK
self.assertRaises(errors.DeserializationError, JWK.from_json,
{'kty': 'RSA', 'e': 'AQAB', 'n': ''})
self.assertRaises(errors.DeserializationError, JWK.from_json,
{'kty': 'RSA', 'e': 'AQAB', 'n': '1'})
def test_thumbprint_go_jose(self):
# https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk.go#L155
# https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk_test.go#L331-L344
# https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk_test.go#L384
from acme.jose.jwk import JWKRSA
key = JWKRSA.json_loads("""{
"kty": "RSA",
"kid": "bilbo.baggins@hobbiton.example",
"use": "sig",
"n": "n4EPtAOCc9AlkeQHPzHStgAbgs7bTZLwUBZdR8_KuKPEHLd4rHVTeT-O-XV2jRojdNhxJWTDvNd7nqQ0VEiZQHz_AJmSCpMaJMRBSFKrKb2wqVwGU_NsYOYL-QtiWN2lbzcEe6XC0dApr5ydQLrHqkHHig3RBordaZ6Aj-oBHqFEHYpPe7Tpe-OfVfHd1E6cS6M1FZcD1NNLYD5lFHpPI9bTwJlsde3uhGqC0ZCuEHg8lhzwOHrtIQbS0FVbb9k3-tVTU4fg_3L_vniUFAKwuCLqKnS2BYwdq_mzSnbLY7h_qixoR7jig3__kRhuaxwUkRz5iaiQkqgc5gHdrNP5zw",
"e": "AQAB"
}""")
self.assertEqual(
binascii.hexlify(key.thumbprint()),
b"f63838e96077ad1fc01c3f8405774dedc0641f558ebb4b40dccf5f9b6d66a932")
if __name__ == '__main__':
unittest.main() # pragma: no cover
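For orientation, here is a minimal sketch (not part of the deleted test file above) of how the RFC 7638 thumbprint checked in test_thumbprint_go_jose is produced through the public API; the key path is a placeholder:

import binascii
from acme.jose.jwk import JWKRSA

# Load an RSA key from a PEM file (placeholder path) and print the
# hex-encoded thumbprint, as compared against go-jose in the test above.
with open('rsa256_key.pem', 'rb') as pem:
    key = JWKRSA.load(pem.read())
print(binascii.hexlify(key.thumbprint()).decode())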


@@ -1,433 +0,0 @@
"""JOSE Web Signature."""
import argparse
import base64
import sys
import OpenSSL
import six
from acme.jose import b64
from acme.jose import errors
from acme.jose import json_util
from acme.jose import jwa
from acme.jose import jwk
from acme.jose import util
class MediaType(object):
"""MediaType field encoder/decoder."""
PREFIX = 'application/'
"""MIME Media Type and Content Type prefix."""
@classmethod
def decode(cls, value):
"""Decoder."""
# 4.1.10
if '/' not in value:
if ';' in value:
raise errors.DeserializationError('Unexpected semi-colon')
return cls.PREFIX + value
return value
@classmethod
def encode(cls, value):
"""Encoder."""
# 4.1.10
if ';' not in value:
assert value.startswith(cls.PREFIX)
return value[len(cls.PREFIX):]
return value
class Header(json_util.JSONObjectWithFields):
"""JOSE Header.
.. warning:: This class supports **only** Registered Header
Parameter Names (as defined in section 4.1 of the
protocol). If you need Public Header Parameter Names (4.2)
or Private Header Parameter Names (4.3), you must subclass
and override :meth:`from_json` and :meth:`to_partial_json`
appropriately.
.. warning:: This class does not support any extensions through
the "crit" (Critical) Header Parameter (4.1.11) and as a
conforming implementation, :meth:`from_json` treats its
occurrence as an error. Please subclass if you seek for
a different behaviour.
:ivar x5tS256: "x5t#S256"
:ivar str typ: MIME Media Type, inc. :const:`MediaType.PREFIX`.
:ivar str cty: Content-Type, inc. :const:`MediaType.PREFIX`.
"""
alg = json_util.Field(
'alg', decoder=jwa.JWASignature.from_json, omitempty=True)
jku = json_util.Field('jku', omitempty=True)
jwk = json_util.Field('jwk', decoder=jwk.JWK.from_json, omitempty=True)
kid = json_util.Field('kid', omitempty=True)
x5u = json_util.Field('x5u', omitempty=True)
x5c = json_util.Field('x5c', omitempty=True, default=())
x5t = json_util.Field(
'x5t', decoder=json_util.decode_b64jose, omitempty=True)
x5tS256 = json_util.Field(
'x5t#S256', decoder=json_util.decode_b64jose, omitempty=True)
typ = json_util.Field('typ', encoder=MediaType.encode,
decoder=MediaType.decode, omitempty=True)
cty = json_util.Field('cty', encoder=MediaType.encode,
decoder=MediaType.decode, omitempty=True)
crit = json_util.Field('crit', omitempty=True, default=())
def not_omitted(self):
"""Fields that would not be omitted in the JSON object."""
return dict((name, getattr(self, name))
for name, field in six.iteritems(self._fields)
if not field.omit(getattr(self, name)))
def __add__(self, other):
if not isinstance(other, type(self)):
raise TypeError('Header cannot be added to: {0}'.format(
type(other)))
not_omitted_self = self.not_omitted()
not_omitted_other = other.not_omitted()
if set(not_omitted_self).intersection(not_omitted_other):
raise TypeError('Addition of overlapping headers not defined')
not_omitted_self.update(not_omitted_other)
return type(self)(**not_omitted_self) # pylint: disable=star-args
def find_key(self):
"""Find key based on header.
.. todo:: Supports only "jwk" header parameter lookup.
:returns: (Public) key found in the header.
:rtype: .JWK
:raises acme.jose.errors.Error: if key could not be found
"""
if self.jwk is None:
raise errors.Error('No key found')
return self.jwk
@crit.decoder
def crit(unused_value):
# pylint: disable=missing-docstring,no-self-argument,no-self-use
raise errors.DeserializationError(
'"crit" is not supported, please subclass')
# x5c does NOT use JOSE Base64 (4.1.6)
@x5c.encoder # type: ignore
def x5c(value): # pylint: disable=missing-docstring,no-self-argument
return [base64.b64encode(OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_ASN1, cert.wrapped)) for cert in value]
@x5c.decoder # type: ignore
def x5c(value): # pylint: disable=missing-docstring,no-self-argument
try:
return tuple(util.ComparableX509(OpenSSL.crypto.load_certificate(
OpenSSL.crypto.FILETYPE_ASN1,
base64.b64decode(cert))) for cert in value)
except OpenSSL.crypto.Error as error:
raise errors.DeserializationError(error)
class Signature(json_util.JSONObjectWithFields):
"""JWS Signature.
:ivar combined: Combined Header (protected and unprotected,
:class:`Header`).
:ivar unicode protected: JWS protected header (Jose Base-64 decoded).
:ivar header: JWS Unprotected Header (:class:`Header`).
:ivar str signature: The signature.
"""
header_cls = Header
__slots__ = ('combined',)
protected = json_util.Field('protected', omitempty=True, default='')
header = json_util.Field(
'header', omitempty=True, default=header_cls(),
decoder=header_cls.from_json)
signature = json_util.Field(
'signature', decoder=json_util.decode_b64jose,
encoder=json_util.encode_b64jose)
@protected.encoder # type: ignore
def protected(value): # pylint: disable=missing-docstring,no-self-argument
# wrong type guess (Signature, not bytes) | pylint: disable=no-member
return json_util.encode_b64jose(value.encode('utf-8'))
@protected.decoder # type: ignore
def protected(value): # pylint: disable=missing-docstring,no-self-argument
return json_util.decode_b64jose(value).decode('utf-8')
def __init__(self, **kwargs):
if 'combined' not in kwargs:
kwargs = self._with_combined(kwargs)
super(Signature, self).__init__(**kwargs)
assert self.combined.alg is not None
@classmethod
def _with_combined(cls, kwargs):
assert 'combined' not in kwargs
header = kwargs.get('header', cls._fields['header'].default)
protected = kwargs.get('protected', cls._fields['protected'].default)
if protected:
combined = header + cls.header_cls.json_loads(protected)
else:
combined = header
kwargs['combined'] = combined
return kwargs
@classmethod
def _msg(cls, protected, payload):
return (b64.b64encode(protected.encode('utf-8')) + b'.' +
b64.b64encode(payload))
def verify(self, payload, key=None):
"""Verify.
:param JWK key: Key used for verification.
"""
key = self.combined.find_key() if key is None else key
return self.combined.alg.verify(
key=key.key, sig=self.signature,
msg=self._msg(self.protected, payload))
@classmethod
def sign(cls, payload, key, alg, include_jwk=True,
protect=frozenset(), **kwargs):
"""Sign.
:param JWK key: Key for signature.
"""
assert isinstance(key, alg.kty)
header_params = kwargs
header_params['alg'] = alg
if include_jwk:
header_params['jwk'] = key.public_key()
assert set(header_params).issubset(cls.header_cls._fields)
assert protect.issubset(cls.header_cls._fields)
protected_params = {}
for header in protect:
if header in header_params:
protected_params[header] = header_params.pop(header)
if protected_params:
# pylint: disable=star-args
protected = cls.header_cls(**protected_params).json_dumps()
else:
protected = ''
header = cls.header_cls(**header_params) # pylint: disable=star-args
signature = alg.sign(key.key, cls._msg(protected, payload))
return cls(protected=protected, header=header, signature=signature)
def fields_to_partial_json(self):
fields = super(Signature, self).fields_to_partial_json()
if not fields['header'].not_omitted():
del fields['header']
return fields
@classmethod
def fields_from_json(cls, jobj):
fields = super(Signature, cls).fields_from_json(jobj)
fields_with_combined = cls._with_combined(fields)
if 'alg' not in fields_with_combined['combined'].not_omitted():
raise errors.DeserializationError('alg not present')
return fields_with_combined
class JWS(json_util.JSONObjectWithFields):
"""JSON Web Signature.
:ivar str payload: JWS Payload.
:ivar str signature: JWS Signatures.
"""
__slots__ = ('payload', 'signatures')
signature_cls = Signature
def verify(self, key=None):
"""Verify."""
return all(sig.verify(self.payload, key) for sig in self.signatures)
@classmethod
def sign(cls, payload, **kwargs):
"""Sign."""
return cls(payload=payload, signatures=(
cls.signature_cls.sign(payload=payload, **kwargs),))
@property
def signature(self):
"""Get a singleton signature.
:rtype: `signature_cls`
"""
assert len(self.signatures) == 1
return self.signatures[0]
def to_compact(self):
"""Compact serialization.
:rtype: bytes
"""
assert len(self.signatures) == 1
assert 'alg' not in self.signature.header.not_omitted()
# ... it must be in protected
return (
b64.b64encode(self.signature.protected.encode('utf-8')) +
b'.' +
b64.b64encode(self.payload) +
b'.' +
b64.b64encode(self.signature.signature))
@classmethod
def from_compact(cls, compact):
"""Compact deserialization.
:param bytes compact:
"""
try:
protected, payload, signature = compact.split(b'.')
except ValueError:
raise errors.DeserializationError(
'Compact JWS serialization should comprise of exactly'
' 3 dot-separated components')
sig = cls.signature_cls(
protected=b64.b64decode(protected).decode('utf-8'),
signature=b64.b64decode(signature))
return cls(payload=b64.b64decode(payload), signatures=(sig,))
def to_partial_json(self, flat=True): # pylint: disable=arguments-differ
assert self.signatures
payload = json_util.encode_b64jose(self.payload)
if flat and len(self.signatures) == 1:
ret = self.signatures[0].to_partial_json()
ret['payload'] = payload
return ret
else:
return {
'payload': payload,
'signatures': self.signatures,
}
@classmethod
def from_json(cls, jobj):
if 'signature' in jobj and 'signatures' in jobj:
raise errors.DeserializationError('Flat mixed with non-flat')
elif 'signature' in jobj: # flat
return cls(payload=json_util.decode_b64jose(jobj.pop('payload')),
signatures=(cls.signature_cls.from_json(jobj),))
else:
return cls(payload=json_util.decode_b64jose(jobj['payload']),
signatures=tuple(cls.signature_cls.from_json(sig)
for sig in jobj['signatures']))
class CLI(object):
"""JWS CLI."""
@classmethod
def sign(cls, args):
"""Sign."""
key = args.alg.kty.load(args.key.read())
args.key.close()
if args.protect is None:
args.protect = []
if args.compact:
args.protect.append('alg')
sig = JWS.sign(payload=sys.stdin.read().encode(), key=key, alg=args.alg,
protect=set(args.protect))
if args.compact:
six.print_(sig.to_compact().decode('utf-8'))
else: # JSON
six.print_(sig.json_dumps_pretty())
@classmethod
def verify(cls, args):
"""Verify."""
if args.compact:
sig = JWS.from_compact(sys.stdin.read().encode())
else: # JSON
try:
sig = JWS.json_loads(sys.stdin.read())
except errors.Error as error:
six.print_(error)
return -1
if args.key is not None:
assert args.kty is not None
key = args.kty.load(args.key.read()).public_key()
args.key.close()
else:
key = None
sys.stdout.write(sig.payload)
return not sig.verify(key=key)
@classmethod
def _alg_type(cls, arg):
return jwa.JWASignature.from_json(arg)
@classmethod
def _header_type(cls, arg):
assert arg in Signature.header_cls._fields
return arg
@classmethod
def _kty_type(cls, arg):
assert arg in jwk.JWK.TYPES
return jwk.JWK.TYPES[arg]
@classmethod
def run(cls, args=sys.argv[1:]):
"""Parse arguments and sign/verify."""
parser = argparse.ArgumentParser()
parser.add_argument('--compact', action='store_true')
subparsers = parser.add_subparsers()
parser_sign = subparsers.add_parser('sign')
parser_sign.set_defaults(func=cls.sign)
parser_sign.add_argument(
'-k', '--key', type=argparse.FileType('rb'), required=True)
parser_sign.add_argument(
'-a', '--alg', type=cls._alg_type, default=jwa.RS256)
parser_sign.add_argument(
'-p', '--protect', action='append', type=cls._header_type)
parser_verify = subparsers.add_parser('verify')
parser_verify.set_defaults(func=cls.verify)
parser_verify.add_argument(
'-k', '--key', type=argparse.FileType('rb'), required=False)
parser_verify.add_argument(
'--kty', type=cls._kty_type, required=False)
parsed = parser.parse_args(args)
return parsed.func(parsed)
if __name__ == '__main__':
exit(CLI.run()) # pragma: no cover
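A short usage sketch of the signing API defined above (the key path is a placeholder): protecting 'alg' is what later allows compact serialization, since to_compact() asserts that 'alg' is not left in the unprotected header.

from acme.jose import jwa
from acme.jose.jwk import JWKRSA
from acme.jose.jws import JWS

with open('rsa512_key.pem', 'rb') as pem:   # placeholder key path
    key = JWKRSA.load(pem.read())

# Sign with RS256, moving 'alg' into the protected header.
jws = JWS.sign(payload=b'hello', key=key, alg=jwa.RS256,
               protect=frozenset(['alg']))

compact = jws.to_compact()                  # b'<protected>.<payload>.<signature>'
assert JWS.from_compact(compact).verify(key=key.public_key())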


@@ -1,239 +0,0 @@
"""Tests for acme.jose.jws."""
import base64
import unittest
import mock
import OpenSSL
from acme import test_util
from acme.jose import errors
from acme.jose import json_util
from acme.jose import jwa
from acme.jose import jwk
CERT = test_util.load_comparable_cert('cert.pem')
KEY = jwk.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
class MediaTypeTest(unittest.TestCase):
"""Tests for acme.jose.jws.MediaType."""
def test_decode(self):
from acme.jose.jws import MediaType
self.assertEqual('application/app', MediaType.decode('application/app'))
self.assertEqual('application/app', MediaType.decode('app'))
self.assertRaises(
errors.DeserializationError, MediaType.decode, 'app;foo')
def test_encode(self):
from acme.jose.jws import MediaType
self.assertEqual('app', MediaType.encode('application/app'))
self.assertEqual('application/app;foo',
MediaType.encode('application/app;foo'))
class HeaderTest(unittest.TestCase):
"""Tests for acme.jose.jws.Header."""
def setUp(self):
from acme.jose.jws import Header
self.header1 = Header(jwk='foo')
self.header2 = Header(jwk='bar')
self.crit = Header(crit=('a', 'b'))
self.empty = Header()
def test_add_non_empty(self):
from acme.jose.jws import Header
self.assertEqual(Header(jwk='foo', crit=('a', 'b')),
self.header1 + self.crit)
def test_add_empty(self):
self.assertEqual(self.header1, self.header1 + self.empty)
self.assertEqual(self.header1, self.empty + self.header1)
def test_add_overlapping_error(self):
self.assertRaises(TypeError, self.header1.__add__, self.header2)
def test_add_wrong_type_error(self):
self.assertRaises(TypeError, self.header1.__add__, 'xxx')
def test_crit_decode_always_errors(self):
from acme.jose.jws import Header
self.assertRaises(errors.DeserializationError, Header.from_json,
{'crit': ['a', 'b']})
def test_x5c_decoding(self):
from acme.jose.jws import Header
header = Header(x5c=(CERT, CERT))
jobj = header.to_partial_json()
cert_asn1 = OpenSSL.crypto.dump_certificate(
OpenSSL.crypto.FILETYPE_ASN1, CERT.wrapped)
cert_b64 = base64.b64encode(cert_asn1)
self.assertEqual(jobj, {'x5c': [cert_b64, cert_b64]})
self.assertEqual(header, Header.from_json(jobj))
jobj['x5c'][0] = base64.b64encode(b'xxx' + cert_asn1)
self.assertRaises(errors.DeserializationError, Header.from_json, jobj)
def test_find_key(self):
self.assertEqual('foo', self.header1.find_key())
self.assertEqual('bar', self.header2.find_key())
self.assertRaises(errors.Error, self.crit.find_key)
class SignatureTest(unittest.TestCase):
"""Tests for acme.jose.jws.Signature."""
def test_from_json(self):
from acme.jose.jws import Header
from acme.jose.jws import Signature
self.assertEqual(
Signature(signature=b'foo', header=Header(alg=jwa.RS256)),
Signature.from_json(
{'signature': 'Zm9v', 'header': {'alg': 'RS256'}}))
def test_from_json_no_alg_error(self):
from acme.jose.jws import Signature
self.assertRaises(errors.DeserializationError,
Signature.from_json, {'signature': 'foo'})
class JWSTest(unittest.TestCase):
"""Tests for acme.jose.jws.JWS."""
def setUp(self):
self.privkey = KEY
self.pubkey = self.privkey.public_key()
from acme.jose.jws import JWS
self.unprotected = JWS.sign(
payload=b'foo', key=self.privkey, alg=jwa.RS256)
self.protected = JWS.sign(
payload=b'foo', key=self.privkey, alg=jwa.RS256,
protect=frozenset(['jwk', 'alg']))
self.mixed = JWS.sign(
payload=b'foo', key=self.privkey, alg=jwa.RS256,
protect=frozenset(['alg']))
def test_pubkey_jwk(self):
self.assertEqual(self.unprotected.signature.combined.jwk, self.pubkey)
self.assertEqual(self.protected.signature.combined.jwk, self.pubkey)
self.assertEqual(self.mixed.signature.combined.jwk, self.pubkey)
def test_sign_unprotected(self):
self.assertTrue(self.unprotected.verify())
def test_sign_protected(self):
self.assertTrue(self.protected.verify())
def test_sign_mixed(self):
self.assertTrue(self.mixed.verify())
def test_compact_lost_unprotected(self):
compact = self.mixed.to_compact()
self.assertEqual(
b'eyJhbGciOiAiUlMyNTYifQ.Zm9v.OHdxFVj73l5LpxbFp1AmYX4yJM0Pyb'
b'_893n1zQjpim_eLS5J1F61lkvrCrCDErTEJnBGOGesJ72M7b6Ve1cAJA',
compact)
from acme.jose.jws import JWS
mixed = JWS.from_compact(compact)
self.assertNotEqual(self.mixed, mixed)
self.assertEqual(
set(['alg']), set(mixed.signature.combined.not_omitted()))
def test_from_compact_missing_components(self):
from acme.jose.jws import JWS
self.assertRaises(errors.DeserializationError, JWS.from_compact, b'.')
def test_json_omitempty(self):
protected_jobj = self.protected.to_partial_json(flat=True)
unprotected_jobj = self.unprotected.to_partial_json(flat=True)
self.assertTrue('protected' not in unprotected_jobj)
self.assertTrue('header' not in protected_jobj)
unprotected_jobj['header'] = unprotected_jobj['header'].to_json()
from acme.jose.jws import JWS
self.assertEqual(JWS.from_json(protected_jobj), self.protected)
self.assertEqual(JWS.from_json(unprotected_jobj), self.unprotected)
def test_json_flat(self):
jobj_to = {
'signature': json_util.encode_b64jose(
self.mixed.signature.signature),
'payload': json_util.encode_b64jose(b'foo'),
'header': self.mixed.signature.header,
'protected': json_util.encode_b64jose(
self.mixed.signature.protected.encode('utf-8')),
}
jobj_from = jobj_to.copy()
jobj_from['header'] = jobj_from['header'].to_json()
self.assertEqual(self.mixed.to_partial_json(flat=True), jobj_to)
from acme.jose.jws import JWS
self.assertEqual(self.mixed, JWS.from_json(jobj_from))
def test_json_not_flat(self):
jobj_to = {
'signatures': (self.mixed.signature,),
'payload': json_util.encode_b64jose(b'foo'),
}
jobj_from = jobj_to.copy()
jobj_from['signatures'] = [jobj_to['signatures'][0].to_json()]
self.assertEqual(self.mixed.to_partial_json(flat=False), jobj_to)
from acme.jose.jws import JWS
self.assertEqual(self.mixed, JWS.from_json(jobj_from))
def test_from_json_mixed_flat(self):
from acme.jose.jws import JWS
self.assertRaises(errors.DeserializationError, JWS.from_json,
{'signatures': (), 'signature': 'foo'})
def test_from_json_hashable(self):
from acme.jose.jws import JWS
hash(JWS.from_json(self.mixed.to_json()))
class CLITest(unittest.TestCase):
def setUp(self):
self.key_path = test_util.vector_path('rsa512_key.pem')
def test_unverified(self):
from acme.jose.jws import CLI
with mock.patch('sys.stdin') as sin:
sin.read.return_value = '{"payload": "foo", "signature": "xxx"}'
with mock.patch('sys.stdout'):
self.assertEqual(-1, CLI.run(['verify']))
def test_json(self):
from acme.jose.jws import CLI
with mock.patch('sys.stdin') as sin:
sin.read.return_value = 'foo'
with mock.patch('sys.stdout') as sout:
CLI.run(['sign', '-k', self.key_path, '-a', 'RS256',
'-p', 'jwk'])
sin.read.return_value = sout.write.mock_calls[0][1][0]
self.assertEqual(0, CLI.run(['verify']))
def test_compact(self):
from acme.jose.jws import CLI
with mock.patch('sys.stdin') as sin:
sin.read.return_value = 'foo'
with mock.patch('sys.stdout') as sout:
CLI.run(['--compact', 'sign', '-k', self.key_path])
sin.read.return_value = sout.write.mock_calls[0][1][0]
self.assertEqual(0, CLI.run([
'--compact', 'verify', '--kty', 'RSA',
'-k', self.key_path]))
if __name__ == '__main__':
unittest.main() # pragma: no cover


@@ -1,226 +0,0 @@
"""JOSE utilities."""
import collections
from cryptography.hazmat.primitives.asymmetric import rsa
import OpenSSL
import six
class abstractclassmethod(classmethod):
# pylint: disable=invalid-name,too-few-public-methods
"""Descriptor for an abstract classmethod.
It augments the :mod:`abc` framework with an abstract
classmethod. This is implemented as :class:`abc.abstractclassmethod`
in the standard Python library starting with version 3.2.
This particular implementation, allegedly based on Python 3.3 source
code, is stolen from
http://stackoverflow.com/questions/11217878/python-2-7-combine-abc-abstractmethod-and-classmethod.
"""
__isabstractmethod__ = True
def __init__(self, target):
target.__isabstractmethod__ = True
super(abstractclassmethod, self).__init__(target)
class ComparableX509(object): # pylint: disable=too-few-public-methods
"""Wrapper for OpenSSL.crypto.X509** objects that supports __eq__.
:ivar wrapped: Wrapped certificate or certificate request.
:type wrapped: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
"""
def __init__(self, wrapped):
assert isinstance(wrapped, OpenSSL.crypto.X509) or isinstance(
wrapped, OpenSSL.crypto.X509Req)
self.wrapped = wrapped
def __getattr__(self, name):
return getattr(self.wrapped, name)
def _dump(self, filetype=OpenSSL.crypto.FILETYPE_ASN1):
"""Dumps the object into a buffer with the specified encoding.
:param int filetype: The desired encoding. Should be one of
`OpenSSL.crypto.FILETYPE_ASN1`,
`OpenSSL.crypto.FILETYPE_PEM`, or
`OpenSSL.crypto.FILETYPE_TEXT`.
:returns: Encoded X509 object.
:rtype: str
"""
if isinstance(self.wrapped, OpenSSL.crypto.X509):
func = OpenSSL.crypto.dump_certificate
else: # assert in __init__ makes sure this is X509Req
func = OpenSSL.crypto.dump_certificate_request
return func(filetype, self.wrapped)
def __eq__(self, other):
if not isinstance(other, self.__class__):
return NotImplemented
# pylint: disable=protected-access
return self._dump() == other._dump()
def __hash__(self):
return hash((self.__class__, self._dump()))
def __ne__(self, other):
return not self == other
def __repr__(self):
return '<{0}({1!r})>'.format(self.__class__.__name__, self.wrapped)
class ComparableKey(object): # pylint: disable=too-few-public-methods
"""Comparable wrapper for `cryptography` keys.
See https://github.com/pyca/cryptography/issues/2122.
"""
__hash__ = NotImplemented
def __init__(self, wrapped):
self._wrapped = wrapped
def __getattr__(self, name):
return getattr(self._wrapped, name)
def __eq__(self, other):
# pylint: disable=protected-access
if (not isinstance(other, self.__class__) or
self._wrapped.__class__ is not other._wrapped.__class__):
return NotImplemented
elif hasattr(self._wrapped, 'private_numbers'):
return self.private_numbers() == other.private_numbers()
elif hasattr(self._wrapped, 'public_numbers'):
return self.public_numbers() == other.public_numbers()
else:
return NotImplemented
def __ne__(self, other):
return not self == other
def __repr__(self):
return '<{0}({1!r})>'.format(self.__class__.__name__, self._wrapped)
def public_key(self):
"""Get wrapped public key."""
return self.__class__(self._wrapped.public_key())
class ComparableRSAKey(ComparableKey): # pylint: disable=too-few-public-methods
"""Wrapper for `cryptography` RSA keys.
Wraps around:
- `cryptography.hazmat.primitives.asymmetric.RSAPrivateKey`
- `cryptography.hazmat.primitives.asymmetric.RSAPublicKey`
"""
def __hash__(self):
# public_numbers() hasn't got stable hash!
# https://github.com/pyca/cryptography/issues/2143
if isinstance(self._wrapped, rsa.RSAPrivateKeyWithSerialization):
priv = self.private_numbers()
pub = priv.public_numbers
return hash((self.__class__, priv.p, priv.q, priv.dmp1,
priv.dmq1, priv.iqmp, pub.n, pub.e))
elif isinstance(self._wrapped, rsa.RSAPublicKeyWithSerialization):
pub = self.public_numbers()
return hash((self.__class__, pub.n, pub.e))
class ImmutableMap(collections.Mapping, collections.Hashable): # type: ignore
# pylint: disable=too-few-public-methods
"""Immutable key to value mapping with attribute access."""
__slots__ = ()
"""Must be overridden in subclasses."""
def __init__(self, **kwargs):
if set(kwargs) != set(self.__slots__):
raise TypeError(
'__init__() takes exactly the following arguments: {0} '
'({1} given)'.format(', '.join(self.__slots__),
', '.join(kwargs) if kwargs else 'none'))
for slot in self.__slots__:
object.__setattr__(self, slot, kwargs.pop(slot))
def update(self, **kwargs):
"""Return updated map."""
items = dict(self)
items.update(kwargs)
return type(self)(**items) # pylint: disable=star-args
def __getitem__(self, key):
try:
return getattr(self, key)
except AttributeError:
raise KeyError(key)
def __iter__(self):
return iter(self.__slots__)
def __len__(self):
return len(self.__slots__)
def __hash__(self):
return hash(tuple(getattr(self, slot) for slot in self.__slots__))
def __setattr__(self, name, value):
raise AttributeError("can't set attribute")
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, ', '.join(
'{0}={1!r}'.format(key, value)
for key, value in six.iteritems(self)))
class frozendict(collections.Mapping, collections.Hashable): # type: ignore
# pylint: disable=invalid-name,too-few-public-methods
"""Frozen dictionary."""
__slots__ = ('_items', '_keys')
def __init__(self, *args, **kwargs):
if kwargs and not args:
items = dict(kwargs)
elif len(args) == 1 and isinstance(args[0], collections.Mapping):
items = args[0]
else:
raise TypeError()
# TODO: support generators/iterators
object.__setattr__(self, '_items', items)
object.__setattr__(self, '_keys', tuple(sorted(six.iterkeys(items))))
def __getitem__(self, key):
return self._items[key]
def __iter__(self):
return iter(self._keys)
def __len__(self):
return len(self._items)
def _sorted_items(self):
return tuple((key, self[key]) for key in self._keys)
def __hash__(self):
return hash(self._sorted_items())
def __getattr__(self, name):
try:
return self._items[name]
except KeyError:
raise AttributeError(name)
def __setattr__(self, name, value):
raise AttributeError("can't set attribute")
def __repr__(self):
return 'frozendict({0})'.format(', '.join('{0}={1!r}'.format(
key, value) for key, value in self._sorted_items()))
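A small sketch of the ImmutableMap and frozendict contracts documented above; the Point subclass is purely illustrative:

from acme.jose.util import ImmutableMap, frozendict

class Point(ImmutableMap):
    """Illustrative subclass: the __slots__ names are the only allowed kwargs."""
    __slots__ = ('x', 'y')

p = Point(x=1, y=2)
assert p.x == p['x'] == 1                     # attribute and mapping access
assert p.update(x=5) == Point(x=5, y=2)       # update() returns a new instance
assert hash(frozendict(a=1)) == hash(frozendict({'a': 1}))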


@@ -1,199 +0,0 @@
"""Tests for acme.jose.util."""
import functools
import unittest
import six
from acme import test_util
class ComparableX509Test(unittest.TestCase):
"""Tests for acme.jose.util.ComparableX509."""
def setUp(self):
# test_util.load_comparable_{csr,cert} return ComparableX509
self.req1 = test_util.load_comparable_csr('csr.pem')
self.req2 = test_util.load_comparable_csr('csr.pem')
self.req_other = test_util.load_comparable_csr('csr-san.pem')
self.cert1 = test_util.load_comparable_cert('cert.pem')
self.cert2 = test_util.load_comparable_cert('cert.pem')
self.cert_other = test_util.load_comparable_cert('cert-san.pem')
def test_getattr_proxy(self):
self.assertTrue(self.cert1.has_expired())
def test_eq(self):
self.assertEqual(self.req1, self.req2)
self.assertEqual(self.cert1, self.cert2)
def test_ne(self):
self.assertNotEqual(self.req1, self.req_other)
self.assertNotEqual(self.cert1, self.cert_other)
def test_ne_wrong_types(self):
self.assertNotEqual(self.req1, 5)
self.assertNotEqual(self.cert1, 5)
def test_hash(self):
self.assertEqual(hash(self.req1), hash(self.req2))
self.assertNotEqual(hash(self.req1), hash(self.req_other))
self.assertEqual(hash(self.cert1), hash(self.cert2))
self.assertNotEqual(hash(self.cert1), hash(self.cert_other))
def test_repr(self):
for x509 in self.req1, self.cert1:
self.assertEqual(repr(x509),
'<ComparableX509({0!r})>'.format(x509.wrapped))
class ComparableRSAKeyTest(unittest.TestCase):
"""Tests for acme.jose.util.ComparableRSAKey."""
def setUp(self):
# test_util.load_rsa_private_key returns ComparableRSAKey
self.key = test_util.load_rsa_private_key('rsa256_key.pem')
self.key_same = test_util.load_rsa_private_key('rsa256_key.pem')
self.key2 = test_util.load_rsa_private_key('rsa512_key.pem')
def test_getattr_proxy(self):
self.assertEqual(256, self.key.key_size)
def test_eq(self):
self.assertEqual(self.key, self.key_same)
def test_ne(self):
self.assertNotEqual(self.key, self.key2)
def test_ne_different_types(self):
self.assertNotEqual(self.key, 5)
def test_ne_not_wrapped(self):
# pylint: disable=protected-access
self.assertNotEqual(self.key, self.key_same._wrapped)
def test_ne_no_serialization(self):
from acme.jose.util import ComparableRSAKey
self.assertNotEqual(ComparableRSAKey(5), ComparableRSAKey(5))
def test_hash(self):
self.assertTrue(isinstance(hash(self.key), int))
self.assertEqual(hash(self.key), hash(self.key_same))
self.assertNotEqual(hash(self.key), hash(self.key2))
def test_repr(self):
self.assertTrue(repr(self.key).startswith(
'<ComparableRSAKey(<cryptography.hazmat.'))
def test_public_key(self):
from acme.jose.util import ComparableRSAKey
self.assertTrue(isinstance(self.key.public_key(), ComparableRSAKey))
class ImmutableMapTest(unittest.TestCase):
"""Tests for acme.jose.util.ImmutableMap."""
def setUp(self):
# pylint: disable=invalid-name,too-few-public-methods
# pylint: disable=missing-docstring
from acme.jose.util import ImmutableMap
class A(ImmutableMap):
__slots__ = ('x', 'y')
class B(ImmutableMap):
__slots__ = ('x', 'y')
self.A = A
self.B = B
self.a1 = self.A(x=1, y=2)
self.a1_swap = self.A(y=2, x=1)
self.a2 = self.A(x=3, y=4)
self.b = self.B(x=1, y=2)
def test_update(self):
self.assertEqual(self.A(x=2, y=2), self.a1.update(x=2))
self.assertEqual(self.a2, self.a1.update(x=3, y=4))
def test_get_missing_item_raises_key_error(self):
self.assertRaises(KeyError, self.a1.__getitem__, 'z')
def test_order_of_args_does_not_matter(self):
self.assertEqual(self.a1, self.a1_swap)
def test_type_error_on_missing(self):
self.assertRaises(TypeError, self.A, x=1)
self.assertRaises(TypeError, self.A, y=2)
def test_type_error_on_unrecognized(self):
self.assertRaises(TypeError, self.A, x=1, z=2)
self.assertRaises(TypeError, self.A, x=1, y=2, z=3)
def test_get_attr(self):
self.assertEqual(1, self.a1.x)
self.assertEqual(2, self.a1.y)
self.assertEqual(1, self.a1_swap.x)
self.assertEqual(2, self.a1_swap.y)
def test_set_attr_raises_attribute_error(self):
self.assertRaises(
AttributeError, functools.partial(self.a1.__setattr__, 'x'), 10)
def test_equal(self):
self.assertEqual(self.a1, self.a1)
self.assertEqual(self.a2, self.a2)
self.assertNotEqual(self.a1, self.a2)
def test_hash(self):
self.assertEqual(hash((1, 2)), hash(self.a1))
def test_unhashable(self):
self.assertRaises(TypeError, self.A(x=1, y={}).__hash__)
def test_repr(self):
self.assertEqual('A(x=1, y=2)', repr(self.a1))
self.assertEqual('A(x=1, y=2)', repr(self.a1_swap))
self.assertEqual('B(x=1, y=2)', repr(self.b))
self.assertEqual("B(x='foo', y='bar')", repr(self.B(x='foo', y='bar')))
class frozendictTest(unittest.TestCase): # pylint: disable=invalid-name
"""Tests for acme.jose.util.frozendict."""
def setUp(self):
from acme.jose.util import frozendict
self.fdict = frozendict(x=1, y='2')
def test_init_dict(self):
from acme.jose.util import frozendict
self.assertEqual(self.fdict, frozendict({'x': 1, 'y': '2'}))
def test_init_other_raises_type_error(self):
from acme.jose.util import frozendict
# specifically fail for generators...
self.assertRaises(TypeError, frozendict, six.iteritems({'a': 'b'}))
def test_len(self):
self.assertEqual(2, len(self.fdict))
def test_hash(self):
self.assertTrue(isinstance(hash(self.fdict), int))
def test_getattr_proxy(self):
self.assertEqual(1, self.fdict.x)
self.assertEqual('2', self.fdict.y)
def test_getattr_raises_attribute_error(self):
self.assertRaises(AttributeError, self.fdict.__getattr__, 'z')
def test_setattr_immutable(self):
self.assertRaises(AttributeError, self.fdict.__setattr__, 'z', 3)
def test_repr(self):
self.assertEqual("frozendict(x=1, y='2')", repr(self.fdict))
if __name__ == '__main__':
unittest.main() # pragma: no cover


@@ -1,10 +1,10 @@
"""ACME-specific JWS.
The JWS implementation in acme.jose only implements the base JOSE standard. In
The JWS implementation in josepy only implements the base JOSE standard. In
order to support the new header fields defined in ACME, this module defines some
ACME-specific classes that layer on top of acme.jose.
ACME-specific classes that layer on top of josepy.
"""
from acme import jose
import josepy as jose
class Header(jose.Header):


@@ -1,7 +1,8 @@
"""Tests for acme.jws."""
import unittest
from acme import jose
import josepy as jose
from acme import test_util


@@ -2,10 +2,11 @@
import collections
import six
import josepy as jose
from acme import challenges
from acme import errors
from acme import fields
from acme import jose
from acme import util
OLD_ERROR_PREFIX = "urn:acme:error:"
@@ -170,9 +171,30 @@ class Directory(jose.JSONDeSerializable):
class Meta(jose.JSONObjectWithFields):
"""Directory Meta."""
terms_of_service = jose.Field('terms-of-service', omitempty=True)
_terms_of_service = jose.Field('terms-of-service', omitempty=True)
_terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
website = jose.Field('website', omitempty=True)
caa_identities = jose.Field('caa-identities', omitempty=True)
caa_identities = jose.Field('caaIdentities', omitempty=True)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
# pylint: disable=star-args
super(Directory.Meta, self).__init__(**kwargs)
@property
def terms_of_service(self):
"""URL for the CA TOS"""
return self._terms_of_service or self._terms_of_service_v2
def __iter__(self):
# When iterating over fields, use the external name 'terms_of_service' instead of
# the internal '_terms_of_service'.
for name in super(Directory.Meta, self).__iter__():
yield name[1:] if name == '_terms_of_service' else name
def _internal_name(self, name):
return '_' + name if name == 'terms_of_service' else name
@classmethod
def _canon_key(cls, key):
@@ -238,7 +260,7 @@ class ResourceBody(jose.JSONObjectWithFields):
class Registration(ResourceBody):
"""Registration Resource Body.
:ivar acme.jose.jwk.JWK key: Public key.
:ivar josepy.jwk.JWK key: Public key.
:ivar tuple contact: Contact information following ACME spec,
`tuple` of `unicode`.
:ivar unicode agreement:
@@ -250,6 +272,7 @@ class Registration(ResourceBody):
contact = jose.Field('contact', omitempty=True, default=())
agreement = jose.Field('agreement', omitempty=True)
status = jose.Field('status', omitempty=True)
terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
phone_prefix = 'tel:'
email_prefix = 'mailto:'
@@ -325,13 +348,26 @@ class ChallengeBody(ResourceBody):
"""
__slots__ = ('chall',)
uri = jose.Field('uri')
# ACMEv1 has a "uri" field in challenges. ACMEv2 has a "url" field. This
# challenge object supports either one, but should be accessed through the
# name "uri". In Client.answer_challenge, whichever one is set will be
# used.
_uri = jose.Field('uri', omitempty=True, default=None)
_url = jose.Field('url', omitempty=True, default=None)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True, default=STATUS_PENDING)
validated = fields.RFC3339Field('validated', omitempty=True)
error = jose.Field('error', decoder=Error.from_json,
omitempty=True, default=None)
def __init__(self, **kwargs):
kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
# pylint: disable=star-args
super(ChallengeBody, self).__init__(**kwargs)
def encode(self, name):
return super(ChallengeBody, self).encode(self._internal_name(name))
def to_partial_json(self):
jobj = super(ChallengeBody, self).to_partial_json()
jobj.update(self.chall.to_partial_json())
@@ -343,9 +379,23 @@ class ChallengeBody(ResourceBody):
jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
return jobj_fields
@property
def uri(self):
"""The URL of this challenge."""
return self._url or self._uri
def __getattr__(self, name):
return getattr(self.chall, name)
def __iter__(self):
# When iterating over fields, use the external name 'uri' instead of
# the internal '_uri'.
for name in super(ChallengeBody, self).__iter__():
yield name[1:] if name == '_uri' else name
def _internal_name(self, name):
return '_' + name if name == 'uri' else name
class ChallengeResource(Resource):
"""Challenge Resource.
@@ -358,10 +408,10 @@ class ChallengeResource(Resource):
authzr_uri = jose.Field('authzr_uri')
@property
def uri(self): # pylint: disable=missing-docstring,no-self-argument
# bug? 'method already defined line None'
# pylint: disable=function-redefined
return self.body.uri # pylint: disable=no-member
def uri(self):
"""The URL of the challenge body."""
# pylint: disable=function-redefined,no-member
return self.body.uri
class Authorization(ResourceBody):
@@ -385,6 +435,7 @@ class Authorization(ResourceBody):
# be absent'... then acme-spec gives example with 'expires'
# present... That's confusing!
expires = fields.RFC3339Field('expires', omitempty=True)
wildcard = jose.Field('wildcard', omitempty=True)
@challenges.decoder
def challenges(value): # pylint: disable=missing-docstring,no-self-argument
@@ -419,7 +470,7 @@ class AuthorizationResource(ResourceWithURI):
class CertificateRequest(jose.JSONObjectWithFields):
"""ACME new-cert request.
:ivar acme.jose.util.ComparableX509 csr:
:ivar josepy.util.ComparableX509 csr:
`OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
"""
@@ -431,7 +482,7 @@ class CertificateRequest(jose.JSONObjectWithFields):
class CertificateResource(ResourceWithURI):
"""Certificate Resource.
:ivar acme.jose.util.ComparableX509 body:
:ivar josepy.util.ComparableX509 body:
`OpenSSL.crypto.X509` wrapped in `.ComparableX509`
:ivar unicode cert_chain_uri: URI found in the 'up' ``Link`` header
:ivar tuple authzrs: `tuple` of `AuthorizationResource`.
@@ -454,3 +505,50 @@ class Revocation(jose.JSONObjectWithFields):
certificate = jose.Field(
'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
reason = jose.Field('reason')
class Order(ResourceBody):
"""Order Resource Body.
:ivar list of .Identifier: List of identifiers for the certificate.
:ivar acme.messages.Status status:
:ivar list of str authorizations: URLs of authorizations.
:ivar str certificate: URL to download certificate as a fullchain PEM.
:ivar str finalize: URL to POST to in order to request issuance once all
authorizations have "valid" status.
:ivar datetime.datetime expires: When the order expires.
:ivar .Error error: Any error that occurred during finalization, if applicable.
"""
identifiers = jose.Field('identifiers', omitempty=True)
status = jose.Field('status', decoder=Status.from_json,
omitempty=True, default=STATUS_PENDING)
authorizations = jose.Field('authorizations', omitempty=True)
certificate = jose.Field('certificate', omitempty=True)
finalize = jose.Field('finalize', omitempty=True)
expires = fields.RFC3339Field('expires', omitempty=True)
error = jose.Field('error', omitempty=True, decoder=Error.from_json)
@identifiers.decoder
def identifiers(value): # pylint: disable=missing-docstring,no-self-argument
return tuple(Identifier.from_json(identifier) for identifier in value)
class OrderResource(ResourceWithURI):
"""Order Resource.
:ivar acme.messages.Order body:
:ivar str csr_pem: The CSR this Order will be finalized with.
:ivar list of acme.messages.AuthorizationResource authorizations:
Fully-fetched AuthorizationResource objects.
:ivar str fullchain_pem: The fetched contents of the certificate URL
produced once the order was finalized, if it's present.
"""
body = jose.Field('body', decoder=Order.from_json)
csr_pem = jose.Field('csr_pem', omitempty=True)
authorizations = jose.Field('authorizations')
fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
@Directory.register
class NewOrder(Order):
"""New order."""
resource_type = 'new-order'
resource = fields.Resource(resource_type)
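Two behavioural notes on the additions above: ChallengeBody now stores either the ACMEv1 'uri' field or the ACMEv2 'url' field but exposes both through the single .uri property, and Directory.Meta accepts both spellings of the terms-of-service key. A hedged sketch of the latter:

from acme.messages import Directory

# ACMEv2 spelling
meta_v2 = Directory.Meta.from_json({'termsOfService': 'https://ca.example/terms'})
# ACMEv1 spelling still deserializes into the same property
meta_v1 = Directory.Meta.from_json({'terms-of-service': 'https://ca.example/terms'})
assert meta_v2.terms_of_service == meta_v1.terms_of_service == 'https://ca.example/terms'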

View File

@@ -1,10 +1,10 @@
"""Tests for acme.messages."""
import unittest
import josepy as jose
import mock
from acme import challenges
from acme import jose
from acme import test_util
@@ -71,6 +71,12 @@ class ErrorTest(unittest.TestCase):
self.assertTrue(is_acme_error(Error.with_code('badCSR')))
self.assertRaises(ValueError, Error.with_code, 'not an ACME error code')
def test_str(self):
self.assertEqual(
str(self.error),
u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}"
.format(self.error))
class ConstantTest(unittest.TestCase):
"""Tests for acme.messages._Constant."""
@@ -151,7 +157,7 @@ class DirectoryTest(unittest.TestCase):
'meta': {
'terms-of-service': 'https://example.com/acme/terms',
'website': 'https://www.example.com/',
'caa-identities': ['example.com'],
'caaIdentities': ['example.com'],
},
})
@@ -159,6 +165,13 @@ class DirectoryTest(unittest.TestCase):
from acme.messages import Directory
Directory.from_json({'foo': 'bar'})
def test_iter_meta(self):
result = False
for k in self.dir.meta:
if k == 'terms_of_service':
result = self.dir.meta[k] == 'https://example.com/acme/terms'
self.assertTrue(result)
class RegistrationTest(unittest.TestCase):
"""Tests for acme.messages.Registration."""
@@ -277,6 +290,9 @@ class ChallengeBodyTest(unittest.TestCase):
'detail': 'Unable to communicate with DNS server',
}
def test_encode(self):
self.assertEqual(self.challb.encode('uri'), self.challb.uri)
def test_to_partial_json(self):
self.assertEqual(self.jobj_to, self.challb.to_partial_json())
@@ -392,5 +408,21 @@ class RevocationTest(unittest.TestCase):
hash(Revocation.from_json(self.rev.to_json()))
class OrderResourceTest(unittest.TestCase):
"""Tests for acme.messages.OrderResource."""
def setUp(self):
from acme.messages import OrderResource
self.regr = OrderResource(
body=mock.sentinel.body, uri=mock.sentinel.uri)
def test_to_partial_json(self):
self.assertEqual(self.regr.to_json(), {
'body': mock.sentinel.body,
'uri': mock.sentinel.uri,
'authorizations': None,
})
if __name__ == '__main__':
unittest.main() # pragma: no cover


@@ -10,20 +10,19 @@ import unittest
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from acme import challenges
from acme import crypto_util
from acme import errors
from acme import jose
from acme import test_util
class TLSServerTest(unittest.TestCase):
"""Tests for acme.standalone.TLSServer."""
_multiprocess_can_split_ = True
def test_bind(self): # pylint: disable=no-self-use
from acme.standalone import TLSServer
@@ -42,7 +41,6 @@ class TLSServerTest(unittest.TestCase):
class TLSSNI01ServerTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01Server."""
_multiprocess_can_split_ = True
def setUp(self):
self.certs = {b'localhost': (
@@ -70,7 +68,6 @@ class TLSSNI01ServerTest(unittest.TestCase):
class HTTP01ServerTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01Server."""
_multiprocess_can_split_ = True
def setUp(self):
self.account_key = jose.JWK.load(
@@ -124,7 +121,6 @@ class HTTP01ServerTest(unittest.TestCase):
class BaseDualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.BaseDualNetworkedServers."""
_multiprocess_can_split_ = True
class SingleProtocolServer(socketserver.TCPServer):
"""Server that only serves on a single protocol. FreeBSD has this behavior for AF_INET6."""
@@ -174,7 +170,6 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
"""Test for acme.standalone.TLSSNI01DualNetworkedServers."""
_multiprocess_can_split_ = True
def setUp(self):
self.certs = {b'localhost': (
@@ -202,7 +197,6 @@ class TLSSNI01DualNetworkedServersTest(unittest.TestCase):
class HTTP01DualNetworkedServersTest(unittest.TestCase):
"""Tests for acme.standalone.HTTP01DualNetworkedServers."""
_multiprocess_can_split_ = True
def setUp(self):
self.account_key = jose.JWK.load(
@@ -254,7 +248,6 @@ class HTTP01DualNetworkedServersTest(unittest.TestCase):
class TestSimpleTLSSNI01Server(unittest.TestCase):
"""Tests for acme.standalone.simple_tls_sni_01_server."""
_multiprocess_can_split_ = True
def setUp(self):
# mirror ../examples/standalone


@@ -9,10 +9,9 @@ import unittest
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
import OpenSSL
from acme import jose
def vector_path(*names):
"""Path to a test vector."""

BIN acme/acme/testdata/cert-nocn.der (new file)
Binary file not shown.


@@ -1,10 +1,7 @@
JOSE
----
.. automodule:: acme.jose
:members:
The ``acme.jose`` module was moved to its own package "josepy_".
Please refer to its documentation there.
.. toctree::
:glob:
jose/*
.. _josepy: https://josepy.readthedocs.io/


@@ -1,5 +0,0 @@
JOSE Base64
-----------
.. automodule:: acme.jose.b64
:members:


@@ -1,5 +0,0 @@
Errors
------
.. automodule:: acme.jose.errors
:members:


@@ -1,5 +0,0 @@
Interfaces
----------
.. automodule:: acme.jose.interfaces
:members:


@@ -1,5 +0,0 @@
JSON utilities
--------------
.. automodule:: acme.jose.json_util
:members:


@@ -1,5 +0,0 @@
JSON Web Algorithms
-------------------
.. automodule:: acme.jose.jwa
:members:


@@ -1,5 +0,0 @@
JSON Web Key
------------
.. automodule:: acme.jose.jwk
:members:


@@ -1,5 +0,0 @@
JSON Web Signature
------------------
.. automodule:: acme.jose.jws
:members:


@@ -1,5 +0,0 @@
Utilities
---------
.. automodule:: acme.jose.util
:members:


@@ -1,5 +0,0 @@
Other ACME objects
------------------
.. automodule:: acme.other
:members:


@@ -308,4 +308,5 @@ texinfo_documents = [
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'josepy': ('https://josepy.readthedocs.io/en/latest/', None),
}


@@ -5,11 +5,11 @@ import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import client
from acme import messages
from acme import jose
logging.basicConfig(level=logging.DEBUG)


@@ -4,34 +4,28 @@ from setuptools import setup
from setuptools import find_packages
version = '0.18.0'
version = '0.24.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [
# load_pem_private/public_key (>=0.6)
# rsa_recover_prime_factors (>=0.8)
'cryptography>=0.8',
# formerly known as acme.jose:
'josepy>=1.0.0',
# Connection.set_tlsext_host_name (>=0.13)
'mock',
'PyOpenSSL>=0.13',
'pyrfc3339',
'pytz',
'requests[security]>=2.4.1', # security extras added in 2.4.1
# For pkg_resources. >=1.0 so pip resolves it to a version cryptography
# will tolerate; see #2599:
'setuptools>=1.0',
'six',
'setuptools',
'six>=1.9.0', # needed for python_2_unicode_compatible
]
# env markers cause problems with older pip and setuptools
if sys.version_info < (2, 7):
install_requires.extend([
'argparse',
'ordereddict',
])
dev_extras = [
'nose',
'pytest',
'pytest-xdist',
'tox',
]
@@ -49,16 +43,15 @@ setup(
author="Certbot Project",
author_email='client-dev@letsencrypt.org',
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 2',
'Programming Language :: Python :: 2.6',
'Programming Language :: Python :: 2.7',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.3',
'Programming Language :: Python :: 3.4',
'Programming Language :: Python :: 3.5',
'Programming Language :: Python :: 3.6',
@@ -73,10 +66,5 @@ setup(
'dev': dev_extras,
'docs': docs_extras,
},
entry_points={
'console_scripts': [
'jws = acme.jose.jws:CLI.run',
],
},
test_suite='acme',
)
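The dependency change above is the packaging side of the acme.jose removal seen earlier in this compare: the vendored module is replaced by the standalone josepy distribution. Downstream code migrates roughly like this (sketch; the key path is a placeholder):

# Old (removed in this compare):
#     from acme import jose
# New:
import josepy as jose

with open('rsa512_key.pem', 'rb') as pem:
    key = jose.JWK.load(pem.read())      # same API, now provided by josepy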


@@ -0,0 +1,100 @@
""" Utility functions for certbot-apache plugin """
import os
from certbot import util
def get_mod_deps(mod_name):
"""Get known module dependencies.
.. note:: This does not need to be accurate in order for the client to
run. This simply keeps things clean if the user decides to revert
changes.
.. warning:: If all deps are not included, it may cause incorrect parsing
behavior, due to enable_mod's shortcut for updating the parser's
currently defined modules (`.ApacheParser.add_mod`)
This would only present a major problem in extremely atypical
configs that use ifmod for the missing deps.
"""
deps = {
"ssl": ["setenvif", "mime"]
}
return deps.get(mod_name, [])
def get_file_path(vhost_path):
"""Get file path from augeas_vhost_path.
Takes in Augeas path and returns the file name
:param str vhost_path: Augeas virtual host path
:returns: filename of vhost
:rtype: str
"""
if not vhost_path or not vhost_path.startswith("/files/"):
return None
return _split_aug_path(vhost_path)[0]
def get_internal_aug_path(vhost_path):
"""Get the Augeas path for a vhost with the file path removed.
:param str vhost_path: Augeas virtual host path
:returns: Augeas path to vhost relative to the containing file
:rtype: str
"""
return _split_aug_path(vhost_path)[1]
def _split_aug_path(vhost_path):
"""Splits an Augeas path into a file path and an internal path.
After removing "/files", this function splits vhost_path into the
file path and the remaining Augeas path.
:param str vhost_path: Augeas virtual host path
:returns: file path and internal Augeas path
:rtype: `tuple` of `str`
"""
# Strip off /files
file_path = vhost_path[6:]
internal_path = []
# Remove components from the end of file_path until it becomes valid
while not os.path.exists(file_path):
file_path, _, internal_path_part = file_path.rpartition("/")
internal_path.append(internal_path_part)
return file_path, "/".join(reversed(internal_path))
def parse_define_file(filepath, varname):
""" Parses Defines from a variable in configuration file
:param str filepath: Path of file to parse
:param str varname: Name of the variable
:returns: Dict of Define:Value pairs
:rtype: `dict`
"""
return_vars = {}
# Get list of words in the variable
a_opts = util.get_var_from_file(varname, filepath).split()
for i, v in enumerate(a_opts):
# Handle Define statements and make sure it has an argument
if v == "-D" and len(a_opts) >= i+2:
var_parts = a_opts[i+1].partition("=")
return_vars[var_parts[0]] = var_parts[2]
elif len(v) > 2 and v.startswith("-D"):
# Found var with no whitespace separator
var_parts = v[2:].partition("=")
return_vars[var_parts[0]] = var_parts[2]
return return_vars
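A standalone illustration of the "-D" token handling inside parse_define_file above, run over a hypothetical apache2ctl-style argument string instead of a parsed configuration file:

a_opts = "-D DUMP_RUN_CFG -DFOO=bar -DSSL".split()   # hypothetical input
return_vars = {}
for i, v in enumerate(a_opts):
    if v == "-D" and len(a_opts) >= i + 2:
        # "-D NAME[=value]" with a whitespace separator
        name, _, value = a_opts[i + 1].partition("=")
        return_vars[name] = value
    elif len(v) > 2 and v.startswith("-D"):
        # "-DNAME[=value]" with no separator
        name, _, value = v[2:].partition("=")
        return_vars[name] = value
print(return_vars)   # {'DUMP_RUN_CFG': '', 'FOO': 'bar', 'SSL': ''}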


@@ -76,26 +76,26 @@ class AugeasConfigurator(common.Installer):
self.aug.get(path + "/message")))
raise errors.PluginError(msg)
# TODO: Cleanup this function
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
def ensure_augeas_state(self):
"""Makes sure that all Augeas dom changes are written to files to avoid
loss of configuration directives when doing additional augeas parsing,
causing a possible augeas.load() resulting dom reset
"""
This function first checks for save errors, if none are found,
all configuration changes made will be saved. According to the
function parameters. If an exception is raised, a new checkpoint
was not created.
if self.unsaved_files():
self.save_notes += "(autosave)"
self.save()
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
def unsaved_files(self):
"""Lists files that have modified Augeas DOM but the changes have not
been written to the filesystem yet, used by `self.save()` and
ApacheConfigurator to check the file state.
:raises .errors.PluginError: If there was an error in Augeas, in
an attempt to save the configuration, or an error creating a
checkpoint
:returns: `set` of unsaved files
"""
save_state = self.aug.get("/augeas/save")
self.aug.set("/augeas/save", "noop")
@@ -111,21 +111,41 @@ class AugeasConfigurator(common.Installer):
raise errors.PluginError(
"Error saving files, check logs for more info.")
# Return the original save method
self.aug.set("/augeas/save", save_state)
# Retrieve list of modified files
# Note: Noop saves can cause the file to be listed twice, I used a
# set to remove this possibility. This is a known augeas 0.10 error.
save_paths = self.aug.match("/augeas/events/saved")
# If the augeas tree didn't change, no files were saved and a backup
# should not be created
save_files = set()
if save_paths:
for path in save_paths:
save_files.add(self.aug.get(path)[6:])
return save_files
def save(self, title=None, temporary=False):
"""Saves all changes to the configuration files.
This function first checks for save errors, if none are found,
all configuration changes made will be saved. According to the
function parameters. If an exception is raised, a new checkpoint
was not created.
:param str title: The title of the save. If a title is given, the
configuration will be saved as a new checkpoint and put in a
timestamped directory.
:param bool temporary: Indicates whether the changes made will
be quickly reversed in the future (ie. challenges)
"""
save_files = self.unsaved_files()
if save_files:
self.add_to_checkpoint(save_files,
self.save_notes, temporary=temporary)
self.aug.set("/augeas/save", save_state)
self.save_notes = ""
self.aug.save()
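The noop-save trick that unsaved_files() relies on can be seen in isolation with a bare python-augeas handle (a sketch, assuming the python-augeas bindings and a standard /etc/hosts lens; the pending edit is hypothetical):

import augeas

aug = augeas.Augeas()
aug.set("/files/etc/hosts/1/alias[last()+1]", "example.local")  # hypothetical pending edit
original_mode = aug.get("/augeas/save")
aug.set("/augeas/save", "noop")          # dry run: record what would be written
aug.save()
pending = set(aug.get(p)[len("/files"):]        # strip the "/files" prefix, as above
              for p in aug.match("/augeas/events/saved"))
print(pending)                           # e.g. {'/etc/hosts'}
aug.set("/augeas/save", original_mode)   # restore the real save mode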


@@ -8,7 +8,7 @@ SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLOptions +StrictRequire

File diff suppressed because it is too large.


@@ -1,151 +1,6 @@
"""Apache plugin constants."""
import pkg_resources
from certbot import util
CLI_DEFAULTS_DEFAULT = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/sites-available",
vhost_files="*",
logs_root="/var/log/apache2",
version_cmd=['apache2ctl', '-v'],
define_cmd=['apache2ctl', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS_DEBIAN = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/sites-available",
vhost_files="*",
logs_root="/var/log/apache2",
version_cmd=['apache2ctl', '-v'],
define_cmd=['apache2ctl', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod="a2enmod",
dismod="a2dismod",
le_vhost_ext="-le-ssl.conf",
handle_mods=True,
handle_sites=True,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS_CENTOS = dict(
server_root="/etc/httpd",
vhost_root="/etc/httpd/conf.d",
vhost_files="*.conf",
logs_root="/var/log/httpd",
version_cmd=['apachectl', '-v'],
define_cmd=['apachectl', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "centos-options-ssl-apache.conf")
)
CLI_DEFAULTS_GENTOO = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/vhosts.d",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['/usr/sbin/apache2', '-v'],
define_cmd=['apache2ctl', 'virtualhosts'],
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS_DARWIN = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/other",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['/usr/sbin/httpd', '-v'],
define_cmd=['/usr/sbin/httpd', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/other",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS_SUSE = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/vhosts.d",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['apache2ctl', '-v'],
define_cmd=['apache2ctl', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod="a2enmod",
dismod="a2dismod",
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS_ARCH = dict(
server_root="/etc/httpd",
vhost_root="/etc/httpd/conf",
vhost_files="*.conf",
logs_root="/var/log/httpd",
version_cmd=['apachectl', '-v'],
define_cmd=['apachectl', '-t', '-D', 'DUMP_RUN_CFG'],
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/httpd/conf",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
CLI_DEFAULTS = {
"default": CLI_DEFAULTS_DEFAULT,
"debian": CLI_DEFAULTS_DEBIAN,
"ubuntu": CLI_DEFAULTS_DEBIAN,
"centos": CLI_DEFAULTS_CENTOS,
"centos linux": CLI_DEFAULTS_CENTOS,
"fedora": CLI_DEFAULTS_CENTOS,
"red hat enterprise linux server": CLI_DEFAULTS_CENTOS,
"rhel": CLI_DEFAULTS_CENTOS,
"amazon": CLI_DEFAULTS_CENTOS,
"gentoo": CLI_DEFAULTS_GENTOO,
"gentoo base system": CLI_DEFAULTS_GENTOO,
"darwin": CLI_DEFAULTS_DARWIN,
"opensuse": CLI_DEFAULTS_SUSE,
"suse": CLI_DEFAULTS_SUSE,
"arch": CLI_DEFAULTS_ARCH,
}
"""CLI defaults."""
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""
@@ -161,6 +16,8 @@ ALL_SSL_OPTIONS_HASHES = [
'4066b90268c03c9ba0201068eaa39abbc02acf9558bb45a788b630eb85dadf27',
'f175e2e7c673bd88d0aff8220735f385f916142c44aa83b09f1df88dd4767a88',
'cfdd7c18d2025836ea3307399f509cfb1ebf2612c87dd600a65da2a8e2f2797b',
'80720bd171ccdc2e6b917ded340defae66919e4624962396b992b7218a561791',
'c0c022ea6b8a51ecc8f1003d0a04af6c3f2bc1c3ce506b3c2dfc1f11ef931082',
]
"""SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""
@@ -191,39 +48,3 @@ UIR_ARGS = ["always", "set", "Content-Security-Policy",
HEADER_ARGS = {"Strict-Transport-Security": HSTS_ARGS,
"Upgrade-Insecure-Requests": UIR_ARGS}
def os_constant(key):
"""
Get a constant value for operating system
:param key: name of cli constant
:return: value of constant for active os
"""
os_info = util.get_os_info()
try:
constants = CLI_DEFAULTS[os_info[0].lower()]
except KeyError:
constants = os_like_constants()
if not constants:
constants = CLI_DEFAULTS["default"]
return constants[key]
def os_like_constants():
"""
Try to get constants for distribution with
similar layout and configuration, indicated by
/etc/os-release variable "LIKE"
:returns: Constants dictionary
:rtype: `dict`
"""
os_like = util.get_systemd_os_like()
if os_like:
for os_name in os_like:
if os_name in CLI_DEFAULTS.keys():
return CLI_DEFAULTS[os_name]
return {}
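For reference, the removed lookup above resolved a constant by exact distro name first, then by the /etc/os-release ID_LIKE entries, and finally fell back to the "default" table. A minimal self-contained sketch of that order (the table and distro names below are illustrative, not the full CLI_DEFAULTS above):

# Sketch of the removed constants lookup; distro names and values are illustrative.
EXAMPLE_DEFAULTS = {
    "default": {"server_root": "/etc/apache2"},
    "debian": {"server_root": "/etc/apache2"},
    "centos": {"server_root": "/etc/httpd"},
}

def lookup_constant(key, os_name, os_like=()):
    """Exact distro match first, then ID_LIKE fallback, then defaults."""
    constants = EXAMPLE_DEFAULTS.get(os_name.lower())
    if constants is None:
        for like in os_like:
            if like in EXAMPLE_DEFAULTS:
                constants = EXAMPLE_DEFAULTS[like]
                break
    if constants is None:
        constants = EXAMPLE_DEFAULTS["default"]
    return constants[key]

# e.g. a hypothetical distro whose os-release has ID_LIKE="rhel centos fedora"
print(lookup_constant("server_root", "rocky", os_like=["rhel", "centos", "fedora"]))  # /etc/httpd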


@@ -13,10 +13,44 @@ import certbot.display.util as display_util
logger = logging.getLogger(__name__)
def select_vhost_multiple(vhosts):
"""Select multiple Vhosts to install the certificate for
:param vhosts: Available Apache VirtualHosts
:type vhosts: :class:`list` of type `~obj.Vhost`
:returns: List of VirtualHosts
:rtype: :class:`list` of type `~obj.Vhost`
"""
if not vhosts:
return list()
tags_list = [vhost.display_repr()+"\n" for vhost in vhosts]
# Remove the extra newline from the last entry
if len(tags_list):
tags_list[-1] = tags_list[-1][:-1]
code, names = zope.component.getUtility(interfaces.IDisplay).checklist(
"Which VirtualHosts would you like to install the wildcard certificate for?",
tags=tags_list, force_interactive=True)
if code == display_util.OK:
return_vhosts = _reversemap_vhosts(names, vhosts)
return return_vhosts
return []
def _reversemap_vhosts(names, vhosts):
"""Helper function for select_vhost_multiple for mapping string
representations back to actual vhost objects"""
return_vhosts = list()
for selection in names:
for vhost in vhosts:
if vhost.display_repr().strip() == selection.strip():
return_vhosts.append(vhost)
return return_vhosts
def select_vhost(domain, vhosts):
"""Select an appropriate Apache Vhost.
:param vhosts: Available Apache Virtual Hosts
:param vhosts: Available Apache VirtualHosts
:type vhosts: :class:`list` of type `~obj.Vhost`
:returns: VirtualHost or `None`
@@ -25,13 +59,11 @@ def select_vhost(domain, vhosts):
"""
if not vhosts:
return None
while True:
code, tag = _vhost_menu(domain, vhosts)
if code == display_util.OK:
return vhosts[tag]
else:
return None
code, tag = _vhost_menu(domain, vhosts)
if code == display_util.OK:
return vhosts[tag]
else:
return None
def _vhost_menu(domain, vhosts):
"""Select an appropriate Apache Vhost.
@@ -86,10 +118,11 @@ def _vhost_menu(domain, vhosts):
choices, force_interactive=True)
except errors.MissingCommandlineFlag:
msg = (
"Encountered vhost ambiguity but unable to ask for user "
"Encountered vhost ambiguity when trying to find a vhost for "
"{0} but was unable to ask for user "
"guidance in non-interactive mode. Certbot may need "
"vhosts to be explicitly labelled with ServerName or "
"ServerAlias directives.")
"ServerAlias directives.".format(domain))
logger.warning(msg)
raise errors.MissingCommandlineFlag(msg)


@@ -0,0 +1,48 @@
""" Entry point for Apache Plugin """
from certbot import util
from certbot_apache import configurator
from certbot_apache import override_arch
from certbot_apache import override_darwin
from certbot_apache import override_debian
from certbot_apache import override_centos
from certbot_apache import override_gentoo
from certbot_apache import override_suse
OVERRIDE_CLASSES = {
"arch": override_arch.ArchConfigurator,
"darwin": override_darwin.DarwinConfigurator,
"debian": override_debian.DebianConfigurator,
"ubuntu": override_debian.DebianConfigurator,
"centos": override_centos.CentOSConfigurator,
"centos linux": override_centos.CentOSConfigurator,
"fedora": override_centos.CentOSConfigurator,
"ol": override_centos.CentOSConfigurator,
"red hat enterprise linux server": override_centos.CentOSConfigurator,
"rhel": override_centos.CentOSConfigurator,
"amazon": override_centos.CentOSConfigurator,
"gentoo": override_gentoo.GentooConfigurator,
"gentoo base system": override_gentoo.GentooConfigurator,
"opensuse": override_suse.OpenSUSEConfigurator,
"suse": override_suse.OpenSUSEConfigurator,
}
def get_configurator():
""" Get correct configurator class based on the OS fingerprint """
os_info = util.get_os_info()
override_class = None
try:
override_class = OVERRIDE_CLASSES[os_info[0].lower()]
except KeyError:
# OS not found in the list
os_like = util.get_systemd_os_like()
if os_like:
for os_name in os_like:
if os_name in OVERRIDE_CLASSES.keys():
override_class = OVERRIDE_CLASSES[os_name]
if not override_class:
# No override class found, return the generic configurator
override_class = configurator.ApacheConfigurator
return override_class
ENTRYPOINT = get_configurator()
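A minimal sketch of exercising this fallback (assuming certbot and this plugin are importable; it mirrors the entrypoint tests further down):

try:
    from unittest import mock
except ImportError:  # Python 2, as used elsewhere in this tree
    import mock

from certbot_apache import configurator, entrypoint

with mock.patch("certbot.util.get_os_info") as os_info, \
        mock.patch("certbot.util.get_systemd_os_like") as os_like:
    os_info.return_value = ("some-new-distro", "1.0")   # not in OVERRIDE_CLASSES
    os_like.return_value = ["suse"]                     # /etc/os-release ID_LIKE entry
    assert entrypoint.get_configurator() is entrypoint.OVERRIDE_CLASSES["suse"]
    os_like.return_value = ["completely-unknown"]
    assert entrypoint.get_configurator() is configurator.ApacheConfigurator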


@@ -0,0 +1,174 @@
"""A class that performs HTTP-01 challenges for Apache"""
import logging
import os
from certbot import errors
from certbot.plugins import common
logger = logging.getLogger(__name__)
class ApacheHttp01(common.TLSSNI01):
"""Class that performs HTTP-01 challenges within the Apache configurator."""
CONFIG_TEMPLATE22_PRE = """\
RewriteEngine on
RewriteRule ^/\\.well-known/acme-challenge/([A-Za-z0-9-_=]+)$ {0}/$1 [L]
"""
CONFIG_TEMPLATE22_POST = """\
<Directory {0}>
Order Allow,Deny
Allow from all
</Directory>
<Location /.well-known/acme-challenge>
Order Allow,Deny
Allow from all
</Location>
"""
CONFIG_TEMPLATE24_PRE = """\
RewriteEngine on
RewriteRule ^/\\.well-known/acme-challenge/([A-Za-z0-9-_=]+)$ {0}/$1 [END]
"""
CONFIG_TEMPLATE24_POST = """\
<Directory {0}>
Require all granted
</Directory>
<Location /.well-known/acme-challenge>
Require all granted
</Location>
"""
def __init__(self, *args, **kwargs):
super(ApacheHttp01, self).__init__(*args, **kwargs)
self.challenge_conf_pre = os.path.join(
self.configurator.conf("challenge-location"),
"le_http_01_challenge_pre.conf")
self.challenge_conf_post = os.path.join(
self.configurator.conf("challenge-location"),
"le_http_01_challenge_post.conf")
self.challenge_dir = os.path.join(
self.configurator.config.work_dir,
"http_challenges")
self.moded_vhosts = set()
def perform(self):
"""Perform all HTTP-01 challenges."""
if not self.achalls:
return []
# Save any changes to the configuration as a precaution
# About to make temporary changes to the config
self.configurator.save("Changes before challenge setup", True)
self.configurator.ensure_listen(str(
self.configurator.config.http01_port))
self.prepare_http01_modules()
responses = self._set_up_challenges()
self._mod_config()
# Save reversible changes
self.configurator.save("HTTP Challenge", True)
return responses
def prepare_http01_modules(self):
"""Make sure that we have the needed modules available for http01"""
if self.configurator.conf("handle-modules"):
needed_modules = ["rewrite"]
if self.configurator.version < (2, 4):
needed_modules.append("authz_host")
else:
needed_modules.append("authz_core")
for mod in needed_modules:
if mod + "_module" not in self.configurator.parser.modules:
self.configurator.enable_mod(mod, temp=True)
def _mod_config(self):
for chall in self.achalls:
vh = self.configurator.find_best_http_vhost(
chall.domain, filter_defaults=False,
port=str(self.configurator.config.http01_port))
if vh:
self._set_up_include_directives(vh)
else:
for vh in self._relevant_vhosts():
self._set_up_include_directives(vh)
self.configurator.reverter.register_file_creation(
True, self.challenge_conf_pre)
self.configurator.reverter.register_file_creation(
True, self.challenge_conf_post)
if self.configurator.version < (2, 4):
config_template_pre = self.CONFIG_TEMPLATE22_PRE
config_template_post = self.CONFIG_TEMPLATE22_POST
else:
config_template_pre = self.CONFIG_TEMPLATE24_PRE
config_template_post = self.CONFIG_TEMPLATE24_POST
config_text_pre = config_template_pre.format(self.challenge_dir)
config_text_post = config_template_post.format(self.challenge_dir)
logger.debug("writing a pre config file with text:\n %s", config_text_pre)
with open(self.challenge_conf_pre, "w") as new_conf:
new_conf.write(config_text_pre)
logger.debug("writing a post config file with text:\n %s", config_text_post)
with open(self.challenge_conf_post, "w") as new_conf:
new_conf.write(config_text_post)
def _relevant_vhosts(self):
http01_port = str(self.configurator.config.http01_port)
relevant_vhosts = []
for vhost in self.configurator.vhosts:
if any(a.is_wildcard() or a.get_port() == http01_port for a in vhost.addrs):
if not vhost.ssl:
relevant_vhosts.append(vhost)
if not relevant_vhosts:
raise errors.PluginError(
"Unable to find a virtual host listening on port {0} which is"
" currently needed for Certbot to prove to the CA that you"
" control your domain. Please add a virtual host for port"
" {0}.".format(http01_port))
return relevant_vhosts
def _set_up_challenges(self):
if not os.path.isdir(self.challenge_dir):
os.makedirs(self.challenge_dir)
os.chmod(self.challenge_dir, 0o755)
responses = []
for achall in self.achalls:
responses.append(self._set_up_challenge(achall))
return responses
def _set_up_challenge(self, achall):
response, validation = achall.response_and_validation()
name = os.path.join(self.challenge_dir, achall.chall.encode("token"))
self.configurator.reverter.register_file_creation(True, name)
with open(name, 'wb') as f:
f.write(validation.encode())
os.chmod(name, 0o644)
return response
def _set_up_include_directives(self, vhost):
"""Includes override configuration to the beginning and to the end of
VirtualHost. Note that this include isn't added to Augeas search tree"""
if vhost not in self.moded_vhosts:
logger.debug(
"Adding a temporary challenge validation Include for name: %s " +
"in: %s", vhost.name, vhost.filep)
self.configurator.parser.add_dir_beginning(
vhost.path, "Include", self.challenge_conf_pre)
self.configurator.parser.add_dir(
vhost.path, "Include", self.challenge_conf_post)
self.moded_vhosts.add(vhost)
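To make the Apache >= 2.4 branch concrete, this is roughly what the generated pre/post include files contain for a hypothetical challenge directory (whitespace approximated):

# Render the Apache >= 2.4 templates above for a sample challenge directory.
challenge_dir = "/var/lib/letsencrypt/http_challenges"   # hypothetical work_dir path

pre = ("RewriteEngine on\n"
       "RewriteRule ^/\\.well-known/acme-challenge/([A-Za-z0-9-_=]+)$ {0}/$1 [END]\n").format(challenge_dir)
post = ("<Directory {0}>\n"
        "    Require all granted\n"
        "</Directory>\n"
        "<Location /.well-known/acme-challenge>\n"
        "    Require all granted\n"
        "</Location>\n").format(challenge_dir)
print(pre)
print(post)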


@@ -167,6 +167,19 @@ class VirtualHost(object): # pylint: disable=too-few-public-methods
active="Yes" if self.enabled else "No",
modmacro="Yes" if self.modmacro else "No"))
def display_repr(self):
"""Return a representation of VHost to be used in dialog"""
return (
"File: {filename}\n"
"Addresses: {addrs}\n"
"Names: {names}\n"
"HTTPS: {https}\n".format(
filename=self.filep,
addrs=", ".join(str(addr) for addr in self.addrs),
names=", ".join(self.get_names()),
https="Yes" if self.ssl else "No"))
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.filep == other.filep and self.path == other.path and
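For illustration, a checklist entry built from the display_repr() format above renders like this (values hypothetical):

# Hypothetical values, formatted exactly like display_repr() above.
entry = ("File: {filename}\n"
         "Addresses: {addrs}\n"
         "Names: {names}\n"
         "HTTPS: {https}\n").format(
             filename="/etc/apache2/sites-available/example.com.conf",
             addrs="*:80",
             names="example.com, www.example.com",
             https="No")
print(entry)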


@@ -8,7 +8,7 @@ SSLEngine on
# Intermediate configuration, tweak to your needs
SSLProtocol all -SSLv2 -SSLv3
SSLCipherSuite ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-DSS-AES128-GCM-SHA256:kEDH+AESGCM:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA384:ECDHE-RSA-AES256-SHA:ECDHE-ECDSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-DSS-AES128-SHA256:DHE-RSA-AES256-SHA256:DHE-DSS-AES256-SHA:DHE-RSA-AES256-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:AES:CAMELLIA:DES-CBC3-SHA:!aNULL:!eNULL:!EXPORT:!DES:!RC4:!MD5:!PSK:!aECDH:!EDH-DSS-DES-CBC3-SHA:!EDH-RSA-DES-CBC3-SHA:!KRB5-DES-CBC3-SHA
SSLCipherSuite ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-AES128-SHA256:ECDHE-RSA-AES128-SHA256:ECDHE-ECDSA-AES128-SHA:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-SHA:ECDHE-ECDSA-AES256-SHA384:ECDHE-ECDSA-AES256-SHA:ECDHE-RSA-AES256-SHA:DHE-RSA-AES128-SHA256:DHE-RSA-AES128-SHA:DHE-RSA-AES256-SHA256:DHE-RSA-AES256-SHA:ECDHE-ECDSA-DES-CBC3-SHA:ECDHE-RSA-DES-CBC3-SHA:EDH-RSA-DES-CBC3-SHA:AES128-GCM-SHA256:AES256-GCM-SHA384:AES128-SHA256:AES256-SHA256:AES128-SHA:AES256-SHA:DES-CBC3-SHA:!DSS
SSLHonorCipherOrder on
SSLCompression off


@@ -0,0 +1,31 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class ArchConfigurator(configurator.ApacheConfigurator):
"""Arch Linux specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/httpd",
vhost_root="/etc/httpd/conf",
vhost_files="*.conf",
logs_root="/var/log/httpd",
version_cmd=['apachectl', '-v'],
apache_cmd="apachectl",
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/httpd/conf",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)


@@ -0,0 +1,60 @@
""" Distribution specific override class for CentOS family (RHEL, Fedora) """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser
@zope.interface.provider(interfaces.IPluginFactory)
class CentOSConfigurator(configurator.ApacheConfigurator):
"""CentOS specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/httpd",
vhost_root="/etc/httpd/conf.d",
vhost_files="*.conf",
logs_root="/var/log/httpd",
version_cmd=['apachectl', '-v'],
apache_cmd="apachectl",
restart_cmd=['apachectl', 'graceful'],
restart_cmd_alt=['apachectl', 'restart'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/httpd/conf.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "centos-options-ssl-apache.conf")
)
def get_parser(self):
"""Initializes the ApacheParser"""
return CentOSParser(
self.aug, self.conf("server-root"), self.conf("vhost-root"),
self.version, configurator=self)
class CentOSParser(parser.ApacheParser):
"""CentOS specific ApacheParser override class"""
def __init__(self, *args, **kwargs):
# CentOS specific configuration file for Apache
self.sysconfig_filep = "/etc/sysconfig/httpd"
super(CentOSParser, self).__init__(*args, **kwargs)
def update_runtime_variables(self, *args, **kwargs):
""" Override for update_runtime_variables for custom parsing """
# Opportunistic, works if SELinux not enforced
super(CentOSParser, self).update_runtime_variables(*args, **kwargs)
self.parse_sysconfig_var()
def parse_sysconfig_var(self):
""" Parses Apache CLI options from CentOS configuration file """
defines = apache_util.parse_define_file(self.sysconfig_filep, "OPTIONS")
for k in defines.keys():
self.variables[k] = defines[k]
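The actual parsing is done by apache_util.parse_define_file, which is not shown in this diff; as a rough self-contained stand-in, pulling -D defines out of a sysconfig OPTIONS line looks like this (toy parser, hypothetical file content):

import re

def toy_parse_defines(line):
    """Toy stand-in: pull -DNAME and -DNAME=value pairs out of an OPTIONS line."""
    defines = {}
    for name, _, value in re.findall(r'-D\s*([^\s="]+)(=([^\s"]+))?', line):
        defines[name] = value
    return defines

# Hypothetical /etc/sysconfig/httpd content:
print(toy_parse_defines('OPTIONS="-DDOWNLOAD -DSSL -DLANG=en_US"'))
# {'DOWNLOAD': '', 'SSL': '', 'LANG': 'en_US'}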


@@ -0,0 +1,31 @@
""" Distribution specific override class for macOS """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class DarwinConfigurator(configurator.ApacheConfigurator):
"""macOS specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/other",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['/usr/sbin/httpd', '-v'],
apache_cmd="/usr/sbin/httpd",
restart_cmd=['apachectl', 'graceful'],
conftest_cmd=['apachectl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/other",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)


@@ -0,0 +1,144 @@
""" Distribution specific override class for Debian family (Ubuntu/Debian) """
import logging
import os
import pkg_resources
import zope.interface
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot_apache import apache_util
from certbot_apache import configurator
logger = logging.getLogger(__name__)
@zope.interface.provider(interfaces.IPluginFactory)
class DebianConfigurator(configurator.ApacheConfigurator):
"""Debian specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/sites-available",
vhost_files="*",
logs_root="/var/log/apache2",
version_cmd=['apache2ctl', '-v'],
apache_cmd="apache2ctl",
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod="a2enmod",
dismod="a2dismod",
le_vhost_ext="-le-ssl.conf",
handle_mods=True,
handle_sites=True,
challenge_location="/etc/apache2",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
def enable_site(self, vhost):
"""Enables an available site, Apache reload required.
.. note:: Does not make sure that the site correctly works or that all
modules are enabled appropriately.
:param vhost: vhost to enable
:type vhost: :class:`~certbot_apache.obj.VirtualHost`
:raises .errors.NotSupportedError: If filesystem layout is not
supported.
"""
if vhost.enabled:
return
enabled_path = ("%s/sites-enabled/%s" %
(self.parser.root,
os.path.basename(vhost.filep)))
if not os.path.isdir(os.path.dirname(enabled_path)):
# For some reason, sites-enabled / sites-available do not exist
# Call the parent method
return super(DebianConfigurator, self).enable_site(vhost)
self.reverter.register_file_creation(False, enabled_path)
try:
os.symlink(vhost.filep, enabled_path)
except OSError as err:
if os.path.islink(enabled_path) and os.path.realpath(
enabled_path) == vhost.filep:
# Already in shape
vhost.enabled = True
return
else:
logger.warning(
"Could not symlink %s to %s, got error: %s", enabled_path,
vhost.filep, err.strerror)
errstring = ("Encountered error while trying to enable a " +
"newly created VirtualHost located at {0} by " +
"linking to it from {1}")
raise errors.NotSupportedError(errstring.format(vhost.filep,
enabled_path))
vhost.enabled = True
logger.info("Enabling available site: %s", vhost.filep)
self.save_notes += "Enabled site %s\n" % vhost.filep
def enable_mod(self, mod_name, temp=False):
# pylint: disable=unused-argument
"""Enables module in Apache.
Both enables and reloads Apache so module is active.
:param str mod_name: Name of the module to enable. (e.g. 'ssl')
:param bool temp: Whether or not this is a temporary action.
:raises .errors.NotSupportedError: If the filesystem layout is not
supported.
:raises .errors.MisconfigurationError: If a2enmod or a2dismod cannot be
run.
"""
avail_path = os.path.join(self.parser.root, "mods-available")
enabled_path = os.path.join(self.parser.root, "mods-enabled")
if not os.path.isdir(avail_path) or not os.path.isdir(enabled_path):
raise errors.NotSupportedError(
"Unsupported directory layout. You may try to enable mod %s "
"and try again." % mod_name)
deps = apache_util.get_mod_deps(mod_name)
# Enable all dependencies
for dep in deps:
if (dep + "_module") not in self.parser.modules:
self._enable_mod_debian(dep, temp)
self.parser.add_mod(dep)
note = "Enabled dependency of %s module - %s" % (mod_name, dep)
if not temp:
self.save_notes += note + os.linesep
logger.debug(note)
# Enable actual module
self._enable_mod_debian(mod_name, temp)
self.parser.add_mod(mod_name)
if not temp:
self.save_notes += "Enabled %s module in Apache\n" % mod_name
logger.info("Enabled Apache %s module", mod_name)
# Modules can enable additional config files. Variables may be defined
# within these new configuration sections.
# Reload is not necessary as DUMP_RUN_CFG uses latest config.
self.parser.update_runtime_variables()
def _enable_mod_debian(self, mod_name, temp):
"""Assumes mods-available, mods-enabled layout."""
# Generate reversal command.
# Try to be safe here... check that we can probably reverse before
# applying enmod command
if not util.exe_exists(self.conf("dismod")):
raise errors.MisconfigurationError(
"Unable to find a2dismod, please make sure a2enmod and "
"a2dismod are configured correctly for certbot.")
self.reverter.register_undo_command(
temp, [self.conf("dismod"), "-f", mod_name])
util.run_script([self.conf("enmod"), mod_name])
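In effect, every a2enmod call is registered together with the a2dismod command that undoes it. A minimal sketch of that pattern outside Certbot's reverter (Debian/Ubuntu tooling assumed; the calls are left commented out):

import subprocess

def enable_mod_with_undo(mod_name, undo_stack):
    """Record the reversing a2dismod command, then run a2enmod."""
    undo_stack.append(["a2dismod", "-f", mod_name])   # registered before the change
    subprocess.check_call(["a2enmod", mod_name])

undo_stack = []
# enable_mod_with_undo("ssl", undo_stack)      # needs root and apache2 on Debian/Ubuntu
# for cmd in reversed(undo_stack):
#     subprocess.check_call(cmd)               # roll back in reverse order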


@@ -0,0 +1,67 @@
""" Distribution specific override class for Gentoo Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot_apache import apache_util
from certbot_apache import configurator
from certbot_apache import parser
@zope.interface.provider(interfaces.IPluginFactory)
class GentooConfigurator(configurator.ApacheConfigurator):
"""Gentoo specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/vhosts.d",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['/usr/sbin/apache2', '-v'],
apache_cmd="apache2ctl",
restart_cmd=['apache2ctl', 'graceful'],
restart_cmd_alt=['apache2ctl', 'restart'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod=None,
dismod=None,
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)
def get_parser(self):
"""Initializes the ApacheParser"""
return GentooParser(
self.aug, self.conf("server-root"), self.conf("vhost-root"),
self.version, configurator=self)
class GentooParser(parser.ApacheParser):
"""Gentoo specific ApacheParser override class"""
def __init__(self, *args, **kwargs):
# Gentoo specific configuration file for Apache2
self.apacheconfig_filep = "/etc/conf.d/apache2"
super(GentooParser, self).__init__(*args, **kwargs)
def update_runtime_variables(self):
""" Override for update_runtime_variables for custom parsing """
self.parse_sysconfig_var()
self.update_modules()
def parse_sysconfig_var(self):
""" Parses Apache CLI options from Gentoo configuration file """
defines = apache_util.parse_define_file(self.apacheconfig_filep,
"APACHE2_OPTS")
for k in defines.keys():
self.variables[k] = defines[k]
def update_modules(self):
"""Get loaded modules from httpd process, and add them to DOM"""
mod_cmd = [self.configurator.constant("apache_cmd"), "modules"]
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
for mod in matches:
self.add_mod(mod.strip())
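For example, applying the r"(.*)_module" regexp to typical `apache2ctl modules` output (sample output hypothetical) yields the names that add_mod then strips and records:

import re

sample_stdout = """Loaded Modules:
 core_module (static)
 ssl_module (shared)
 rewrite_module (shared)
"""
print(re.findall(r"(.*)_module", sample_stdout))
# [' core', ' ssl', ' rewrite']  -> add_mod(mod.strip()) stores "core", "ssl", "rewrite"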


@@ -0,0 +1,31 @@
""" Distribution specific override class for OpenSUSE """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot_apache import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class OpenSUSEConfigurator(configurator.ApacheConfigurator):
"""OpenSUSE specific ApacheConfigurator override class"""
OS_DEFAULTS = dict(
server_root="/etc/apache2",
vhost_root="/etc/apache2/vhosts.d",
vhost_files="*.conf",
logs_root="/var/log/apache2",
version_cmd=['apache2ctl', '-v'],
apache_cmd="apache2ctl",
restart_cmd=['apache2ctl', 'graceful'],
conftest_cmd=['apache2ctl', 'configtest'],
enmod="a2enmod",
dismod="a2dismod",
le_vhost_ext="-le-ssl.conf",
handle_mods=False,
handle_sites=False,
challenge_location="/etc/apache2/vhosts.d",
MOD_SSL_CONF_SRC=pkg_resources.resource_filename(
"certbot_apache", "options-ssl-apache.conf")
)


@@ -1,4 +1,5 @@
"""ApacheParser is a member object of the ApacheConfigurator class."""
import copy
import fnmatch
import logging
import os
@@ -10,8 +11,6 @@ import six
from certbot import errors
from certbot_apache import constants
logger = logging.getLogger(__name__)
@@ -30,86 +29,130 @@ class ApacheParser(object):
arg_var_interpreter = re.compile(r"\$\{[^ \}]*}")
fnmatch_chars = set(["*", "?", "\\", "[", "]"])
def __init__(self, aug, root, vhostroot, version=(2, 4)):
def __init__(self, aug, root, vhostroot=None, version=(2, 4),
configurator=None):
# Note: Order is important here.
# This uses the binary, so it can be done first.
# https://httpd.apache.org/docs/2.4/mod/core.html#define
# https://httpd.apache.org/docs/2.4/mod/core.html#ifdefine
# This only handles invocation parameters and Define directives!
# Needed for calling save() with reverter functionality that resides in
# AugeasConfigurator superclass of ApacheConfigurator. This resolves
# issues with aug.load() after adding new files / defines to parse tree
self.configurator = configurator
self.modules = set()
self.parser_paths = {}
self.variables = {}
if version >= (2, 4):
self.update_runtime_variables()
self.aug = aug
# Find configuration root and make sure augeas can parse it.
self.root = os.path.abspath(root)
self.loc = {"root": self._find_config_root()}
self._parse_file(self.loc["root"])
self.parse_file(self.loc["root"])
self.vhostroot = os.path.abspath(vhostroot)
if version >= (2, 4):
# Look up variables from httpd and add to DOM if not already parsed
self.update_runtime_variables()
# This problem has been fixed in Augeas 1.0
self.standardize_excl()
# Temporarily set modules to be empty, so that find_dirs can work
# https://httpd.apache.org/docs/2.4/mod/core.html#ifmodule
# This needs to come before locations are set.
self.modules = set()
self.init_modules()
# Parse LoadModule directives from configuration files
self.parse_modules()
# Set up rest of locations
self.loc.update(self._set_locations())
# Must also attempt to parse virtual host root
self._parse_file(self.vhostroot + "/" +
constants.os_constant("vhost_files"))
# list of the active include paths, before modifications
self.existing_paths = copy.deepcopy(self.parser_paths)
# Must also attempt to parse additional virtual host root
if vhostroot:
self.parse_file(os.path.abspath(vhostroot) + "/" +
self.configurator.constant("vhost_files"))
# check to see if there were unparsed define statements
if version < (2, 4):
if self.find_dir("Define", exclude=False):
raise errors.PluginError("Error parsing runtime variables")
def init_modules(self):
def add_include(self, main_config, inc_path):
"""Add Include for a new configuration file if one does not exist
:param str main_config: file path to main Apache config file
:param str inc_path: path of file to include
"""
if len(self.find_dir(case_i("Include"), inc_path)) == 0:
logger.debug("Adding Include %s to %s",
inc_path, get_aug_path(main_config))
self.add_dir(
get_aug_path(main_config),
"Include", inc_path)
# Add new path to parser paths
new_dir = os.path.dirname(inc_path)
new_file = os.path.basename(inc_path)
if new_dir in self.existing_paths.keys():
# Add to existing path
self.existing_paths[new_dir].append(new_file)
else:
# Create a new path
self.existing_paths[new_dir] = [new_file]
def add_mod(self, mod_name):
"""Shortcut for updating parser modules."""
if mod_name + "_module" not in self.modules:
self.modules.add(mod_name + "_module")
if "mod_" + mod_name + ".c" not in self.modules:
self.modules.add("mod_" + mod_name + ".c")
def reset_modules(self):
"""Reset the loaded modules list. This is called from cleanup to clear
temporarily loaded modules."""
self.modules = set()
self.update_modules()
self.parse_modules()
def parse_modules(self):
"""Iterates on the configuration until no new modules are loaded.
..todo:: This should be attempted to be done with a binary to avoid
the iteration issue. Else... parse and enable mods at same time.
"""
# Since modules are being initiated... clear existing set.
self.modules = set()
mods = set()
matches = self.find_dir("LoadModule")
iterator = iter(matches)
# Make sure prev_size != cur_size for do: while: iteration
prev_size = -1
while len(self.modules) != prev_size:
prev_size = len(self.modules)
while len(mods) != prev_size:
prev_size = len(mods)
for match_name, match_filename in six.moves.zip(
iterator, iterator):
self.modules.add(self.get_arg(match_name))
self.modules.add(
os.path.basename(self.get_arg(match_filename))[:-2] + "c")
mod_name = self.get_arg(match_name)
mod_filename = self.get_arg(match_filename)
if mod_name and mod_filename:
mods.add(mod_name)
mods.add(os.path.basename(mod_filename)[:-2] + "c")
else:
logger.debug("Could not read LoadModule directive from " +
"Augeas path: {0}".format(match_name[6:]))
self.modules.update(mods)
def update_runtime_variables(self):
""""
"""Update Includes, Defines and Includes from httpd config dump data"""
self.update_defines()
self.update_includes()
self.update_modules()
.. note:: Compile time variables (apache2ctl -V) are not used within
the dynamic configuration files. These should not be parsed or
interpreted.
.. todo:: Create separate compile time variables...
simply for arg_get()
"""
stdout = self._get_runtime_cfg()
def update_defines(self):
"""Get Defines from httpd process"""
variables = dict()
matches = re.compile(r"Define: ([^ \n]*)").findall(stdout)
define_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
"DUMP_RUN_CFG"]
matches = self.parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
except ValueError:
@@ -126,15 +169,54 @@ class ApacheParser(object):
self.variables = variables
def _get_runtime_cfg(self): # pylint: disable=no-self-use
"""Get runtime configuration info.
def update_includes(self):
"""Get includes from httpd process, and add them to DOM if needed"""
:returns: stdout from DUMP_RUN_CFG
# Find_dir iterates over configuration for Include and IncludeOptional
# directives to make sure we see the full include tree present in the
# configuration files
_ = self.find_dir("Include")
inc_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
"DUMP_INCLUDES"]
matches = self.parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
if matches:
for i in matches:
if not self.parsed_in_current(i):
self.parse_file(i)
def update_modules(self):
"""Get loaded modules from httpd process, and add them to DOM"""
mod_cmd = [self.configurator.constant("apache_cmd"), "-t", "-D",
"DUMP_MODULES"]
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
for mod in matches:
self.add_mod(mod.strip())
def parse_from_subprocess(self, command, regexp):
"""Get values from stdout of subprocess command
:param list command: Command to run
:param str regexp: Regexp for parsing
:returns: list parsed from command output
:rtype: list
"""
stdout = self._get_runtime_cfg(command)
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(self, command): # pylint: disable=no-self-use
"""Get runtime configuration info.
:param command: Command to run
:returns: stdout from command
"""
try:
proc = subprocess.Popen(
constants.os_constant("define_cmd"),
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True)
@@ -143,10 +225,10 @@ class ApacheParser(object):
except (OSError, ValueError):
logger.error(
"Error running command %s for runtime parameters!%s",
constants.os_constant("define_cmd"), os.linesep)
command, os.linesep)
raise errors.MisconfigurationError(
"Error accessing loaded Apache parameters: %s",
constants.os_constant("define_cmd"))
command)
# Small errors that do not impede
if proc.returncode != 0:
logger.warning("Error in checking parameter list: %s", stderr)
@@ -250,6 +332,23 @@ class ApacheParser(object):
else:
self.aug.set(aug_conf_path + "/directive[last()]/arg", args)
def add_dir_beginning(self, aug_conf_path, dirname, args):
"""Adds the directive to the beginning of defined aug_conf_path.
:param str aug_conf_path: Augeas configuration path to add directive
:param str dirname: Directive to add
:param args: Value of the directive. ie. Listen 443, 443 is arg
:type args: list or str
"""
first_dir = aug_conf_path + "/directive[1]"
self.aug.insert(first_dir, "directive", True)
self.aug.set(first_dir, dirname)
if isinstance(args, list):
for i, value in enumerate(args, 1):
self.aug.set(first_dir + "/arg[%d]" % (i), value)
else:
self.aug.set(first_dir + "/arg", args)
def find_dir(self, directive, arg=None, start=None, exclude=True):
"""Finds directive in the configuration.
@@ -339,7 +438,10 @@ class ApacheParser(object):
# Note: normal argument may be a quoted variable
# e.g. strip now, not later
value = value.strip("'\"")
if not value:
return None
else:
value = value.strip("'\"")
variables = ApacheParser.arg_var_interpreter.findall(value)
@@ -428,9 +530,9 @@ class ApacheParser(object):
# Attempts to add a transform to the file if one does not already exist
if os.path.isdir(arg):
self._parse_file(os.path.join(arg, "*"))
self.parse_file(os.path.join(arg, "*"))
else:
self._parse_file(arg)
self.parse_file(arg)
# Argument represents an fnmatch regular expression, convert it
# Split up the path and convert each into an Augeas accepted regex
@@ -470,7 +572,7 @@ class ApacheParser(object):
# Since Python 3.6, it returns a different pattern like (?s:.*\.load)\Z
return fnmatch.translate(clean_fn_match)[4:-3]
def _parse_file(self, filepath):
def parse_file(self, filepath):
"""Parse file with Augeas
Checks to see if file_path is parsed by Augeas
@@ -480,6 +582,10 @@ class ApacheParser(object):
"""
use_new, remove_old = self._check_path_actions(filepath)
# Ensure that we have the latest Augeas DOM state on disk before
# calling aug.load() which reloads the state from disk
if self.configurator:
self.configurator.ensure_augeas_state()
# Test if augeas included file for Httpd.lens
# Note: This works for augeas globs, ie. *.conf
if use_new:
@@ -494,6 +600,39 @@ class ApacheParser(object):
self._add_httpd_transform(filepath)
self.aug.load()
def parsed_in_current(self, filep):
"""Checks if the file path is parsed by current Augeas parser config
ie. returns True if the file is found on a path that's found in live
Augeas configuration.
:param str filep: Path to match
:returns: True if file is parsed in existing configuration tree
:rtype: bool
"""
return self._parsed_by_parser_paths(filep, self.parser_paths)
def parsed_in_original(self, filep):
"""Checks if the file path is parsed by existing Apache config.
ie. returns True if the file is found on a path that matches Include or
IncludeOptional statement in the Apache configuration.
:param str filep: Path to match
:returns: True if file is parsed in existing configuration tree
:rtype: bool
"""
return self._parsed_by_parser_paths(filep, self.existing_paths)
def _parsed_by_parser_paths(self, filep, paths):
"""Helper function that searches through provided paths and returns
True if file path is found in the set"""
for directory in paths.keys():
for filename in paths[directory]:
if fnmatch.fnmatch(filep, os.path.join(directory, filename)):
return True
return False
def _check_path_actions(self, filepath):
"""Determine actions to take with a new augeas path
@@ -622,7 +761,6 @@ class ApacheParser(object):
for name in location:
if os.path.isfile(os.path.join(self.root, name)):
return os.path.join(self.root, name)
raise errors.NoInstallationError("Could not find configuration root")
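fnmatch is what lets a concrete file path match an Augeas glob recorded in parser_paths (see _parsed_by_parser_paths above); a quick self-contained check of that matching rule:

import fnmatch
import os

parser_paths = {"/etc/apache2/sites-enabled": ["*.conf"],
                "/etc/apache2": ["apache2.conf"]}

def parsed_by(filep, paths):
    """Return True if filep matches any directory/pattern pair in paths."""
    return any(fnmatch.fnmatch(filep, os.path.join(directory, fname))
               for directory, fnames in paths.items()
               for fname in fnames)

print(parsed_by("/etc/apache2/sites-enabled/example.conf", parser_paths))  # True
print(parsed_by("/etc/apache2/conf-enabled/other.conf", parser_paths))     # False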


@@ -26,6 +26,7 @@ function Setup() {
ErrorLog /tmp/error.log
CustomLog /tmp/requests.log combined
</VirtualHost>" | sudo tee $EA/sites-available/throwaway-example.conf >/dev/null
sudo ln -sf $EA/sites-available/throwaway-example.conf $EA/sites-enabled/throwaway-example.conf
else
TMP="/tmp/`basename \"$APPEND_APACHECONF\"`.$$"
sudo cp -a "$APPEND_APACHECONF" "$TMP"
@@ -37,6 +38,7 @@ function Cleanup() {
if [ "$APPEND_APACHECONF" = "" ] ; then
sudo rm /etc/apache2/sites-{enabled,available}/"$f"
sudo rm $EA/sites-available/throwaway-example.conf
sudo rm $EA/sites-enabled/throwaway-example.conf
else
sudo mv "$TMP" "$APPEND_APACHECONF"
fi


@@ -13,7 +13,6 @@ from certbot_apache.tests import util
class AugeasConfiguratorTest(util.ApacheTest):
"""Test for Augeas Configurator base class."""
_multiprocess_can_split_ = True
def setUp(self): # pylint: disable=arguments-differ
super(AugeasConfiguratorTest, self).setUp()
@@ -31,7 +30,7 @@ class AugeasConfiguratorTest(util.ApacheTest):
def test_bad_parse(self):
# pylint: disable=protected-access
self.config.parser._parse_file(os.path.join(
self.config.parser.parse_file(os.path.join(
self.config.parser.root, "conf-available", "bad_conf_file.conf"))
self.assertRaises(
errors.PluginError, self.config.check_parsing_errors, "httpd.aug")


@@ -0,0 +1,139 @@
"""Test for certbot_apache.configurator for Centos overrides"""
import os
import unittest
import mock
from certbot import errors
from certbot_apache import obj
from certbot_apache import override_centos
from certbot_apache.tests import util
def get_vh_truth(temp_dir, config_name):
"""Return the ground truth for the specified directory."""
prefix = os.path.join(
temp_dir, config_name, "httpd/conf.d")
aug_pre = "/files" + prefix
vh_truth = [
obj.VirtualHost(
os.path.join(prefix, "centos.example.com.conf"),
os.path.join(aug_pre, "centos.example.com.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
False, True, "centos.example.com"),
obj.VirtualHost(
os.path.join(prefix, "ssl.conf"),
os.path.join(aug_pre, "ssl.conf/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]),
True, True, None)
]
return vh_truth
class MultipleVhostsTestCentOS(util.ApacheTest):
"""Multiple vhost tests for CentOS / RHEL family of distros"""
_multiprocess_can_split_ = True
def setUp(self): # pylint: disable=arguments-differ
test_dir = "centos7_apache/apache"
config_root = "centos7_apache/apache/httpd"
vhost_root = "centos7_apache/apache/httpd/conf.d"
super(MultipleVhostsTestCentOS, self).setUp(test_dir=test_dir,
config_root=config_root,
vhost_root=vhost_root)
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
os_info="centos")
self.vh_truth = get_vh_truth(
self.temp_dir, "centos7_apache/apache")
def test_get_parser(self):
self.assertTrue(isinstance(self.config.parser,
override_centos.CentOSParser))
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
define_val = (
'Define: TEST1\n'
'Define: TEST2\n'
'Define: DUMP_RUN_CFG\n'
)
mod_val = (
'Loaded Modules:\n'
' mock_module (static)\n'
' another_module (static)\n'
)
def mock_get_cfg(command):
"""Mock httpd process stdout"""
if command == ['apachectl', '-t', '-D', 'DUMP_RUN_CFG']:
return define_val
elif command == ['apachectl', '-t', '-D', 'DUMP_MODULES']:
return mod_val
return ""
mock_get.side_effect = mock_get_cfg
self.config.parser.modules = set()
self.config.parser.variables = {}
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the CentOS httpd constants
mock_osi.return_value = ("centos", "7")
self.config.parser.update_runtime_variables()
self.assertEquals(mock_get.call_count, 3)
self.assertEquals(len(self.config.parser.modules), 4)
self.assertEquals(len(self.config.parser.variables), 2)
self.assertTrue("TEST2" in self.config.parser.variables.keys())
self.assertTrue("mod_another.c" in self.config.parser.modules)
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
vhs = self.config.get_virtual_hosts()
self.assertEqual(len(vhs), 2)
found = 0
for vhost in vhs:
for centos_truth in self.vh_truth:
if vhost == centos_truth:
found += 1
break
else:
raise Exception("Missed: %s" % vhost) # pragma: no cover
self.assertEqual(found, 2)
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_get_sysconfig_vars(self, mock_cfg):
"""Make sure we read the sysconfig OPTIONS variable correctly"""
# Return nothing for the process calls
mock_cfg.return_value = ""
self.config.parser.sysconfig_filep = os.path.realpath(
os.path.join(self.config.parser.root, "../sysconfig/httpd"))
self.config.parser.variables = {}
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the CentOS httpd constants
mock_osi.return_value = ("centos", "7")
self.config.parser.update_runtime_variables()
self.assertTrue("mock_define" in self.config.parser.variables.keys())
self.assertTrue("mock_define_too" in self.config.parser.variables.keys())
self.assertTrue("mock_value" in self.config.parser.variables.keys())
self.assertEqual("TRUE", self.config.parser.variables["mock_value"])
self.assertTrue("MOCK_NOSEP" in self.config.parser.variables.keys())
self.assertEqual("NOSEP_VAL", self.config.parser.variables["NOSEP_TWO"])
@mock.patch("certbot_apache.configurator.util.run_script")
def test_alt_restart_works(self, mock_run_script):
mock_run_script.side_effect = [None, errors.SubprocessError, None]
self.config.restart()
self.assertEquals(mock_run_script.call_count, 3)
@mock.patch("certbot_apache.configurator.util.run_script")
def test_alt_restart_errors(self, mock_run_script):
mock_run_script.side_effect = [None,
errors.SubprocessError,
errors.SubprocessError]
self.assertRaises(errors.MisconfigurationError, self.config.restart)
if __name__ == "__main__":
unittest.main() # pragma: no cover


@@ -18,7 +18,7 @@ class ComplexParserTest(util.ParserTest):
self.setup_variables()
# This needs to happen after due to setup_variables not being run
# until after
self.parser.init_modules() # pylint: disable=protected-access
self.parser.parse_modules() # pylint: disable=protected-access
def tearDown(self):
shutil.rmtree(self.temp_dir)

File diff suppressed because it is too large


@@ -1,44 +0,0 @@
"""Test for certbot_apache.configurator."""
import mock
import unittest
from certbot_apache import constants
class ConstantsTest(unittest.TestCase):
@mock.patch("certbot.util.get_os_info")
def test_get_debian_value(self, os_info):
os_info.return_value = ('Debian', '', '')
self.assertEqual(constants.os_constant("vhost_root"),
"/etc/apache2/sites-available")
@mock.patch("certbot.util.get_os_info")
def test_get_centos_value(self, os_info):
os_info.return_value = ('CentOS Linux', '', '')
self.assertEqual(constants.os_constant("vhost_root"),
"/etc/httpd/conf.d")
@mock.patch("certbot.util.get_systemd_os_like")
@mock.patch("certbot.util.get_os_info")
def test_get_default_values(self, os_info, os_like):
os_info.return_value = ('Nonexistent Linux', '', '')
os_like.return_value = {}
self.assertFalse(constants.os_constant("handle_mods"))
self.assertEqual(constants.os_constant("server_root"), "/etc/apache2")
self.assertEqual(constants.os_constant("vhost_root"),
"/etc/apache2/sites-available")
@mock.patch("certbot.util.get_systemd_os_like")
@mock.patch("certbot.util.get_os_info")
def test_get_darwin_like_values(self, os_info, os_like):
os_info.return_value = ('Nonexistent Linux', '', '')
os_like.return_value = ["something", "nonexistent", "darwin"]
self.assertFalse(constants.os_constant("enmod"))
self.assertEqual(constants.os_constant("vhost_root"),
"/etc/apache2/other")
if __name__ == "__main__":
unittest.main() # pragma: no cover


@@ -0,0 +1,213 @@
"""Test for certbot_apache.configurator for Debian overrides"""
import os
import shutil
import unittest
import mock
from certbot import errors
from certbot_apache import apache_util
from certbot_apache import obj
from certbot_apache.tests import util
class MultipleVhostsTestDebian(util.ApacheTest):
"""Multiple vhost tests for Debian family of distros"""
_multiprocess_can_split_ = True
def setUp(self): # pylint: disable=arguments-differ
super(MultipleVhostsTestDebian, self).setUp()
self.config = util.get_apache_configurator(
self.config_path, None, self.config_dir, self.work_dir,
os_info="debian")
self.config = self.mock_deploy_cert(self.config)
self.vh_truth = util.get_vh_truth(self.temp_dir,
"debian_apache_2_4/multiple_vhosts")
def mock_deploy_cert(self, config):
"""A test for a mock deploy cert"""
config.real_deploy_cert = self.config.deploy_cert
def mocked_deploy_cert(*args, **kwargs):
"""a helper to mock a deployed cert"""
g_mod = "certbot_apache.configurator.ApacheConfigurator.enable_mod"
d_mod = "certbot_apache.override_debian.DebianConfigurator.enable_mod"
with mock.patch(g_mod):
with mock.patch(d_mod):
config.real_deploy_cert(*args, **kwargs)
self.config.deploy_cert = mocked_deploy_cert
return self.config
def test_enable_mod_unsupported_dirs(self):
shutil.rmtree(os.path.join(self.config.parser.root, "mods-enabled"))
self.assertRaises(
errors.NotSupportedError, self.config.enable_mod, "ssl")
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@mock.patch("certbot_apache.parser.subprocess.Popen")
def test_enable_mod(self, mock_popen, mock_exe_exists, mock_run_script):
mock_popen().communicate.return_value = ("Define: DUMP_RUN_CFG", "")
mock_popen().returncode = 0
mock_exe_exists.return_value = True
self.config.enable_mod("ssl")
self.assertTrue("ssl_module" in self.config.parser.modules)
self.assertTrue("mod_ssl.c" in self.config.parser.modules)
self.assertTrue(mock_run_script.called)
def test_deploy_cert_enable_new_vhost(self):
# Create
ssl_vhost = self.config.make_vhost_ssl(self.vh_truth[0])
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertTrue(ssl_vhost.enabled)
# Make sure that we don't error out if symlink already exists
ssl_vhost.enabled = False
self.assertFalse(ssl_vhost.enabled)
self.config.deploy_cert(
"encryption-example.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.assertTrue(ssl_vhost.enabled)
def test_enable_site_failure(self):
self.config.parser.root = "/tmp/nonexistent"
with mock.patch("os.path.isdir") as mock_dir:
mock_dir.return_value = True
with mock.patch("os.path.islink") as mock_link:
mock_link.return_value = False
self.assertRaises(
errors.NotSupportedError,
self.config.enable_site,
obj.VirtualHost("asdf", "afsaf", set(), False, False))
def test_deploy_cert_newssl(self):
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 16))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
self.config.deploy_cert(
"random.demo", "example/cert.pem", "example/key.pem",
"example/cert_chain.pem", "example/fullchain.pem")
self.config.save()
# Verify ssl_module was enabled.
self.assertTrue(self.vh_truth[1].enabled)
self.assertTrue("ssl_module" in self.config.parser.modules)
loc_cert = self.config.parser.find_dir(
"sslcertificatefile", "example/fullchain.pem",
self.vh_truth[1].path)
loc_key = self.config.parser.find_dir(
"sslcertificateKeyfile", "example/key.pem", self.vh_truth[1].path)
# Verify one directive was found in the correct file
self.assertEqual(len(loc_cert), 1)
self.assertEqual(
apache_util.get_file_path(loc_cert[0]),
self.vh_truth[1].filep)
self.assertEqual(len(loc_key), 1)
self.assertEqual(
apache_util.get_file_path(loc_key[0]),
self.vh_truth[1].filep)
def test_deploy_cert_newssl_no_fullchain(self):
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 16))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
self.assertRaises(errors.PluginError,
lambda: self.config.deploy_cert(
"random.demo", "example/cert.pem",
"example/key.pem"))
def test_deploy_cert_old_apache_no_chain(self):
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, version=(2, 4, 7))
self.config = self.mock_deploy_cert(self.config)
self.config.parser.modules.add("ssl_module")
self.config.parser.modules.add("mod_ssl.c")
# Get the default 443 vhost
self.config.assoc["random.demo"] = self.vh_truth[1]
self.assertRaises(errors.PluginError,
lambda: self.config.deploy_cert(
"random.demo", "example/cert.pem",
"example/key.pem"))
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_ocsp_stapling_enable_mod(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
self.config.get_version = mock.Mock(return_value=(2, 4, 7))
mock_exe.return_value = True
# This will create an ssl vhost for certbot.demo
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "staple-ocsp")
self.assertTrue("socache_shmcb_module" in self.config.parser.modules)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_ensure_http_header_enable_mod(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
self.config.parser.modules.add("mod_ssl.c")
mock_exe.return_value = True
# This will create an ssl vhost for certbot.demo
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "ensure-http-header",
"Strict-Transport-Security")
self.assertTrue("headers_module" in self.config.parser.modules)
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
def test_redirect_enable_mod(self, mock_exe, _):
self.config.parser.update_runtime_variables = mock.Mock()
mock_exe.return_value = True
self.config.get_version = mock.Mock(return_value=(2, 2))
# This will create an ssl vhost for certbot.demo
self.config.choose_vhost("certbot.demo")
self.config.enhance("certbot.demo", "redirect")
self.assertTrue("rewrite_module" in self.config.parser.modules)
def test_enable_site_already_enabled(self):
self.assertTrue(self.vh_truth[1].enabled)
self.config.enable_site(self.vh_truth[1])
def test_enable_site_call_parent(self):
with mock.patch(
"certbot_apache.configurator.ApacheConfigurator.enable_site") as e_s:
self.config.parser.root = "/tmp/nonexistent"
vh = self.vh_truth[0]
vh.enabled = False
self.config.enable_site(vh)
self.assertTrue(e_s.called)
@mock.patch("certbot.util.exe_exists")
def test_enable_mod_no_disable(self, mock_exe_exists):
mock_exe_exists.return_value = False
self.assertRaises(
errors.MisconfigurationError, self.config.enable_mod, "ssl")
if __name__ == "__main__":
unittest.main() # pragma: no cover


@@ -11,9 +11,39 @@ from certbot.tests import util as certbot_util
from certbot_apache import obj
from certbot_apache.display_ops import select_vhost_multiple
from certbot_apache.tests import util
class SelectVhostMultiTest(unittest.TestCase):
"""Tests for certbot_apache.display_ops.select_vhost_multiple."""
def setUp(self):
self.base_dir = "/example_path"
self.vhosts = util.get_vh_truth(
self.base_dir, "debian_apache_2_4/multiple_vhosts")
def test_select_no_input(self):
self.assertFalse(select_vhost_multiple([]))
@certbot_util.patch_get_utility()
def test_select_correct(self, mock_util):
mock_util().checklist.return_value = (
display_util.OK, [self.vhosts[3].display_repr(),
self.vhosts[2].display_repr()])
vhs = select_vhost_multiple([self.vhosts[3],
self.vhosts[2],
self.vhosts[1]])
self.assertTrue(self.vhosts[2] in vhs)
self.assertTrue(self.vhosts[3] in vhs)
self.assertFalse(self.vhosts[1] in vhs)
@certbot_util.patch_get_utility()
def test_select_cancel(self, mock_util):
mock_util().checklist.return_value = (display_util.CANCEL, "whatever")
vhs = select_vhost_multiple([self.vhosts[2], self.vhosts[3]])
self.assertFalse(vhs)
class SelectVhostTest(unittest.TestCase):
"""Tests for certbot_apache.display_ops.select_vhost."""


@@ -0,0 +1,41 @@
"""Test for certbot_apache.entrypoint for override class resolution"""
import unittest
import mock
from certbot_apache import configurator
from certbot_apache import entrypoint
class EntryPointTest(unittest.TestCase):
"""Entrypoint tests"""
_multiprocess_can_split_ = True
def test_get_configurator(self):
with mock.patch("certbot.util.get_os_info") as mock_info:
for distro in entrypoint.OVERRIDE_CLASSES.keys():
mock_info.return_value = (distro, "whatever")
self.assertEqual(entrypoint.get_configurator(),
entrypoint.OVERRIDE_CLASSES[distro])
def test_nonexistent_like(self):
with mock.patch("certbot.util.get_os_info") as mock_info:
mock_info.return_value = ("nonexistent", "irrelevant")
with mock.patch("certbot.util.get_systemd_os_like") as mock_like:
for like in entrypoint.OVERRIDE_CLASSES.keys():
mock_like.return_value = [like]
self.assertEqual(entrypoint.get_configurator(),
entrypoint.OVERRIDE_CLASSES[like])
def test_nonexistent_generic(self):
with mock.patch("certbot.util.get_os_info") as mock_info:
mock_info.return_value = ("nonexistent", "irrelevant")
with mock.patch("certbot.util.get_systemd_os_like") as mock_like:
mock_like.return_value = ["unknonwn"]
self.assertEqual(entrypoint.get_configurator(),
configurator.ApacheConfigurator)
if __name__ == "__main__":
unittest.main() # pragma: no cover


@@ -0,0 +1,135 @@
"""Test for certbot_apache.configurator for Gentoo overrides"""
import os
import unittest
import mock
from certbot import errors
from certbot_apache import override_gentoo
from certbot_apache import obj
from certbot_apache.tests import util
def get_vh_truth(temp_dir, config_name):
"""Return the ground truth for the specified directory."""
prefix = os.path.join(
temp_dir, config_name, "apache2/vhosts.d")
aug_pre = "/files" + prefix
vh_truth = [
obj.VirtualHost(
os.path.join(prefix, "gentoo.example.com.conf"),
os.path.join(aug_pre, "gentoo.example.com.conf/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
False, True, "gentoo.example.com"),
obj.VirtualHost(
os.path.join(prefix, "00_default_vhost.conf"),
os.path.join(aug_pre, "00_default_vhost.conf/IfDefine/VirtualHost"),
set([obj.Addr.fromstring("*:80")]),
False, True, "localhost"),
obj.VirtualHost(
os.path.join(prefix, "00_default_ssl_vhost.conf"),
os.path.join(aug_pre,
"00_default_ssl_vhost.conf" +
"/IfDefine/IfDefine/IfModule/VirtualHost"),
set([obj.Addr.fromstring("_default_:443")]),
True, True, "localhost")
]
return vh_truth
class MultipleVhostsTestGentoo(util.ApacheTest):
"""Multiple vhost tests for non-debian distro"""
_multiprocess_can_split_ = True
def setUp(self): # pylint: disable=arguments-differ
test_dir = "gentoo_apache/apache"
config_root = "gentoo_apache/apache/apache2"
vhost_root = "gentoo_apache/apache/apache2/vhosts.d"
super(MultipleVhostsTestGentoo, self).setUp(test_dir=test_dir,
config_root=config_root,
vhost_root=vhost_root)
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_runtime_variables"):
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir,
os_info="gentoo")
self.vh_truth = get_vh_truth(
self.temp_dir, "gentoo_apache/apache")
def test_get_parser(self):
self.assertTrue(isinstance(self.config.parser,
override_gentoo.GentooParser))
def test_get_virtual_hosts(self):
"""Make sure all vhosts are being properly found."""
vhs = self.config.get_virtual_hosts()
self.assertEqual(len(vhs), 3)
found = 0
for vhost in vhs:
for gentoo_truth in self.vh_truth:
if vhost == gentoo_truth:
found += 1
break
else:
raise Exception("Missed: %s" % vhost) # pragma: no cover
self.assertEqual(found, 3)
def test_get_sysconfig_vars(self):
"""Make sure we read the Gentoo APACHE2_OPTS variable correctly"""
defines = ['DEFAULT_VHOST', 'INFO',
'SSL', 'SSL_DEFAULT_VHOST', 'LANGUAGE']
self.config.parser.apacheconfig_filep = os.path.realpath(
os.path.join(self.config.parser.root, "../conf.d/apache2"))
self.config.parser.variables = {}
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_modules"):
self.config.parser.update_runtime_variables()
for define in defines:
self.assertTrue(define in self.config.parser.variables.keys())
@mock.patch("certbot_apache.parser.ApacheParser.parse_from_subprocess")
def test_no_binary_configdump(self, mock_subprocess):
"""Make sure we don't call binary dumps other than modules from Apache
as this is not supported in Gentoo currently"""
with mock.patch("certbot_apache.override_gentoo.GentooParser.update_modules"):
self.config.parser.update_runtime_variables()
self.config.parser.reset_modules()
self.assertFalse(mock_subprocess.called)
self.config.parser.update_runtime_variables()
self.config.parser.reset_modules()
self.assertTrue(mock_subprocess.called)
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
mod_val = (
'Loaded Modules:\n'
' mock_module (static)\n'
' another_module (static)\n'
)
def mock_get_cfg(command):
"""Mock httpd process stdout"""
if command == ['apache2ctl', 'modules']:
return mod_val
mock_get.side_effect = mock_get_cfg
self.config.parser.modules = set()
with mock.patch("certbot.util.get_os_info") as mock_osi:
# Make sure we have the Gentoo httpd constants
mock_osi.return_value = ("gentoo", "123")
self.config.parser.update_runtime_variables()
self.assertEquals(mock_get.call_count, 1)
self.assertEquals(len(self.config.parser.modules), 4)
self.assertTrue("mod_another.c" in self.config.parser.modules)
@mock.patch("certbot_apache.configurator.util.run_script")
def test_alt_restart_works(self, mock_run_script):
mock_run_script.side_effect = [None, errors.SubprocessError, None]
self.config.restart()
self.assertEquals(mock_run_script.call_count, 3)
if __name__ == "__main__":
unittest.main() # pragma: no cover
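
The sysconfig test above reads defines such as SSL and DEFAULT_VHOST out of the APACHE2_OPTS variable in Gentoo's /etc/conf.d/apache2. A hedged, standalone sketch of that kind of parsing, not the GentooParser's actual implementation (the function name is illustrative):

    def parse_apache2_opts(conf_text):
        """Extract '-D NAME' defines from a Gentoo-style APACHE2_OPTS setting."""
        defines = set()
        for line in conf_text.splitlines():
            line = line.strip()
            if not line.startswith("APACHE2_OPTS="):
                continue
            value = line.split("=", 1)[1].strip().strip('"').strip("'")
            tokens = value.split()
            for i, token in enumerate(tokens):
                if token == "-D" and i + 1 < len(tokens):
                    defines.add(tokens[i + 1])
                elif token.startswith("-D") and len(token) > 2:
                    defines.add(token[2:])
        return defines

    sample = 'APACHE2_OPTS="-D DEFAULT_VHOST -D INFO -D SSL -D SSL_DEFAULT_VHOST -D LANGUAGE"'
    assert parse_apache2_opts(sample) == set(
        ["DEFAULT_VHOST", "INFO", "SSL", "SSL_DEFAULT_VHOST", "LANGUAGE"])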

View File

@@ -0,0 +1,206 @@
"""Test for certbot_apache.http_01."""
import mock
import os
import unittest
from acme import challenges
from certbot import achallenges
from certbot import errors
from certbot.tests import acme_util
from certbot_apache.tests import util
NUM_ACHALLS = 3
class ApacheHttp01Test(util.ApacheTest):
"""Test for certbot_apache.http_01.ApacheHttp01."""
def setUp(self, *args, **kwargs):
super(ApacheHttp01Test, self).setUp(*args, **kwargs)
self.account_key = self.rsa512jwk
self.achalls = []
vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
# Takes the vhosts for encryption-example.demo, certbot.demo, and
# vhost.in.rootconf
self.vhosts = [vh_truth[0], vh_truth[3], vh_truth[10]]
for i in range(NUM_ACHALLS):
self.achalls.append(
achallenges.KeyAuthorizationAnnotatedChallenge(
challb=acme_util.chall_to_challb(
challenges.HTTP01(token=((chr(ord('a') + i).encode() * 16))),
"pending"),
domain=self.vhosts[i].name, account_key=self.account_key))
modules = ["rewrite", "authz_core", "authz_host"]
for mod in modules:
self.config.parser.modules.add("mod_{0}.c".format(mod))
self.config.parser.modules.add(mod + "_module")
from certbot_apache.http_01 import ApacheHttp01
self.http = ApacheHttp01(self.config)
def test_empty_perform(self):
self.assertFalse(self.http.perform())
@mock.patch("certbot_apache.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_2(self, mock_enmod):
self.config.version = (2, 2)
self.config.parser.modules.remove("authz_host_module")
self.config.parser.modules.remove("mod_authz_host.c")
enmod_calls = self.common_enable_modules_test(mock_enmod)
self.assertEqual(enmod_calls[0][0][0], "authz_host")
@mock.patch("certbot_apache.configurator.ApacheConfigurator.enable_mod")
def test_enable_modules_apache_2_4(self, mock_enmod):
self.config.parser.modules.remove("authz_core_module")
self.config.parser.modules.remove("mod_authz_core.c")
enmod_calls = self.common_enable_modules_test(mock_enmod)
self.assertEqual(enmod_calls[0][0][0], "authz_core")
def common_enable_modules_test(self, mock_enmod):
"""Tests enabling mod_rewrite and other modules."""
self.config.parser.modules.remove("rewrite_module")
self.config.parser.modules.remove("mod_rewrite.c")
self.http.prepare_http01_modules()
self.assertTrue(mock_enmod.called)
calls = mock_enmod.call_args_list
other_calls = []
for call in calls:
if "rewrite" != call[0][0]:
other_calls.append(call)
# If these lists are equal, we never enabled mod_rewrite
self.assertNotEqual(calls, other_calls)
return other_calls
def test_same_vhost(self):
vhost = next(v for v in self.config.vhosts if v.name == "certbot.demo")
achalls = [
achallenges.KeyAuthorizationAnnotatedChallenge(
challb=acme_util.chall_to_challb(
challenges.HTTP01(token=((b'a' * 16))),
"pending"),
domain=vhost.name, account_key=self.account_key),
achallenges.KeyAuthorizationAnnotatedChallenge(
challb=acme_util.chall_to_challb(
challenges.HTTP01(token=((b'b' * 16))),
"pending"),
domain=next(iter(vhost.aliases)), account_key=self.account_key)
]
self.common_perform_test(achalls, [vhost])
def test_anonymous_vhost(self):
vhosts = [v for v in self.config.vhosts if not v.ssl]
achalls = [
achallenges.KeyAuthorizationAnnotatedChallenge(
challb=acme_util.chall_to_challb(
challenges.HTTP01(token=((b'a' * 16))),
"pending"),
domain="something.nonexistent", account_key=self.account_key)]
self.common_perform_test(achalls, vhosts)
def test_no_vhost(self):
for achall in self.achalls:
self.http.add_chall(achall)
self.config.config.http01_port = 12345
self.assertRaises(errors.PluginError, self.http.perform)
def test_perform_1_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=1, minor_version=2)
def test_perform_1_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=1, minor_version=4)
def test_perform_2_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=2, minor_version=2)
def test_perform_2_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=2, minor_version=4)
def test_perform_3_achall_apache_2_2(self):
self.combinations_perform_test(num_achalls=3, minor_version=2)
def test_perform_3_achall_apache_2_4(self):
self.combinations_perform_test(num_achalls=3, minor_version=4)
def combinations_perform_test(self, num_achalls, minor_version):
"""Test perform with the given achall count and Apache version."""
achalls = self.achalls[:num_achalls]
vhosts = self.vhosts[:num_achalls]
self.config.version = (2, minor_version)
self.common_perform_test(achalls, vhosts)
def common_perform_test(self, achalls, vhosts):
"""Tests perform with the given achalls."""
challenge_dir = self.http.challenge_dir
self.assertFalse(os.path.exists(challenge_dir))
for achall in achalls:
self.http.add_chall(achall)
expected_response = [
achall.response(self.account_key) for achall in achalls]
self.assertEqual(self.http.perform(), expected_response)
self.assertTrue(os.path.isdir(self.http.challenge_dir))
self._has_min_permissions(self.http.challenge_dir, 0o755)
self._test_challenge_conf()
for achall in achalls:
self._test_challenge_file(achall)
for vhost in vhosts:
if not vhost.ssl:
matches = self.config.parser.find_dir("Include",
self.http.challenge_conf_pre,
vhost.path)
self.assertEqual(len(matches), 1)
matches = self.config.parser.find_dir("Include",
self.http.challenge_conf_post,
vhost.path)
self.assertEqual(len(matches), 1)
self.assertTrue(os.path.exists(challenge_dir))
def _test_challenge_conf(self):
with open(self.http.challenge_conf_pre) as f:
pre_conf_contents = f.read()
with open(self.http.challenge_conf_post) as f:
post_conf_contents = f.read()
self.assertTrue("RewriteEngine on" in pre_conf_contents)
self.assertTrue("RewriteRule" in pre_conf_contents)
self.assertTrue(self.http.challenge_dir in post_conf_contents)
if self.config.version < (2, 4):
self.assertTrue("Allow from all" in post_conf_contents)
else:
self.assertTrue("Require all granted" in post_conf_contents)
def _test_challenge_file(self, achall):
name = os.path.join(self.http.challenge_dir, achall.chall.encode("token"))
validation = achall.validation(self.account_key)
self._has_min_permissions(name, 0o644)
with open(name, 'rb') as f:
self.assertEqual(f.read(), validation.encode())
def _has_min_permissions(self, path, min_mode):
"""Tests the given file has at least the permissions in mode."""
st_mode = os.stat(path).st_mode
self.assertEqual(st_mode, st_mode | min_mode)
if __name__ == "__main__":
unittest.main() # pragma: no cover
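
Two details the perform tests above depend on are easy to miss: each challenge file is named after the token and contains the key authorization, and "has at least these permissions" is a bitwise check. A small self-contained sketch of both, independent of the plugin's actual ApacheHttp01 class (the thumbprint suffix below is a placeholder):

    import os
    import tempfile

    def write_challenge_file(challenge_dir, token, key_authorization):
        """Write <challenge_dir>/<token> with the key authorization, world-readable."""
        path = os.path.join(challenge_dir, token)
        with open(path, "w") as validation_file:
            validation_file.write(key_authorization)
        os.chmod(path, 0o644)
        return path

    def has_min_permissions(path, min_mode):
        """True if every permission bit in min_mode is set on the path."""
        st_mode = os.stat(path).st_mode
        return st_mode == st_mode | min_mode

    challenge_dir = tempfile.mkdtemp()
    os.chmod(challenge_dir, 0o755)
    token = "a" * 16
    path = write_challenge_file(challenge_dir, token, token + ".example-thumbprint")
    assert has_min_permissions(challenge_dir, 0o755)
    assert has_min_permissions(path, 0o644)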

View File

@@ -38,7 +38,7 @@ class BasicParserTest(util.ParserTest):
file_path = os.path.join(
self.config_path, "not-parsed-by-default", "certbot.conf")
- self.parser._parse_file(file_path) # pylint: disable=protected-access
+ self.parser.parse_file(file_path) # pylint: disable=protected-access
# search for the httpd incl
matches = self.parser.aug.match(
@@ -52,7 +52,7 @@ class BasicParserTest(util.ParserTest):
test2 = self.parser.find_dir("documentroot")
self.assertEqual(len(test), 1)
- self.assertEqual(len(test2), 4)
+ self.assertEqual(len(test2), 7)
def test_add_dir(self):
aug_default = "/files" + self.parser.loc["default"]
@@ -66,6 +66,27 @@ class BasicParserTest(util.ParserTest):
for i, match in enumerate(matches):
self.assertEqual(self.parser.aug.get(match), str(i + 1))
def test_add_dir_beginning(self):
aug_default = "/files" + self.parser.loc["default"]
self.parser.add_dir_beginning(aug_default,
"AddDirectiveBeginning",
"testBegin")
self.assertTrue(
self.parser.find_dir("AddDirectiveBeginning", "testBegin", aug_default))
self.assertEqual(
self.parser.aug.get(aug_default+"/directive[1]"),
"AddDirectiveBeginning")
self.parser.add_dir_beginning(aug_default, "AddList", ["1", "2", "3", "4"])
matches = self.parser.find_dir("AddList", None, aug_default)
for i, match in enumerate(matches):
self.assertEqual(self.parser.aug.get(match), str(i + 1))
def test_empty_arg(self):
self.assertEquals(None,
self.parser.get_arg("/files/whatever/nonexistent"))
def test_add_dir_to_ifmodssl(self):
"""test add_dir_to_ifmodssl.
@@ -114,9 +135,20 @@ class BasicParserTest(util.ParserTest):
self.assertEqual(results["default"], results["listen"])
self.assertEqual(results["default"], results["name"])
@mock.patch("certbot_apache.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache.parser.ApacheParser.get_arg")
def test_parse_modules_bad_syntax(self, mock_arg, mock_find):
mock_find.return_value = ["1", "2", "3", "4", "5", "6", "7", "8"]
mock_arg.return_value = None
with mock.patch("certbot_apache.parser.logger") as mock_logger:
self.parser.parse_modules()
# Make sure that we got a None return value and logged the file
self.assertTrue(mock_logger.debug.called)
@mock.patch("certbot_apache.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
- def test_update_runtime_variables(self, mock_cfg):
- mock_cfg.return_value = (
+ def test_update_runtime_variables(self, mock_cfg, _):
+ define_val = (
'ServerRoot: "/etc/apache2"\n'
'Main DocumentRoot: "/var/www"\n'
'Main ErrorLog: "/var/log/apache2/error.log"\n'
@@ -133,11 +165,113 @@ class BasicParserTest(util.ParserTest):
'User: name="www-data" id=33 not_used\n'
'Group: name="www-data" id=33 not_used\n'
)
inc_val = (
'Included configuration files:\n'
' (*) /etc/apache2/apache2.conf\n'
' (146) /etc/apache2/mods-enabled/access_compat.load\n'
' (146) /etc/apache2/mods-enabled/alias.load\n'
' (146) /etc/apache2/mods-enabled/auth_basic.load\n'
' (146) /etc/apache2/mods-enabled/authn_core.load\n'
' (146) /etc/apache2/mods-enabled/authn_file.load\n'
' (146) /etc/apache2/mods-enabled/authz_core.load\n'
' (146) /etc/apache2/mods-enabled/authz_host.load\n'
' (146) /etc/apache2/mods-enabled/authz_user.load\n'
' (146) /etc/apache2/mods-enabled/autoindex.load\n'
' (146) /etc/apache2/mods-enabled/deflate.load\n'
' (146) /etc/apache2/mods-enabled/dir.load\n'
' (146) /etc/apache2/mods-enabled/env.load\n'
' (146) /etc/apache2/mods-enabled/filter.load\n'
' (146) /etc/apache2/mods-enabled/mime.load\n'
' (146) /etc/apache2/mods-enabled/mpm_event.load\n'
' (146) /etc/apache2/mods-enabled/negotiation.load\n'
' (146) /etc/apache2/mods-enabled/reqtimeout.load\n'
' (146) /etc/apache2/mods-enabled/setenvif.load\n'
' (146) /etc/apache2/mods-enabled/socache_shmcb.load\n'
' (146) /etc/apache2/mods-enabled/ssl.load\n'
' (146) /etc/apache2/mods-enabled/status.load\n'
' (147) /etc/apache2/mods-enabled/alias.conf\n'
' (147) /etc/apache2/mods-enabled/autoindex.conf\n'
' (147) /etc/apache2/mods-enabled/deflate.conf\n'
)
mod_val = (
'Loaded Modules:\n'
' core_module (static)\n'
' so_module (static)\n'
' watchdog_module (static)\n'
' http_module (static)\n'
' log_config_module (static)\n'
' logio_module (static)\n'
' version_module (static)\n'
' unixd_module (static)\n'
' access_compat_module (shared)\n'
' alias_module (shared)\n'
' auth_basic_module (shared)\n'
' authn_core_module (shared)\n'
' authn_file_module (shared)\n'
' authz_core_module (shared)\n'
' authz_host_module (shared)\n'
' authz_user_module (shared)\n'
' autoindex_module (shared)\n'
' deflate_module (shared)\n'
' dir_module (shared)\n'
' env_module (shared)\n'
' filter_module (shared)\n'
' mime_module (shared)\n'
' mpm_event_module (shared)\n'
' negotiation_module (shared)\n'
' reqtimeout_module (shared)\n'
' setenvif_module (shared)\n'
' socache_shmcb_module (shared)\n'
' ssl_module (shared)\n'
' status_module (shared)\n'
)
def mock_get_vars(cmd):
"""Mock command output"""
if cmd[-1] == "DUMP_RUN_CFG":
return define_val
elif cmd[-1] == "DUMP_INCLUDES":
return inc_val
elif cmd[-1] == "DUMP_MODULES":
return mod_val
mock_cfg.side_effect = mock_get_vars
expected_vars = {"TEST": "", "U_MICH": "", "TLS": "443",
"example_path": "Documents/path"}
self.parser.update_runtime_variables()
self.assertEqual(self.parser.variables, expected_vars)
self.parser.modules = set()
with mock.patch(
"certbot_apache.parser.ApacheParser.parse_file") as mock_parse:
self.parser.update_runtime_variables()
self.assertEqual(self.parser.variables, expected_vars)
self.assertEqual(len(self.parser.modules), 58)
# None of the includes in inc_val should be in parsed paths.
# Make sure we tried to include them all.
self.assertEqual(mock_parse.call_count, 25)
@mock.patch("certbot_apache.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_update_runtime_variables_alt_values(self, mock_cfg, _):
inc_val = (
'Included configuration files:\n'
' (*) {0}\n'
' (146) /etc/apache2/mods-enabled/access_compat.load\n'
' (146) {1}/mods-enabled/alias.load\n'
).format(self.parser.loc["root"],
os.path.dirname(self.parser.loc["root"]))
mock_cfg.return_value = inc_val
self.parser.modules = set()
with mock.patch(
"certbot_apache.parser.ApacheParser.parse_file") as mock_parse:
self.parser.update_runtime_variables()
# No matching modules should have been found
self.assertEqual(len(self.parser.modules), 0)
# Only one of the three includes does not already exist in the parsed
# paths derived from the root configuration's Include statements
self.assertEqual(mock_parse.call_count, 1)
@mock.patch("certbot_apache.parser.ApacheParser._get_runtime_cfg")
def test_update_runtime_vars_bad_output(self, mock_cfg):
@@ -148,7 +282,7 @@ class BasicParserTest(util.ParserTest):
self.assertRaises(
errors.PluginError, self.parser.update_runtime_variables)
@mock.patch("certbot_apache.constants.os_constant")
@mock.patch("certbot_apache.configurator.ApacheConfigurator.constant")
@mock.patch("certbot_apache.parser.subprocess.Popen")
def test_update_runtime_vars_bad_ctl(self, mock_popen, mock_const):
mock_popen.side_effect = OSError
@@ -184,7 +318,7 @@ class ParserInitTest(util.ApacheTest):
self.assertRaises(
errors.PluginError,
ApacheParser, self.aug, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 2, 22))
"/dummy/vhostpath", version=(2, 2, 22), configurator=self.config)
def test_root_normalized(self):
from certbot_apache.parser import ApacheParser
@@ -196,7 +330,7 @@ class ParserInitTest(util.ApacheTest):
"debian_apache_2_4/////multiple_vhosts/../multiple_vhosts/apache2")
parser = ApacheParser(self.aug, path,
"/dummy/vhostpath")
"/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)
@@ -206,7 +340,7 @@ class ParserInitTest(util.ApacheTest):
"update_runtime_variables"):
parser = ApacheParser(
self.aug, os.path.relpath(self.config_path),
"/dummy/vhostpath")
"/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)
@@ -216,7 +350,7 @@ class ParserInitTest(util.ApacheTest):
"update_runtime_variables"):
parser = ApacheParser(
self.aug, self.config_path + os.path.sep,
"/dummy/vhostpath")
"/dummy/vhostpath", configurator=self.config)
self.assertEqual(parser.root, self.config_path)
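
The mod_val blob in the runtime-variable tests above is the output of Apache's -t -D DUMP_MODULES dump (via apachectl/apache2ctl). The plugin records each loaded module under two names, <name>_module and mod_<name>.c, which is why the 29 modules listed become the 58 entries the test asserts. A hedged standalone sketch of that conversion, not the parser's real method:

    def modules_from_dump(dump_output):
        """Turn a '-D DUMP_MODULES' dump into the double-keyed module set."""
        modules = set()
        for line in dump_output.splitlines():
            line = line.strip()
            if not line or line.startswith("Loaded Modules"):
                continue
            name = line.split()[0]                                   # e.g. "ssl_module"
            if name.endswith("_module"):
                modules.add(name)                                    # "ssl_module"
                modules.add("mod_" + name[:-len("_module")] + ".c")  # "mod_ssl.c"
        return modules

    dump = (
        "Loaded Modules:\n"
        " core_module (static)\n"
        " ssl_module (shared)\n"
    )
    assert modules_from_dump(dump) == set(
        ["core_module", "mod_core.c", "ssl_module", "mod_ssl.c"])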

View File

@@ -0,0 +1,9 @@
This directory holds configuration files for the Apache HTTP Server;
any files in this directory which have the ".conf" extension will be
processed as httpd configuration files. The directory is used in
addition to the directory /etc/httpd/conf.modules.d/, which contains
configuration files necessary to load modules.
Files are processed in alphabetical order.
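
As a rough illustration of the alphabetical processing described above (which the IncludeOptional conf.d/*.conf line in the main httpd.conf later in this diff relies on), a trivial helper, not part of certbot:

    import glob
    import os

    def conf_d_files(conf_d="/etc/httpd/conf.d"):
        """Return the *.conf files in the order a wildcard Include processes them."""
        return sorted(glob.glob(os.path.join(conf_d, "*.conf")))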

View File

@@ -0,0 +1,94 @@
#
# Directives controlling the display of server-generated directory listings.
#
# Required modules: mod_authz_core, mod_authz_host,
# mod_autoindex, mod_alias
#
# To see the listing of a directory, the Options directive for the
# directory must include "Indexes", and the directory must not contain
# a file matching those listed in the DirectoryIndex directive.
#
#
# IndexOptions: Controls the appearance of server-generated directory
# listings.
#
IndexOptions FancyIndexing HTMLTable VersionSort
# We include the /icons/ alias for FancyIndexed directory listings. If
# you do not use FancyIndexing, you may comment this out.
#
Alias /icons/ "/usr/share/httpd/icons/"
<Directory "/usr/share/httpd/icons">
Options Indexes MultiViews FollowSymlinks
AllowOverride None
Require all granted
</Directory>
#
# AddIcon* directives tell the server which icon to show for different
# files or filename extensions. These are only displayed for
# FancyIndexed directories.
#
AddIconByEncoding (CMP,/icons/compressed.gif) x-compress x-gzip
AddIconByType (TXT,/icons/text.gif) text/*
AddIconByType (IMG,/icons/image2.gif) image/*
AddIconByType (SND,/icons/sound2.gif) audio/*
AddIconByType (VID,/icons/movie.gif) video/*
AddIcon /icons/binary.gif .bin .exe
AddIcon /icons/binhex.gif .hqx
AddIcon /icons/tar.gif .tar
AddIcon /icons/world2.gif .wrl .wrl.gz .vrml .vrm .iv
AddIcon /icons/compressed.gif .Z .z .tgz .gz .zip
AddIcon /icons/a.gif .ps .ai .eps
AddIcon /icons/layout.gif .html .shtml .htm .pdf
AddIcon /icons/text.gif .txt
AddIcon /icons/c.gif .c
AddIcon /icons/p.gif .pl .py
AddIcon /icons/f.gif .for
AddIcon /icons/dvi.gif .dvi
AddIcon /icons/uuencoded.gif .uu
AddIcon /icons/script.gif .conf .sh .shar .csh .ksh .tcl
AddIcon /icons/tex.gif .tex
AddIcon /icons/bomb.gif /core
AddIcon /icons/bomb.gif */core.*
AddIcon /icons/back.gif ..
AddIcon /icons/hand.right.gif README
AddIcon /icons/folder.gif ^^DIRECTORY^^
AddIcon /icons/blank.gif ^^BLANKICON^^
#
# DefaultIcon is which icon to show for files which do not have an icon
# explicitly set.
#
DefaultIcon /icons/unknown.gif
#
# AddDescription allows you to place a short description after a file in
# server-generated indexes. These are only displayed for FancyIndexed
# directories.
# Format: AddDescription "description" filename
#
#AddDescription "GZIP compressed document" .gz
#AddDescription "tar archive" .tar
#AddDescription "GZIP compressed tar archive" .tgz
#
# ReadmeName is the name of the README file the server will look for by
# default, and append to directory listings.
#
# HeaderName is the name of a file which should be prepended to
# directory indexes.
ReadmeName README.html
HeaderName HEADER.html
#
# IndexIgnore is a set of filenames which directory indexing should ignore
# and not include in the listing. Shell-style wildcarding is permitted.
#
IndexIgnore .??* *~ *# HEADER* README* RCS CVS *,v *,t

View File

@@ -0,0 +1,7 @@
<VirtualHost *:80>
ServerName centos.example.com
ServerAdmin webmaster@localhost
DocumentRoot /var/www/html
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>
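
For reference, a fixture vhost like the one above is what the ground-truth helpers earlier in this diff (get_vh_truth) describe as an obj.VirtualHost: file path, Augeas path, address set, ssl/enabled flags and ServerName. A hedged sketch of the equivalent entry; the prefix path is a placeholder for the tests' temporary config directory, and the snippet assumes certbot_apache is importable:

    import os
    from certbot_apache import obj

    prefix = "/tmp/centos_apache/apache/httpd/conf.d"  # placeholder path
    aug_pre = "/files" + prefix
    vhost = obj.VirtualHost(
        os.path.join(prefix, "centos.example.com.conf"),
        os.path.join(aug_pre, "centos.example.com.conf/VirtualHost"),
        set([obj.Addr.fromstring("*:80")]),
        False,   # ssl
        True,    # enabled
        "centos.example.com")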

View File

@@ -0,0 +1,211 @@
#
# When we also provide SSL we have to listen to the
# the HTTPS port in addition.
#
Listen 443 https
##
## SSL Global Context
##
## All SSL configuration in this context applies both to
## the main server and all SSL-enabled virtual hosts.
##
# Pass Phrase Dialog:
# Configure the pass phrase gathering process.
# The filtering dialog program (`builtin' is an internal
# terminal dialog) has to provide the pass phrase on stdout.
SSLPassPhraseDialog exec:/usr/libexec/httpd-ssl-pass-dialog
# Inter-Process Session Cache:
# Configure the SSL Session Cache: First the mechanism
# to use and second the expiring timeout (in seconds).
SSLSessionCache shmcb:/run/httpd/sslcache(512000)
SSLSessionCacheTimeout 300
# Pseudo Random Number Generator (PRNG):
# Configure one or more sources to seed the PRNG of the
# SSL library. The seed data should be of good random quality.
# WARNING! On some platforms /dev/random blocks if not enough entropy
# is available. This means you then cannot use the /dev/random device
# because it would lead to very long connection times (as long as
# it requires to make more entropy available). But usually those
# platforms additionally provide a /dev/urandom device which doesn't
# block. So, if available, use this one instead. Read the mod_ssl User
# Manual for more details.
SSLRandomSeed startup file:/dev/urandom 256
SSLRandomSeed connect builtin
#SSLRandomSeed startup file:/dev/random 512
#SSLRandomSeed connect file:/dev/random 512
#SSLRandomSeed connect file:/dev/urandom 512
#
# Use "SSLCryptoDevice" to enable any supported hardware
# accelerators. Use "openssl engine -v" to list supported
# engine names. NOTE: If you enable an accelerator and the
# server does not start, consult the error logs and ensure
# your accelerator is functioning properly.
#
SSLCryptoDevice builtin
#SSLCryptoDevice ubsec
##
## SSL Virtual Host Context
##
<VirtualHost _default_:443>
# General setup for the virtual host, inherited from global configuration
#DocumentRoot "/var/www/html"
#ServerName www.example.com:443
# Use separate log files for the SSL virtual host; note that LogLevel
# is not inherited from httpd.conf.
ErrorLog logs/ssl_error_log
TransferLog logs/ssl_access_log
LogLevel warn
# SSL Engine Switch:
# Enable/Disable SSL for this virtual host.
SSLEngine on
# SSL Protocol support:
# List the enable protocol levels with which clients will be able to
# connect. Disable SSLv2 access by default:
SSLProtocol all -SSLv2
# SSL Cipher Suite:
# List the ciphers that the client is permitted to negotiate.
# See the mod_ssl documentation for a complete list.
SSLCipherSuite HIGH:MEDIUM:!aNULL:!MD5:!SEED:!IDEA
# Speed-optimized SSL Cipher configuration:
# If speed is your main concern (on busy HTTPS servers e.g.),
# you might want to force clients to specific, performance
# optimized ciphers. In this case, prepend those ciphers
# to the SSLCipherSuite list, and enable SSLHonorCipherOrder.
# Caveat: by giving precedence to RC4-SHA and AES128-SHA
# (as in the example below), most connections will no longer
# have perfect forward secrecy - if the server's key is
# compromised, captures of past or future traffic must be
# considered compromised, too.
#SSLCipherSuite RC4-SHA:AES128-SHA:HIGH:MEDIUM:!aNULL:!MD5
#SSLHonorCipherOrder on
# Server Certificate:
# Point SSLCertificateFile at a PEM encoded certificate. If
# the certificate is encrypted, then you will be prompted for a
# pass phrase. Note that a kill -HUP will prompt again. A new
# certificate can be generated using the genkey(1) command.
# Server Private Key:
# If the key is not combined with the certificate, use this
# directive to point at the key file. Keep in mind that if
# you've both a RSA and a DSA private key you can configure
# both in parallel (to also allow the use of DSA ciphers, etc.)
# Server Certificate Chain:
# Point SSLCertificateChainFile at a file containing the
# concatenation of PEM encoded CA certificates which form the
# certificate chain for the server certificate. Alternatively
# the referenced file can be the same as SSLCertificateFile
# when the CA certificates are directly appended to the server
# certificate for convenience.
#SSLCertificateChainFile /etc/pki/tls/certs/server-chain.crt
# Certificate Authority (CA):
# Set the CA certificate verification path where to find CA
# certificates for client authentication or alternatively one
# huge file containing all of them (file must be PEM encoded)
#SSLCACertificateFile /etc/pki/tls/certs/ca-bundle.crt
# Client Authentication (Type):
# Client certificate verification type and depth. Types are
# none, optional, require and optional_no_ca. Depth is a
# number which specifies how deeply to verify the certificate
# issuer chain before deciding the certificate is not valid.
#SSLVerifyClient require
#SSLVerifyDepth 10
# Access Control:
# With SSLRequire you can do per-directory access control based
# on arbitrary complex boolean expressions containing server
# variable checks and other lookup directives. The syntax is a
# mixture between C and Perl. See the mod_ssl documentation
# for more details.
#<Location />
#SSLRequire ( %{SSL_CIPHER} !~ m/^(EXP|NULL)/ \
# and %{SSL_CLIENT_S_DN_O} eq "Snake Oil, Ltd." \
# and %{SSL_CLIENT_S_DN_OU} in {"Staff", "CA", "Dev"} \
# and %{TIME_WDAY} >= 1 and %{TIME_WDAY} <= 5 \
# and %{TIME_HOUR} >= 8 and %{TIME_HOUR} <= 20 ) \
# or %{REMOTE_ADDR} =~ m/^192\.76\.162\.[0-9]+$/
#</Location>
# SSL Engine Options:
# Set various options for the SSL engine.
# o FakeBasicAuth:
# Translate the client X.509 into a Basic Authorisation. This means that
# the standard Auth/DBMAuth methods can be used for access control. The
# user name is the `one line' version of the client's X.509 certificate.
# Note that no password is obtained from the user. Every entry in the user
# file needs this password: `xxj31ZMTZzkVA'.
# o ExportCertData:
# This exports two additional environment variables: SSL_CLIENT_CERT and
# SSL_SERVER_CERT. These contain the PEM-encoded certificates of the
# server (always existing) and the client (only existing when client
# authentication is used). This can be used to import the certificates
# into CGI scripts.
# o StdEnvVars:
# This exports the standard SSL/TLS related `SSL_*' environment variables.
# Per default this exportation is switched off for performance reasons,
# because the extraction step is an expensive operation and is usually
# useless for serving static content. So one usually enables the
# exportation for CGI and SSI requests only.
# o StrictRequire:
# This denies access when "SSLRequireSSL" or "SSLRequire" applied even
# under a "Satisfy any" situation, i.e. when it applies access is denied
# and no other module can change it.
# o OptRenegotiate:
# This enables optimized SSL connection renegotiation handling when SSL
# directives are used in per-directory context.
#SSLOptions +FakeBasicAuth +ExportCertData +StrictRequire
<Files ~ "\.(cgi|shtml|phtml|php3?)$">
SSLOptions +StdEnvVars
</Files>
<Directory "/var/www/cgi-bin">
SSLOptions +StdEnvVars
</Directory>
# SSL Protocol Adjustments:
# The safe and default but still SSL/TLS standard compliant shutdown
# approach is that mod_ssl sends the close notify alert but doesn't wait for
# the close notify alert from client. When you need a different shutdown
# approach you can use one of the following variables:
# o ssl-unclean-shutdown:
# This forces an unclean shutdown when the connection is closed, i.e. no
# SSL close notify alert is sent or allowed to be received. This violates
# the SSL/TLS standard but is needed for some brain-dead browsers. Use
# this when you receive I/O errors because of the standard approach where
# mod_ssl sends the close notify alert.
# o ssl-accurate-shutdown:
# This forces an accurate shutdown when the connection is closed, i.e. an
# SSL close notify alert is sent and mod_ssl waits for the close notify
# alert of the client. This is 100% SSL/TLS standard compliant, but in
# practice often causes hanging connections with brain-dead browsers. Use
# this only for browsers where you know that their SSL implementation
# works correctly.
# Notice: Most problems of broken clients are also related to the HTTP
# keep-alive facility, so you usually additionally want to disable
# keep-alive for those clients, too. Use variable "nokeepalive" for this.
# Similarly, one has to force some clients to use HTTP/1.0 to workaround
# their broken HTTP/1.1 implementation. Use variables "downgrade-1.0" and
# "force-response-1.0" for this.
BrowserMatch "MSIE [2-5]" nokeepalive ssl-unclean-shutdown downgrade-1.0 force-response-1.0
# Per-Server Logging:
# The home of a custom SSL log file. Use this when you want a
# compact non-error SSL logfile on a virtual host basis.
CustomLog logs/ssl_request_log "%t %h %{SSL_PROTOCOL}x %{SSL_CIPHER}x \"%r\" %b"
</VirtualHost>
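
In the plugin's tests, SSL vhosts like the _default_:443 block above are typically recognized by an "SSLEngine on" directive under the vhost's Augeas path. A hedged sketch using the find_dir call shape shown elsewhere in this diff; parser and vhost_path are assumed to come from a configured ApacheConfigurator:

    def vhost_is_ssl(parser, vhost_path):
        """Return True if the vhost rooted at vhost_path enables SSLEngine."""
        return bool(parser.find_dir("SSLEngine", "on", vhost_path))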

View File

@@ -0,0 +1,36 @@
#
# UserDir: The name of the directory that is appended onto a user's home
# directory if a ~user request is received.
#
# The path to the end user account 'public_html' directory must be
# accessible to the webserver userid. This usually means that ~userid
# must have permissions of 711, ~userid/public_html must have permissions
# of 755, and documents contained therein must be world-readable.
# Otherwise, the client will only receive a "403 Forbidden" message.
#
<IfModule mod_userdir.c>
#
# UserDir is disabled by default since it can confirm the presence
# of a username on the system (depending on home directory
# permissions).
#
UserDir disabled
#
# To enable requests to /~user/ to serve the user's public_html
# directory, remove the "UserDir disabled" line above, and uncomment
# the following line instead:
#
#UserDir public_html
</IfModule>
#
# Control access to UserDir directories. The following is an example
# for a site where these directories are restricted to read-only.
#
<Directory "/home/*/public_html">
AllowOverride FileInfo AuthConfig Limit Indexes
Options MultiViews Indexes SymLinksIfOwnerMatch IncludesNoExec
Require method GET POST OPTIONS
</Directory>

View File

@@ -0,0 +1,22 @@
#
# This configuration file enables the default "Welcome" page if there
# is no default index page present for the root URL. To disable the
# Welcome page, comment out all the lines below.
#
# NOTE: if this file is removed, it will be restored on upgrades.
#
<LocationMatch "^/+$">
Options -Indexes
ErrorDocument 403 /.noindex.html
</LocationMatch>
<Directory /usr/share/httpd/noindex>
AllowOverride None
Require all granted
</Directory>
Alias /.noindex.html /usr/share/httpd/noindex/index.html
Alias /noindex/css/bootstrap.min.css /usr/share/httpd/noindex/css/bootstrap.min.css
Alias /noindex/css/open-sans.css /usr/share/httpd/noindex/css/open-sans.css
Alias /images/apache_pb.gif /usr/share/httpd/noindex/images/apache_pb.gif
Alias /images/poweredby.png /usr/share/httpd/noindex/images/poweredby.png

View File

@@ -0,0 +1,77 @@
#
# This file loads most of the modules included with the Apache HTTP
# Server itself.
#
LoadModule access_compat_module modules/mod_access_compat.so
LoadModule actions_module modules/mod_actions.so
LoadModule alias_module modules/mod_alias.so
LoadModule allowmethods_module modules/mod_allowmethods.so
LoadModule auth_basic_module modules/mod_auth_basic.so
LoadModule auth_digest_module modules/mod_auth_digest.so
LoadModule authn_anon_module modules/mod_authn_anon.so
LoadModule authn_core_module modules/mod_authn_core.so
LoadModule authn_dbd_module modules/mod_authn_dbd.so
LoadModule authn_dbm_module modules/mod_authn_dbm.so
LoadModule authn_file_module modules/mod_authn_file.so
LoadModule authn_socache_module modules/mod_authn_socache.so
LoadModule authz_core_module modules/mod_authz_core.so
LoadModule authz_dbd_module modules/mod_authz_dbd.so
LoadModule authz_dbm_module modules/mod_authz_dbm.so
LoadModule authz_groupfile_module modules/mod_authz_groupfile.so
LoadModule authz_host_module modules/mod_authz_host.so
LoadModule authz_owner_module modules/mod_authz_owner.so
LoadModule authz_user_module modules/mod_authz_user.so
LoadModule autoindex_module modules/mod_autoindex.so
LoadModule cache_module modules/mod_cache.so
LoadModule cache_disk_module modules/mod_cache_disk.so
LoadModule data_module modules/mod_data.so
LoadModule dbd_module modules/mod_dbd.so
LoadModule deflate_module modules/mod_deflate.so
LoadModule dir_module modules/mod_dir.so
LoadModule dumpio_module modules/mod_dumpio.so
LoadModule echo_module modules/mod_echo.so
LoadModule env_module modules/mod_env.so
LoadModule expires_module modules/mod_expires.so
LoadModule ext_filter_module modules/mod_ext_filter.so
LoadModule filter_module modules/mod_filter.so
LoadModule headers_module modules/mod_headers.so
LoadModule include_module modules/mod_include.so
LoadModule info_module modules/mod_info.so
LoadModule log_config_module modules/mod_log_config.so
LoadModule logio_module modules/mod_logio.so
LoadModule mime_magic_module modules/mod_mime_magic.so
LoadModule mime_module modules/mod_mime.so
LoadModule negotiation_module modules/mod_negotiation.so
LoadModule remoteip_module modules/mod_remoteip.so
LoadModule reqtimeout_module modules/mod_reqtimeout.so
LoadModule rewrite_module modules/mod_rewrite.so
LoadModule setenvif_module modules/mod_setenvif.so
LoadModule slotmem_plain_module modules/mod_slotmem_plain.so
LoadModule slotmem_shm_module modules/mod_slotmem_shm.so
LoadModule socache_dbm_module modules/mod_socache_dbm.so
LoadModule socache_memcache_module modules/mod_socache_memcache.so
LoadModule socache_shmcb_module modules/mod_socache_shmcb.so
LoadModule status_module modules/mod_status.so
LoadModule substitute_module modules/mod_substitute.so
LoadModule suexec_module modules/mod_suexec.so
LoadModule unique_id_module modules/mod_unique_id.so
LoadModule unixd_module modules/mod_unixd.so
LoadModule userdir_module modules/mod_userdir.so
LoadModule version_module modules/mod_version.so
LoadModule vhost_alias_module modules/mod_vhost_alias.so
#LoadModule buffer_module modules/mod_buffer.so
#LoadModule watchdog_module modules/mod_watchdog.so
#LoadModule heartbeat_module modules/mod_heartbeat.so
#LoadModule heartmonitor_module modules/mod_heartmonitor.so
#LoadModule usertrack_module modules/mod_usertrack.so
#LoadModule dialup_module modules/mod_dialup.so
#LoadModule charset_lite_module modules/mod_charset_lite.so
#LoadModule log_debug_module modules/mod_log_debug.so
#LoadModule ratelimit_module modules/mod_ratelimit.so
#LoadModule reflector_module modules/mod_reflector.so
#LoadModule request_module modules/mod_request.so
#LoadModule sed_module modules/mod_sed.so
#LoadModule speling_module modules/mod_speling.so
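
LoadModule lines like the ones above are the static counterpart to the runtime module dump handled earlier: each "LoadModule foo_module modules/mod_foo.so" contributes both foo_module and mod_foo.c to the module set the tests inspect. A small standalone sketch, not the parser's actual implementation:

    def modules_from_loadmodule_lines(config_text):
        """Collect module identifiers from LoadModule directives."""
        modules = set()
        for line in config_text.splitlines():
            parts = line.split()
            if len(parts) >= 3 and parts[0] == "LoadModule":
                name = parts[1]                                      # e.g. "ssl_module"
                modules.add(name)
                if name.endswith("_module"):
                    modules.add("mod_" + name[:-len("_module")] + ".c")
        return modules

    sample = ("LoadModule ssl_module modules/mod_ssl.so\n"
              "#LoadModule buffer_module modules/mod_buffer.so\n")
    assert modules_from_loadmodule_lines(sample) == set(["ssl_module", "mod_ssl.c"])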

View File

@@ -0,0 +1,3 @@
LoadModule dav_module modules/mod_dav.so
LoadModule dav_fs_module modules/mod_dav_fs.so
LoadModule dav_lock_module modules/mod_dav_lock.so

View File

@@ -0,0 +1 @@
LoadModule lua_module modules/mod_lua.so

View File

@@ -0,0 +1,19 @@
# Select the MPM module which should be used by uncommenting exactly
# one of the following LoadModule lines:
# prefork MPM: Implements a non-threaded, pre-forking web server
# See: http://httpd.apache.org/docs/2.4/mod/prefork.html
LoadModule mpm_prefork_module modules/mod_mpm_prefork.so
# worker MPM: Multi-Processing Module implementing a hybrid
# multi-threaded multi-process web server
# See: http://httpd.apache.org/docs/2.4/mod/worker.html
#
#LoadModule mpm_worker_module modules/mod_mpm_worker.so
# event MPM: A variant of the worker MPM with the goal of consuming
# threads only for connections with active processing
# See: http://httpd.apache.org/docs/2.4/mod/event.html
#
#LoadModule mpm_event_module modules/mod_mpm_event.so

View File

@@ -0,0 +1,16 @@
# This file configures all the proxy modules:
LoadModule proxy_module modules/mod_proxy.so
LoadModule lbmethod_bybusyness_module modules/mod_lbmethod_bybusyness.so
LoadModule lbmethod_byrequests_module modules/mod_lbmethod_byrequests.so
LoadModule lbmethod_bytraffic_module modules/mod_lbmethod_bytraffic.so
LoadModule lbmethod_heartbeat_module modules/mod_lbmethod_heartbeat.so
LoadModule proxy_ajp_module modules/mod_proxy_ajp.so
LoadModule proxy_balancer_module modules/mod_proxy_balancer.so
LoadModule proxy_connect_module modules/mod_proxy_connect.so
LoadModule proxy_express_module modules/mod_proxy_express.so
LoadModule proxy_fcgi_module modules/mod_proxy_fcgi.so
LoadModule proxy_fdpass_module modules/mod_proxy_fdpass.so
LoadModule proxy_ftp_module modules/mod_proxy_ftp.so
LoadModule proxy_http_module modules/mod_proxy_http.so
LoadModule proxy_scgi_module modules/mod_proxy_scgi.so
LoadModule proxy_wstunnel_module modules/mod_proxy_wstunnel.so

View File

@@ -0,0 +1 @@
LoadModule ssl_module modules/mod_ssl.so

View File

@@ -0,0 +1,2 @@
# This file configures systemd module:
LoadModule systemd_module modules/mod_systemd.so

View File

@@ -0,0 +1,14 @@
# This configuration file loads a CGI module appropriate to the MPM
# which has been configured in 00-mpm.conf. mod_cgid should be used
# with a threaded MPM; mod_cgi with the prefork MPM.
<IfModule mpm_worker_module>
LoadModule cgid_module modules/mod_cgid.so
</IfModule>
<IfModule mpm_event_module>
LoadModule cgid_module modules/mod_cgid.so
</IfModule>
<IfModule mpm_prefork_module>
LoadModule cgi_module modules/mod_cgi.so
</IfModule>

View File

@@ -0,0 +1,353 @@
#
# This is the main Apache HTTP server configuration file. It contains the
# configuration directives that give the server its instructions.
# See <URL:http://httpd.apache.org/docs/2.4/> for detailed information.
# In particular, see
# <URL:http://httpd.apache.org/docs/2.4/mod/directives.html>
# for a discussion of each configuration directive.
#
# Do NOT simply read the instructions in here without understanding
# what they do. They're here only as hints or reminders. If you are unsure
# consult the online docs. You have been warned.
#
# Configuration and logfile names: If the filenames you specify for many
# of the server's control files begin with "/" (or "drive:/" for Win32), the
# server will use that explicit path. If the filenames do *not* begin
# with "/", the value of ServerRoot is prepended -- so 'log/access_log'
# with ServerRoot set to '/www' will be interpreted by the
# server as '/www/log/access_log', whereas '/log/access_log' will be
# interpreted as '/log/access_log'.
#
# ServerRoot: The top of the directory tree under which the server's
# configuration, error, and log files are kept.
#
# Do not add a slash at the end of the directory path. If you point
# ServerRoot at a non-local disk, be sure to specify a local disk on the
# Mutex directive, if file-based mutexes are used. If you wish to share the
# same ServerRoot for multiple httpd daemons, you will need to change at
# least PidFile.
#
ServerRoot "/etc/httpd"
#
# Listen: Allows you to bind Apache to specific IP addresses and/or
# ports, instead of the default. See also the <VirtualHost>
# directive.
#
# Change this to Listen on specific IP addresses as shown below to
# prevent Apache from glomming onto all bound IP addresses.
#
#Listen 12.34.56.78:80
Listen 80
#
# Dynamic Shared Object (DSO) Support
#
# To be able to use the functionality of a module which was built as a DSO you
# have to place corresponding `LoadModule' lines at this location so the
# directives contained in it are actually available _before_ they are used.
# Statically compiled modules (those listed by `httpd -l') do not need
# to be loaded here.
#
# Example:
# LoadModule foo_module modules/mod_foo.so
#
Include conf.modules.d/*.conf
#
# If you wish httpd to run as a different user or group, you must run
# httpd as root initially and it will switch.
#
# User/Group: The name (or #number) of the user/group to run httpd as.
# It is usually good practice to create a dedicated user and group for
# running httpd, as with most system services.
#
User apache
Group apache
# 'Main' server configuration
#
# The directives in this section set up the values used by the 'main'
# server, which responds to any requests that aren't handled by a
# <VirtualHost> definition. These values also provide defaults for
# any <VirtualHost> containers you may define later in the file.
#
# All of these directives may appear inside <VirtualHost> containers,
# in which case these default settings will be overridden for the
# virtual host being defined.
#
#
# ServerAdmin: Your address, where problems with the server should be
# e-mailed. This address appears on some server-generated pages, such
# as error documents. e.g. admin@your-domain.com
#
ServerAdmin root@localhost
#
# ServerName gives the name and port that the server uses to identify itself.
# This can often be determined automatically, but we recommend you specify
# it explicitly to prevent problems during startup.
#
# If your host doesn't have a registered DNS name, enter its IP address here.
#
#ServerName www.example.com:80
#
# Deny access to the entirety of your server's filesystem. You must
# explicitly permit access to web content directories in other
# <Directory> blocks below.
#
<Directory />
AllowOverride none
Require all denied
</Directory>
#
# Note that from this point forward you must specifically allow
# particular features to be enabled - so if something's not working as
# you might expect, make sure that you have specifically enabled it
# below.
#
#
# DocumentRoot: The directory out of which you will serve your
# documents. By default, all requests are taken from this directory, but
# symbolic links and aliases may be used to point to other locations.
#
DocumentRoot "/var/www/html"
#
# Relax access to content within /var/www.
#
<Directory "/var/www">
AllowOverride None
# Allow open access:
Require all granted
</Directory>
# Further relax access to the default document root:
<Directory "/var/www/html">
#
# Possible values for the Options directive are "None", "All",
# or any combination of:
# Indexes Includes FollowSymLinks SymLinksifOwnerMatch ExecCGI MultiViews
#
# Note that "MultiViews" must be named *explicitly* --- "Options All"
# doesn't give it to you.
#
# The Options directive is both complicated and important. Please see
# http://httpd.apache.org/docs/2.4/mod/core.html#options
# for more information.
#
Options Indexes FollowSymLinks
#
# AllowOverride controls what directives may be placed in .htaccess files.
# It can be "All", "None", or any combination of the keywords:
# Options FileInfo AuthConfig Limit
#
AllowOverride None
#
# Controls who can get stuff from this server.
#
Require all granted
</Directory>
#
# DirectoryIndex: sets the file that Apache will serve if a directory
# is requested.
#
<IfModule dir_module>
DirectoryIndex index.html
</IfModule>
#
# The following lines prevent .htaccess and .htpasswd files from being
# viewed by Web clients.
#
<Files ".ht*">
Require all denied
</Files>
#
# ErrorLog: The location of the error log file.
# If you do not specify an ErrorLog directive within a <VirtualHost>
# container, error messages relating to that virtual host will be
# logged here. If you *do* define an error logfile for a <VirtualHost>
# container, that host's errors will be logged there and not here.
#
ErrorLog "logs/error_log"
#
# LogLevel: Control the number of messages logged to the error_log.
# Possible values include: debug, info, notice, warn, error, crit,
# alert, emerg.
#
LogLevel warn
<IfModule log_config_module>
#
# The following directives define some format nicknames for use with
# a CustomLog directive (see below).
#
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined
LogFormat "%h %l %u %t \"%r\" %>s %b" common
<IfModule logio_module>
# You need to enable mod_logio.c to use %I and %O
LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" %I %O" combinedio
</IfModule>
#
# The location and format of the access logfile (Common Logfile Format).
# If you do not define any access logfiles within a <VirtualHost>
# container, they will be logged here. Contrariwise, if you *do*
# define per-<VirtualHost> access logfiles, transactions will be
# logged therein and *not* in this file.
#
#CustomLog "logs/access_log" common
#
# If you prefer a logfile with access, agent, and referer information
# (Combined Logfile Format) you can use the following directive.
#
CustomLog "logs/access_log" combined
</IfModule>
<IfModule alias_module>
#
# Redirect: Allows you to tell clients about documents that used to
# exist in your server's namespace, but do not anymore. The client
# will make a new request for the document at its new location.
# Example:
# Redirect permanent /foo http://www.example.com/bar
#
# Alias: Maps web paths into filesystem paths and is used to
# access content that does not live under the DocumentRoot.
# Example:
# Alias /webpath /full/filesystem/path
#
# If you include a trailing / on /webpath then the server will
# require it to be present in the URL. You will also likely
# need to provide a <Directory> section to allow access to
# the filesystem path.
#
# ScriptAlias: This controls which directories contain server scripts.
# ScriptAliases are essentially the same as Aliases, except that
# documents in the target directory are treated as applications and
# run by the server when requested rather than as documents sent to the
# client. The same rules about trailing "/" apply to ScriptAlias
# directives as to Alias.
#
ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"
</IfModule>
#
# "/var/www/cgi-bin" should be changed to whatever your ScriptAliased
# CGI directory exists, if you have that configured.
#
<Directory "/var/www/cgi-bin">
AllowOverride None
Options None
Require all granted
</Directory>
<IfModule mime_module>
#
# TypesConfig points to the file containing the list of mappings from
# filename extension to MIME-type.
#
TypesConfig /etc/mime.types
#
# AddType allows you to add to or override the MIME configuration
# file specified in TypesConfig for specific file types.
#
#AddType application/x-gzip .tgz
#
# AddEncoding allows you to have certain browsers uncompress
# information on the fly. Note: Not all browsers support this.
#
#AddEncoding x-compress .Z
#AddEncoding x-gzip .gz .tgz
#
# If the AddEncoding directives above are commented-out, then you
# probably should define those extensions to indicate media types:
#
AddType application/x-compress .Z
AddType application/x-gzip .gz .tgz
#
# AddHandler allows you to map certain file extensions to "handlers":
# actions unrelated to filetype. These can be either built into the server
# or added with the Action directive (see below)
#
# To use CGI scripts outside of ScriptAliased directories:
# (You will also need to add "ExecCGI" to the "Options" directive.)
#
#AddHandler cgi-script .cgi
# For type maps (negotiated resources):
#AddHandler type-map var
#
# Filters allow you to process content before it is sent to the client.
#
# To parse .shtml files for server-side includes (SSI):
# (You will also need to add "Includes" to the "Options" directive.)
#
AddType text/html .shtml
AddOutputFilter INCLUDES .shtml
</IfModule>
#
# Specify a default charset for all content served; this enables
# interpretation of all content as UTF-8 by default. To use the
# default browser choice (ISO-8859-1), or to allow the META tags
# in HTML content to override this choice, comment out this
# directive:
#
AddDefaultCharset UTF-8
<IfModule mime_magic_module>
#
# The mod_mime_magic module allows the server to use various hints from the
# contents of the file itself to determine its type. The MIMEMagicFile
# directive tells the module where the hint definitions are located.
#
MIMEMagicFile conf/magic
</IfModule>
#
# Customizable error responses come in three flavors:
# 1) plain text 2) local redirects 3) external redirects
#
# Some examples:
#ErrorDocument 500 "The server made a boo boo."
#ErrorDocument 404 /missing.html
#ErrorDocument 404 "/cgi-bin/missing_handler.pl"
#ErrorDocument 402 http://www.example.com/subscription_info.html
#
#
# EnableMMAP and EnableSendfile: On systems that support it,
# memory-mapping or the sendfile syscall may be used to deliver
# files. This usually improves server performance, but must
# be turned off when serving from network-mounted
# filesystems or if support for these functions is otherwise
# broken on your system.
# Defaults if commented: EnableMMAP On, EnableSendfile Off
#
#EnableMMAP off
EnableSendfile on
# Supplemental configuration
#
# Load config files in the "/etc/httpd/conf.d" directory, if any.
IncludeOptional conf.d/*.conf
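
The note near the top of this file about relative paths being prepended with ServerRoot is the same rule that resolves includes such as conf.modules.d/*.conf and conf.d/*.conf. A tiny standalone sketch of that resolution (not certbot's code):

    import os

    def resolve_server_path(server_root, value):
        """Absolute paths are used as-is; relative ones hang off ServerRoot."""
        if os.path.isabs(value):
            return value
        return os.path.join(server_root, value)

    assert resolve_server_path("/etc/httpd", "logs/error_log") == "/etc/httpd/logs/error_log"
    assert resolve_server_path("/etc/httpd", "/var/log/httpd/error_log") == "/var/log/httpd/error_log"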

View File

@@ -0,0 +1,385 @@
# Magic data for mod_mime_magic Apache module (originally for file(1) command)
# The module is described in /manual/mod/mod_mime_magic.html
#
# The format is 4-5 columns:
# Column #1: byte number to begin checking from, ">" indicates continuation
# Column #2: type of data to match
# Column #3: contents of data to match
# Column #4: MIME type of result
# Column #5: MIME encoding of result (optional)
#------------------------------------------------------------------------------
# Localstuff: file(1) magic for locally observed files
# Add any locally observed files here.
#------------------------------------------------------------------------------
# end local stuff
#------------------------------------------------------------------------------
#------------------------------------------------------------------------------
# Java
0 short 0xcafe
>2 short 0xbabe application/java
#------------------------------------------------------------------------------
# audio: file(1) magic for sound formats
#
# from Jan Nicolai Langfeldt <janl@ifi.uio.no>,
#
# Sun/NeXT audio data
0 string .snd
>12 belong 1 audio/basic
>12 belong 2 audio/basic
>12 belong 3 audio/basic
>12 belong 4 audio/basic
>12 belong 5 audio/basic
>12 belong 6 audio/basic
>12 belong 7 audio/basic
>12 belong 23 audio/x-adpcm
# DEC systems (e.g. DECstation 5000) use a variant of the Sun/NeXT format
# that uses little-endian encoding and has a different magic number
# (0x0064732E in little-endian encoding).
0 lelong 0x0064732E
>12 lelong 1 audio/x-dec-basic
>12 lelong 2 audio/x-dec-basic
>12 lelong 3 audio/x-dec-basic
>12 lelong 4 audio/x-dec-basic
>12 lelong 5 audio/x-dec-basic
>12 lelong 6 audio/x-dec-basic
>12 lelong 7 audio/x-dec-basic
# compressed (G.721 ADPCM)
>12 lelong 23 audio/x-dec-adpcm
# Bytes 0-3 of AIFF, AIFF-C, & 8SVX audio files are "FORM"
# AIFF audio data
8 string AIFF audio/x-aiff
# AIFF-C audio data
8 string AIFC audio/x-aiff
# IFF/8SVX audio data
8 string 8SVX audio/x-aiff
# Creative Labs AUDIO stuff
# Standard MIDI data
0 string MThd audio/unknown
#>9 byte >0 (format %d)
#>11 byte >1 using %d channels
# Creative Music (CMF) data
0 string CTMF audio/unknown
# SoundBlaster instrument data
0 string SBI audio/unknown
# Creative Labs voice data
0 string Creative\ Voice\ File audio/unknown
## is this next line right? it came this way...
#>19 byte 0x1A
#>23 byte >0 - version %d
#>22 byte >0 \b.%d
# [GRR 950115: is this also Creative Labs? Guessing that first line
# should be string instead of unknown-endian long...]
#0 long 0x4e54524b MultiTrack sound data
#0 string NTRK MultiTrack sound data
#>4 long x - version %ld
# Microsoft WAVE format (*.wav)
# [GRR 950115: probably all of the shorts and longs should be leshort/lelong]
# Microsoft RIFF
0 string RIFF audio/unknown
# - WAVE format
>8 string WAVE audio/x-wav
# MPEG audio.
0 beshort&0xfff0 0xfff0 audio/mpeg
# C64 SID Music files, from Linus Walleij <triad@df.lth.se>
0 string PSID audio/prs.sid
#------------------------------------------------------------------------------
# c-lang: file(1) magic for C programs or various scripts
#
# XPM icons (Greg Roelofs, newt@uchicago.edu)
# ideally should go into "images", but entries below would tag XPM as C source
0 string /*\ XPM image/x-xbm 7bit
# this first will upset you if you're a PL/1 shop... (are there any left?)
# in which case rm it; ascmagic will catch real C programs
# C or REXX program text
0 string /* text/plain
# C++ program text
0 string // text/plain
#------------------------------------------------------------------------------
# compress: file(1) magic for pure-compression formats (no archives)
#
# compress, gzip, pack, compact, huf, squeeze, crunch, freeze, yabba, whap, etc.
#
# Formats for various forms of compressed data
# Formats for "compress" proper have been moved into "compress.c",
# because it tries to uncompress it to figure out what's inside.
# standard unix compress
0 string \037\235 application/octet-stream x-compress
# gzip (GNU zip, not to be confused with [Info-ZIP/PKWARE] zip archiver)
0 string \037\213 application/octet-stream x-gzip
# According to gzip.h, this is the correct byte order for packed data.
0 string \037\036 application/octet-stream
#
# This magic number is byte-order-independent.
#
0 short 017437 application/octet-stream
# XXX - why *two* entries for "compacted data", one of which is
# byte-order independent, and one of which is byte-order dependent?
#
# compacted data
0 short 0x1fff application/octet-stream
0 string \377\037 application/octet-stream
# huf output
0 short 0145405 application/octet-stream
# Squeeze and Crunch...
# These numbers were gleaned from the Unix versions of the programs to
# handle these formats. Note that I can only uncrunch, not crunch, and
# I didn't have a crunched file handy, so the crunch number is untested.
# Keith Waclena <keith@cerberus.uchicago.edu>
#0 leshort 0x76FF squeezed data (CP/M, DOS)
#0 leshort 0x76FE crunched data (CP/M, DOS)
# Freeze
#0 string \037\237 Frozen file 2.1
#0 string \037\236 Frozen file 1.0 (or gzip 0.5)
# lzh?
#0 string \037\240 LZH compressed data
#------------------------------------------------------------------------------
# frame: file(1) magic for FrameMaker files
#
# This stuff came on a FrameMaker demo tape, most of which is
# copyright, but this file is "published" as witness the following:
#
0 string \<MakerFile application/x-frame
0 string \<MIFFile application/x-frame
0 string \<MakerDictionary application/x-frame
0 string \<MakerScreenFon application/x-frame
0 string \<MML application/x-frame
0 string \<Book application/x-frame
0 string \<Maker application/x-frame
#------------------------------------------------------------------------------
# html: file(1) magic for HTML (HyperText Markup Language) docs
#
# from Daniel Quinlan <quinlan@yggdrasil.com>
# and Anna Shergold <anna@inext.co.uk>
#
0 string \<!DOCTYPE\ HTML text/html
0 string \<!doctype\ html text/html
0 string \<HEAD text/html
0 string \<head text/html
0 string \<TITLE text/html
0 string \<title text/html
0 string \<html text/html
0 string \<HTML text/html
0 string \<!-- text/html
0 string \<h1 text/html
0 string \<H1 text/html
# XML eXtensible Markup Language, from Linus Walleij <triad@df.lth.se>
0 string \<?xml text/xml
#------------------------------------------------------------------------------
# images: file(1) magic for image formats (see also "c-lang" for XPM bitmaps)
#
# originally from jef@helios.ee.lbl.gov (Jef Poskanzer),
# additions by janl@ifi.uio.no as well as others. Jan also suggested
# merging several one- and two-line files into here.
#
# XXX - byte order for GIF and TIFF fields?
# [GRR: TIFF allows both byte orders; GIF is probably little-endian]
#
# [GRR: what the hell is this doing in here?]
#0 string xbtoa btoa'd file
# PBMPLUS
# PBM file
0 string P1 image/x-portable-bitmap 7bit
# PGM file
0 string P2 image/x-portable-greymap 7bit
# PPM file
0 string P3 image/x-portable-pixmap 7bit
# PBM "rawbits" file
0 string P4 image/x-portable-bitmap
# PGM "rawbits" file
0 string P5 image/x-portable-greymap
# PPM "rawbits" file
0 string P6 image/x-portable-pixmap
# NIFF (Navy Interchange File Format, a modification of TIFF)
# [GRR: this *must* go before TIFF]
0 string IIN1 image/x-niff
# TIFF and friends
# TIFF file, big-endian
0 string MM image/tiff
# TIFF file, little-endian
0 string II image/tiff
# possible GIF replacements; none yet released!
# (Greg Roelofs, newt@uchicago.edu)
#
# GRR 950115: this was mine ("Zip GIF"):
# ZIF image (GIF+deflate alpha)
0 string GIF94z image/unknown
#
# GRR 950115: this is Jeremy Wohl's Free Graphics Format (better):
# FGF image (GIF+deflate beta)
0 string FGF95a image/unknown
#
# GRR 950115: this is Thomas Boutell's Portable Bitmap Format proposal
# (best; not yet implemented):
# PBF image (deflate compression)
0 string PBF image/unknown
# GIF
0 string GIF image/gif
# JPEG images
0 beshort 0xffd8 image/jpeg
# PC bitmaps (OS/2, Windoze BMP files) (Greg Roelofs, newt@uchicago.edu)
0 string BM image/bmp
#>14 byte 12 (OS/2 1.x format)
#>14 byte 64 (OS/2 2.x format)
#>14 byte 40 (Windows 3.x format)
#0 string IC icon
#0 string PI pointer
#0 string CI color icon
#0 string CP color pointer
#0 string BA bitmap array
0 string \x89PNG image/png
0 string FWS application/x-shockwave-flash
0 string CWS application/x-shockwave-flash
#------------------------------------------------------------------------------
# lisp: file(1) magic for lisp programs
#
# various lisp types, from Daniel Quinlan (quinlan@yggdrasil.com)
0 string ;; text/plain 8bit
# Emacs 18 - this is always correct, but not very magical.
0 string \012( application/x-elc
# Emacs 19
0 string ;ELC\023\000\000\000 application/x-elc
#------------------------------------------------------------------------------
# mail.news: file(1) magic for mail and news
#
# There are tests in ascmagic.c to cope with mail and news.
0 string Relay-Version: message/rfc822 7bit
0 string #!\ rnews message/rfc822 7bit
0 string N#!\ rnews message/rfc822 7bit
0 string Forward\ to message/rfc822 7bit
0 string Pipe\ to message/rfc822 7bit
0 string Return-Path: message/rfc822 7bit
0 string Path: message/news 8bit
0 string Xref: message/news 8bit
0 string From: message/rfc822 7bit
0 string Article message/news 8bit
#------------------------------------------------------------------------------
# msword: file(1) magic for MS Word files
#
# Contributor claims:
# Reverse-engineered MS Word magic numbers
#
0 string \376\067\0\043 application/msword
0 string \333\245-\0\0\0 application/msword
# disable this one because it applies also to other
# Office/OLE documents for which msword is not correct. See PR#2608.
#0 string \320\317\021\340\241\261 application/msword
#------------------------------------------------------------------------------
# printer: file(1) magic for printer-formatted files
#
# PostScript
0 string %! application/postscript
0 string \004%! application/postscript
# Acrobat
# (due to clamen@cs.cmu.edu)
0 string %PDF- application/pdf
#------------------------------------------------------------------------------
# sc: file(1) magic for "sc" spreadsheet
#
38 string Spreadsheet application/x-sc
#------------------------------------------------------------------------------
# tex: file(1) magic for TeX files
#
# XXX - needs byte-endian stuff (big-endian and little-endian DVI?)
#
# From <conklin@talisman.kaleida.com>
# Although we may know the offset of certain text fields in TeX DVI
# and font files, we can't use them reliably because they are not
# zero terminated. [but we do anyway, christos]
0 string \367\002 application/x-dvi
#0 string \367\203 TeX generic font data
#0 string \367\131 TeX packed font data
#0 string \367\312 TeX virtual font data
#0 string This\ is\ TeX, TeX transcript text
#0 string This\ is\ METAFONT, METAFONT transcript text
# There is no way to detect TeX Font Metric (*.tfm) files without
# breaking them apart and reading the data. The following patterns
# match most *.tfm files generated by METAFONT or afm2tfm.
#2 string \000\021 TeX font metric data
#2 string \000\022 TeX font metric data
#>34 string >\0 (%s)
# Texinfo and GNU Info, from Daniel Quinlan (quinlan@yggdrasil.com)
#0 string \\input\ texinfo Texinfo source text
#0 string This\ is\ Info\ file GNU Info text
# correct TeX magic for Linux (and maybe more)
# from Peter Tobias (tobias@server.et-inf.fho-emden.de)
#
0 leshort 0x02f7 application/x-dvi
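On the byte-endian question above, a short hedged sketch (the sample bytes are just the DVI preamble; nothing beyond that is assumed): the string \367\002 rule and the leshort 0x02f7 rule describe the same on-disk byte pair, once as literal bytes and once as a little-endian 16-bit value.

import struct

data = bytes([0xF7, 0x02])  # first two bytes of a DVI file (octal \367 \002)
assert data.startswith(b"\xf7\x02")                    # the "string \367\002" form
assert struct.unpack_from("<H", data, 0)[0] == 0x02F7  # the "leshort 0x02f7" form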
# RTF - Rich Text Format
0 string {\\rtf application/rtf
#------------------------------------------------------------------------------
# animation: file(1) magic for animation/movie formats
#
# animation formats, originally from vax@ccwf.cc.utexas.edu (VaX#n8)
# MPEG file
0 string \000\000\001\263 video/mpeg
#
# The contributor claims:
# I couldn't find a real magic number for these, however, this
# -appears- to work. Note that it might catch other files, too,
# so BE CAREFUL!
#
# Note that title and author appear in the two 20-byte chunks
# at decimal offsets 2 and 22, respectively, but they are XOR'ed with
# 255 (hex FF)! DL format SUCKS BIG ROCKS.
#
# DL file version 1, medium format (160x100, 4 images/screen)
0 byte 1 video/unknown
0 byte 2 video/unknown
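The field layout described above lends itself to a short Python sketch; the function name is hypothetical and it simply assumes the comment is accurate: title and author live in two 20-byte fields at offsets 2 and 22, each byte XOR'ed with 0xFF.

# Hypothetical decoder based solely on the comment above; not part of any real tool.
def dl_title_author(data):
    def decode(chunk):
        plain = bytes(b ^ 0xFF for b in chunk)
        return plain.rstrip(b"\x00").decode("ascii", "replace")  # NUL padding is assumed
    return decode(data[2:22]), decode(data[22:42])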
# Quicktime video, from Linus Walleij <triad@df.lth.se>
# from Apple quicktime file format documentation.
4 string moov video/quicktime
4 string mdat video/quicktime


@@ -0,0 +1 @@
conf.d/centos.example.com.conf, centos.example.com
