Compare commits

..

60 Commits

Author SHA1 Message Date
m0namon
f9749f64d8 temp testfixes 2020-01-21 17:17:16 -08:00
sydneyli
c0633ec3ac Implement ApacheBlockNode.delete_child 2020-01-21 16:20:57 -08:00
sydneyli
c9edb49267 Implement ApacheBlockNode.add_child_* 2020-01-21 16:20:32 -08:00
sydneyli
9cc15da90f Implement ApacheBlockNode constructor 2020-01-21 16:19:58 -08:00
m0namon
69bb92d698 Bump apacheconfig dependency to latest version and install dev version of apache for coverage tests 2020-01-21 16:02:17 -08:00
m0namon
bafa9e2eb0 Load apacheconfig dependency, gate behind flag 2020-01-15 10:29:36 -08:00
sydneyli
0e78436b05 [Apache v2] Add apacheconfig as a dependency (#7643)
* Add apacheconfig as a dependency.

* Change apacheconfig to a dev dependency

* Bump apacheconfig dep to 0.3.1
2020-01-07 09:57:43 -08:00
ohemorange
9b5b27597c Merge pull request #7618 from certbot/ap2_merge_master
[Apache v2] Merge changes from master to apache-parser-v2
2020-01-06 17:54:29 -08:00
Joona Hoikkala
3b065238b3 Modifications needed for merging to master 2020-01-06 17:19:33 +02:00
Joona Hoikkala
0f5bda4ff9 Merge remote-tracking branch 'origin/master' into ap2_merge_master 2020-01-06 17:17:31 +02:00
Joona Hoikkala
70be256c66 Fix gating to ensure that no parsernode functionality is run unless explicitly requested (#7654) 2020-01-02 15:44:11 -08:00
Adrien Ferrand
fda655370a Update CHANGELOG.md (#7659) 2020-01-02 23:44:16 +01:00
Adrien Ferrand
887d72fd5d Remove POST-as-GET fallback to GET (#6994) 2020-01-02 12:48:55 -08:00
Brad Warren
6d527bcc42 Include header files for compilation. (#7650) 2019-12-19 14:02:24 -08:00
Barbz
6ca80b7ce8 How to uninstall certbot-auto (#7648) 2019-12-19 13:30:13 -08:00
Joona Hoikkala
4401eacaac Implement get_virtual_hosts() for ParserNode interfaces (#7564) 2019-12-19 10:51:41 +02:00
Brad Warren
f520d482fd Remove other 3.8-dev references. (#7646) 2019-12-18 23:00:49 +01:00
Adrien Ferrand
b5a31bec03 Add docker-compose as a requirement of certbot-ci (#7120)
Fixes #7110 

This PR declares docker-compose as a requirement for certbot-ci. This way, a recent version of docker-compose is installed in the standard virtual environment set up by `tools/venv.py` and `tools/venv3.py`, and so is available to the pytest integration tests run from `tox` or from the activated virtual environment (a minimal sketch of such a dependency declaration follows this entry).

* Add docker-compose as a dev dependency and declare it in the certbot-ci requirements

* Update docker-compose to 1.25.0
2019-12-18 13:21:54 -08:00
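For reference, a minimal sketch of what declaring such a requirement looks like for a setuptools-based package. The package name, version, and the extra requirement below are placeholders rather than certbot-ci's actual metadata; only the docker-compose entry comes from the commit above.

```
# Hypothetical setup.py sketch: listing docker-compose as an install
# requirement so it is pulled into the virtual environments built by the
# tools/venv.py helpers.
from setuptools import find_packages
from setuptools import setup

install_requires = [
    "docker-compose",  # from the commit above
    "pytest",          # placeholder for the rest of the test tooling
]

setup(
    name="example-ci",        # placeholder name, not certbot-ci
    version="0.0.1.dev0",
    packages=find_packages(),
    install_requires=install_requires,
)
```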
Brad Warren
6ac7aabaf7 Remove warning about dev preview (#7640) 2019-12-18 11:14:58 -08:00
Brad Warren
24fdea5fd8 discourage dns plugins (#7639) 2019-12-18 11:13:57 -08:00
Adrien Ferrand
4a906484ee Execute Windows installer integration tests on several Windows versions (#7641)
This PR extends the installer tests on Azure Pipelines in order to run the integration tests against a Certbot instance installed with the Windows installer on several Windows versions, matching the range of versions supported by Certbot:
* Windows Server 2012 R2
* Windows Server 2016
* Windows Server 2019

One can see the result on: https://dev.azure.com/adferrand/certbot/_build/results?buildId=311

* Try specific installer-build step

* Install Python manually

* Add tests on windows 2019
2019-12-16 16:03:39 -08:00
Adrien Ferrand
9e5bca4bbf Lint certbot code on Python 3, and update Pylint to the latest version (#7551)
Part of #7550

This PR makes appropriate corrections to run pylint on Python 3.

Why not keep the dependencies unchanged and just run pylint on Python 3?
Because the old version of pylint breaks horribly on Python 3 due to an unsupported version of astroid.

Why update pylint + astroid to the latest version?
Because this version fixes some internal errors occurring while linting the Certbot code, and is also ready to run gracefully on Python 3.8.

Why upgrade mypy?
Because the old version does not support the new version of astroid required to run pylint correctly.

Why not upgrade mypy to its latest version?
Because the latest version includes a new typeshed version that adds a lot of new type definitions and brings dozens of new errors to the Certbot codebase. I would like to fix that in a future PR.

That said, the work has been to find the correct set of new dependency versions, configure pylint sensibly for our situation, disable irrelevant lint errors, and fix (or ignore, for good reason) the remaining mypy errors (a short sketch of the per-line suppression patterns follows this entry).

I also made PyLint and MyPy checks run correctly on Windows.

* Start configuration

* Reconfigure travis

* Suspend a check specific to python 3. Start fixing code.

* Repair call_args

* Fix return + elif lints

* Reconfigure development to run mainly on python3

* Remove incompatible Python 3.4 jobs

* Suspend pylint in some assertions

* Remove pylint in dev

* Take first mypy that supports typed-ast>=1.4.0 to limit the migration path

* Various return + else lint errors

* Find a set of deps that is working with current mypy version

* Update local oldest requirements

* Remove all current pylint errors

* Rebuild letsencrypt-auto

* Update mypy to fix pylint with new astroid version, and fix mypy issues

* Explain type: ignore

* Reconfigure tox, fix none path

* Simplify pinning

* Remove useless directive

* Remove debugging code

* Remove continue

* Update requirements

* Disable unsubscriptable-object check

* Disable one check, enabling two more

* Plug certbot dev version for oldest requirements

* Remove useless disable directives

* Remove useless no-member disable

* Remove no-else-* checks. Use elif in symmetric branches.

* Add back assertion

* Add new line

* Remove unused pylint disable

* Remove other pylint disable
2019-12-10 14:12:50 -08:00
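The commit above prefers per-line suppressions over blanket configuration changes wherever a warning is a false positive. Below is a minimal sketch of the two patterns with invented names; the real occurrences are visible in the file diffs further down (for example the `type: ignore` added to the nonce error constructors).

```
# Sketch only: the function and class below are invented for illustration.

def get_nonce(response):
    """Return the Replay-Nonce header from an HTTP response object."""
    # Silence a single pylint false positive on this line only:
    return response.headers["Replay-Nonce"]  # pylint: disable=no-member


class FakeError(Exception):
    """Stand-in error type."""

    def __init__(self, *args, **kwargs):
        # Work around a typeshed signature that the pinned mypy flags:
        super(FakeError, self).__init__(*args, **kwargs)  # type: ignore
```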
Joona Hoikkala
5c588a6f8d [Apache v2] Implement parsed_files (#7562)
* Implement parsed_files

* Add parsed_files stub to ApacheParserNodes and fix assertions

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Add more descriptive comments

* Update certbot-apache/certbot_apache/augeasparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot-apache/certbot_apache/dualparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>
2019-12-10 13:20:00 -05:00
Adrien Ferrand
e048da1e38 Reorganize imports (#7616)
* Isort execution

* Fix pylint, adapt coverage

* New isort

* Fix magic_typing lint

* Second round

* Fix pylint

* Third round. Store isort configuration

* Fix latest mistakes

* Other fixes

* Add newline

* Fix lint errors
2019-12-09 15:50:20 -05:00
Brad Warren
34b568f366 Don't list adding type annotations as a PR req. (#7627) 2019-12-04 20:22:10 +01:00
ohemorange
b99bfe8ab7 Merge pull request #7622 from certbot/candidate-1.0.0
Release 1.0
2019-12-04 14:15:49 -05:00
Brad Warren
5da61564d9 Don't list DNS plugins as alpha quality. (#7624)
They should be considered production quality like our other packaged code.
2019-12-03 19:56:16 -08:00
Brad Warren
b45f79d0ab fix bad links in docs (#7623)
This PR fixes the failures at https://travis-ci.com/certbot/website/builds/139193502#L1316.

Once this PR lands, I'll update certbot/website#508 to include this commit.
2019-12-03 11:05:23 -08:00
Brad Warren
3cfa63483d Add full API documentation (#7614)
A lot of Certbot's files don't have API documentation, which this PR fixes. To do this, from the top-level certbot directory I ran:
```
sphinx-apidoc -Me -o docs/api certbot
```
I then merged the resulting `modules.rst` file with `docs/api.rst`.
2019-12-03 09:54:37 -08:00
Brad Warren
27d6f62a96 update external plugin (#7604)
The old plugin at https://github.com/marcan/certbot-external says it's obsolete and points people to https://github.com/EnigmaBridge/certbot-external-auth. The new plugin is also an installer.

I also removed the reference to #2782 about us adding similar functionality, since that's been done for a long time. We could reference our manual plugin instead, but I think that devalues their plugin a bit, which I don't think is necessary or correct, as it has different features.
2019-12-03 09:52:05 -08:00
Brad Warren
e32033f1ec document main (#7610)
I deleted the exceptions because I think it's not feasible to document the possible exceptions raised by all of Certbot.
2019-12-03 09:51:43 -08:00
Brad Warren
d2bad803f3 Bump version to 1.1.0 2019-12-03 09:27:30 -08:00
Brad Warren
5debf7af7e Add contents to certbot/CHANGELOG.md for next version 2019-12-03 09:27:30 -08:00
Joona Hoikkala
6148e5c355 [Apache v2] Move the apachectl parsing to apache_util (#7569)
* Move the Apache CLI parsing to apache_util

* Fix test mocks

* Address review comments

* Fix the parsernode metadata dictionary
2019-12-02 18:00:53 -05:00
Joona Hoikkala
06fdbf2a55 [Apache v2] Implement find_ancestors (#7561)
* Implement find_ancestors

* Create the node properly and add assertions

* Update certbot-apache/certbot_apache/augeasparser.py

Co-Authored-By: ohemorange <ebportnoy@gmail.com>

* Remove comment
2019-12-02 16:25:39 +02:00
Joona Hoikkala
ac1a60ff0b Implement add_child_comment (#7518) 2019-11-18 21:21:51 +02:00
sydneyli
b70f9c4744 [Apache v2] Initial ApacheParser skeleton (#7559)
* Fix metadata & primary references in Augeas tests.

When performing actions on only one of the trees in DualNodeParser, the two
trees get out of sync. Similarly, we can't expect the metadata of the two
trees to remain the same.

Did a pass over the tests to re-wire metadata and primary usage (a sketch of the dual-parser idea follows this entry).

* Add ApacheParser skeleton.

Fix plumbing in configurator & dualparser to initialize ApacheParser
alongside AugeasParser.

* Silence coverage reports for now
2019-11-15 12:22:18 +02:00
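A self-contained sketch of the dual-parser idea referenced above, with invented names: every operation is run against both implementations and the results are compared, which is why acting on only one tree lets them drift out of sync.

```
# Illustrative only; not Certbot's actual DualNode implementation.
class SketchDualNode(object):
    def __init__(self, primary, secondary):
        self.primary = primary      # e.g. the Augeas-backed node
        self.secondary = secondary  # e.g. the apacheconfig-backed node

    def find_directives(self, name):
        first = self.primary.find_directives(name)
        second = self.secondary.find_directives(name)
        # If only one tree were modified, this is the check that would break.
        assert len(first) == len(second)
        return [SketchDualNode(f, s) for f, s in zip(first, second)]
```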
Joona Hoikkala
517ff5cb19 [Apache v2] Implement delete_child() (#7521)
* Implement delete_child

* Fix linter
2019-11-12 14:19:35 -08:00
Joona Hoikkala
d14eec9ecf [Apache v2] Implement save() and unsaved_files() (#7520)
* Implement save() and unsaved_files()

* Linter fix
2019-11-12 14:19:21 -08:00
Joona Hoikkala
bdf24d2bed Implement add_child_directive (#7517) 2019-11-12 14:19:02 -08:00
Joona Hoikkala
578ca1c6af [Apache v2] Adding nodes 1/3 : add_child_block() (#7497)
* Implement add_child_block()

* Add comments and example

* Check augeas path inconsistencies in initialization
2019-11-11 11:33:14 -08:00
Joona Hoikkala
19de05c72f [Apache v2] Implement set_parameters() (#7461)
* find_comments implementation and AugeasCommentNode creation

* set_parameters implementation

* Change parameters to a property

* Remove parameters property setter

* More pythonic iteration handling
2019-11-06 10:33:24 -08:00
Joona Hoikkala
d645574839 [Apache v2] AugeasBlockNode find_comments() implementation (#7457)
* find_comments implementation and AugeasCommentNode creation

* Use dummy value for ancestor

* Add NotImplementedError when calling find_comments with exact parameter

* Remove parameter 'exact' from find_comments interface

* Fix comment
2019-10-30 11:03:23 -07:00
Joona Hoikkala
9b2322a573 Use dummy values for ancestor (#7462) 2019-10-25 14:54:28 -07:00
Joona Hoikkala
79caaa8e6f Merge pull request #7466 from certbot/update-apache-parser-v2-v2
Update the apache-parser-v2 branch
2019-10-25 18:56:47 +03:00
Brad Warren
8620dcf06f Merge branch 'master' into apache-parser-v2 2019-10-24 09:59:53 -07:00
Joona Hoikkala
3f36298716 [Apache v2] find_blocks and find_directives implementation (#7443)
* Implement AugeasDirectiveNode and AugeasBlockNode find and create functions

* Add tests for _aug_get_block_name
2019-10-21 12:52:00 -07:00
Adrien Ferrand
63c7dd109c Merge pull request #7435 from certbot/add-azure-pipelines
Add azure pipelines
2019-10-10 19:38:32 +02:00
Brad Warren
8f6fc67378 Merge branch 'master' into add-azure-pipelines 2019-10-08 17:05:12 -07:00
Joona Hoikkala
a156d37ee1 Merge pull request #7400 from certbot/update-apache-parser-v2
Update apache parser v2
2019-09-26 10:49:36 +03:00
Brad Warren
1756ef8620 Merge branch 'master' into update-apache-parser-v2 2019-09-25 11:55:33 -07:00
Joona Hoikkala
feacbe9671 [Apache v2] DualParserNode implementation 3/3 (#7376)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Add DualBlockNode

* DualBlockNode find_ methods

* Address review comments

* Address review comments

* Simplify isPass

* Add explanation to _create_matching_list pydoc

* Remove unnecessary conditional block

* Address review comments
2019-09-23 16:27:48 -04:00
Joona Hoikkala
c224340330 [Apache v2] DualParserNode implementation 2/3 (#7375)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Add DualBlockNode

* Address review comments

* Address review comments

* Call the right assertion after name change

* Simplify isPass

* Add explanation to _create_matching_list pydoc

* Break when match was found
2019-09-19 17:44:50 -04:00
Joona Hoikkala
23fb6d2877 [Apache v2] DualParserNode implementation 1/3 (#7374)
* DualParserNode, DualCommentNode and DualDirectiveNode implementations

* Address review comments

* Address review comments

* Simplify isPass
2019-09-18 16:31:44 -04:00
Joona Hoikkala
9620cc75d4 [Apache v2] Allow initialization of ParserNode instances using metadata dictionary instead of required arguments (#7366)
Add a metadata keyword argument to the ParserNode interface, allowing the object to be initialized from the contents of the metadata, if the implementation allows it. As an example, the Augeas implementation needs nothing more than the Augeas DOM path of a configuration directive to populate the ParserNode instance with all data relevant to the DirectiveNode.

The checks also allow skipping the otherwise required keyword arguments if metadata is provided (a minimal sketch of this pattern follows this entry).

* Allow creating ParserNode instances using information from metadata dictionary

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Address review comments

* Fix filepath comment

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-09-09 12:09:09 -07:00
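A self-contained sketch of the initialization pattern described in this commit. The class name, the metadata key, and the AST attributes below are invented for illustration; they are not Certbot's actual API.

```
# Sketch: a node that accepts either the usual explicit keyword arguments or
# a metadata dictionary from which it can populate itself.
class SketchDirectiveNode(object):
    def __init__(self, name=None, parameters=(), ancestor=None,
                 filepath=None, metadata=None):
        metadata = metadata or {}
        if "ast_node" in metadata:
            # Everything the node needs can be recovered from the parser's
            # own representation, so the otherwise required kwargs may be
            # omitted by the caller.
            ast = metadata["ast_node"]
            name = ast.name
            parameters = tuple(ast.arguments)
            filepath = ast.filepath
        self.name = name
        self.parameters = parameters
        self.ancestor = ancestor
        self.filepath = filepath
        self.metadata = metadata
```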
Joona Hoikkala
af1c66b28f [Apache v2] Modifications to ParserNode interfaces (#7330)
This PR contains the changes requested in the initial pre-review comments of #7308:

Move properties to class pydocs in interfaces.py
Prefer class ABC register() functionality over class inheritance for the interface classes (see the sketch after this entry)
Add Apache implementation-specific functions to the interfaces

* Move class argument definitions to class pydoc

* Add apache specific functionality to the interface

* Bring inheritance back

* Define initialization for different ParserNode classes

* Add parsernode utils to check keyword arguments and document the defaults in pydoc

* Fix pydocs and make BlockNode a child of DirectiveNode

* Refine docs, and remove unused __init__ from BlockNode

* Split parsernode util tests to their own respective file

* Skip cover for dummy calls to super

* Add types to method documentation

* Add documentation for children
2019-08-30 13:42:18 -07:00
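A minimal, self-contained sketch of the abc `register()` pattern mentioned above (virtual subclassing); the class names here are illustrative. The apacheconfig node classes later in this diff are hooked up the same way via `interfaces.CommentNode.register(...)` and friends.

```
import abc


class CommentNode(abc.ABC):
    """Interface class for comment nodes (illustrative)."""

    @abc.abstractmethod
    def save(self, msg):
        """Persist pending changes."""


class MyCommentNode(object):
    """An implementation that does not inherit from the interface class."""

    def save(self, msg):
        pass


# register() makes MyCommentNode a virtual subclass of the interface, so
# isinstance()/issubclass() checks pass without class inheritance.
CommentNode.register(MyCommentNode)

assert issubclass(MyCommentNode, CommentNode)
assert isinstance(MyCommentNode(), CommentNode)
```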
Joona Hoikkala
270754deff [Apache v2] New ParserNode interface abstraction (#7246)
* New ParserNode interface abstraction

* Add the test assertions, and fix interface

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Add dummy tests and change arguments to properties

* Add more context to comment property docstring

* Add documentation to the main docstring

* Streamline the parameter naming

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Add explicit instructions how whitespaces are treated in set_parameters

* Add the information about lookups being case insensitive.

* Add context about whitespacing to add_ - methods

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>

* Update certbot-apache/certbot_apache/interfaces.py

Co-Authored-By: Brad Warren <bmw@users.noreply.github.com>
2019-08-13 09:22:51 -07:00
Joona Hoikkala
a83f9eb4e4 Merge pull request #7321 from certbot/fix-apache-parser-v2-cover
Fix apache-parser-v2 tests
2019-08-13 15:28:24 +03:00
Brad Warren
fed2264dac Merge branch 'master' into fix-apache-parser-v2-cover 2019-08-09 11:21:15 -07:00
Brad Warren
31a8d086fc Merge pull request #7289 from certbot/fix-apache-parser-v2
Fix AppVeyor on the apache-parser-v2 branch
2019-08-02 11:22:53 -07:00
313 changed files with 4538 additions and 1200 deletions

View File

@@ -1,5 +1,5 @@
jobs:
- job: installer
- job: installer_build
pool:
vmImage: vs2017-win2016
steps:
@@ -20,10 +20,32 @@ jobs:
path: $(Build.ArtifactStagingDirectory)
artifact: windows-installer
displayName: Publish Windows installer
- script: $(Build.ArtifactStagingDirectory)\certbot-beta-installer-win32.exe /S
- job: installer_run
dependsOn: installer_build
strategy:
matrix:
win2019:
imageName: windows-2019
win2016:
imageName: vs2017-win2016
win2012r2:
imageName: vs2015-win2012r2
pool:
vmImage: $(imageName)
steps:
- task: DownloadPipelineArtifact@2
inputs:
artifact: windows-installer
path: $(Build.SourcesDirectory)/bin
displayName: Retrieve Windows installer
- script: $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe /S
displayName: Install Certbot
- powershell: Invoke-WebRequest https://www.python.org/ftp/python/3.8.0/python-3.8.0-amd64-webinstall.exe -OutFile C:\py3-setup.exe
displayName: Get Python
- script: C:\py3-setup.exe /quiet PrependPath=1 InstallAllUsers=1 Include_launcher=1 InstallLauncherAllUsers=1 Include_test=0 Include_doc=0 Include_dev=1 Include_debug=0 Include_tcltk=0 TargetDir=C:\py3
displayName: Install Python
- script: |
python -m venv venv
py -3 -m venv venv
venv\Scripts\python tools\pip_install.py -e certbot-ci
displayName: Prepare Certbot-CI
- script: |

.isort.cfg Normal file
View File

@@ -0,0 +1,7 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400

View File

@@ -24,6 +24,11 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
[MESSAGES CONTROL]
@@ -41,10 +46,14 @@ load-plugins=linter_plugin
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use"--disable=all --enable=classes
# --disable=W"
disable=fixme,locally-disabled,locally-enabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
[REPORTS]

View File

@@ -46,12 +46,9 @@ matrix:
- python: "2.7"
env: TOXENV=py27-cover FYI="py27 tests + code coverage"
- python: "2.7"
- python: "3.7"
env: TOXENV=lint
<<: *not-on-master
- python: "3.4"
env: TOXENV=mypy
<<: *not-on-master
- python: "3.5"
env: TOXENV=mypy
<<: *not-on-master
@@ -60,7 +57,7 @@ matrix:
# cryptography we support cannot be compiled against the version of
# OpenSSL in Xenial or newer.
dist: trusty
env: TOXENV='py27-{acme,apache,certbot,dns,nginx}-oldest'
env: TOXENV='py27-{acme,apache,apache-v2,certbot,dns,nginx}-oldest'
<<: *not-on-master
- python: "3.4"
env: TOXENV=py34
@@ -178,7 +175,7 @@ matrix:
- python: "3.7"
env: TOXENV=py37
<<: *extended-test-suite
- python: "3.8-dev"
- python: "3.8"
env: TOXENV=py38
<<: *extended-test-suite
- python: "3.4"
@@ -221,10 +218,10 @@ matrix:
sudo: required
services: docker
<<: *extended-test-suite
- python: "3.8-dev"
- python: "3.8"
env: ACME_SERVER=boulder-v1 TOXENV=integration
<<: *extended-test-suite
- python: "3.8-dev"
- python: "3.8"
env: ACME_SERVER=boulder-v2 TOXENV=integration
<<: *extended-test-suite
- sudo: required

View File

@@ -6,16 +6,15 @@ EXPOSE 80 443
WORKDIR /opt/certbot/src
# TODO: Install Apache/Nginx for plugin development.
COPY . .
RUN apt-get update && \
apt-get install apache2 git nginx-light -y && \
letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
apt-get clean && \
rm -rf /var/lib/apt/lists/* \
/tmp/* \
/var/tmp/*
RUN VENV_NAME="../venv" python tools/venv.py
RUN VENV_NAME="../venv3" python3 tools/venv3.py
ENV PATH /opt/certbot/venv/bin:$PATH
ENV PATH /opt/certbot/venv3/bin:$PATH

View File

@@ -13,7 +13,6 @@ import warnings
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose
for mod in list(sys.modules):

View File

@@ -54,8 +54,7 @@ class UnrecognizedChallenge(Challenge):
object.__setattr__(self, "jobj", jobj)
def to_partial_json(self):
# pylint: disable=no-member
return self.jobj
return self.jobj # pylint: disable=no-member
@classmethod
def from_json(cls, jobj):
@@ -113,7 +112,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
:rtype: bool
"""
parts = self.key_authorization.split('.') # pylint: disable=no-member
parts = self.key_authorization.split('.')
if len(parts) != 2:
logger.debug("Key authorization (%r) is not well formed",
self.key_authorization)

View File

@@ -5,25 +5,26 @@ import datetime
from email.utils import parsedate_tz
import heapq
import logging
import time
import re
import sys
import time
import six
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import OpenSSL
import requests
from requests.adapters import HTTPAdapter
from requests_toolbelt.adapters.source import SourceAddressAdapter
import six
from six.moves import http_client # pylint: disable=import-error
from acme import crypto_util
from acme import errors
from acme import jws
from acme import messages
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict, List, Set, Text
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Text # pylint: disable=unused-import, no-name-in-module
logger = logging.getLogger(__name__)
@@ -33,7 +34,6 @@ logger = logging.getLogger(__name__)
# https://urllib3.readthedocs.org/en/latest/security.html#insecureplatformwarning
if sys.version_info < (2, 7, 9): # pragma: no cover
try:
# pylint: disable=no-member
requests.packages.urllib3.contrib.pyopenssl.inject_into_urllib3() # type: ignore
except AttributeError:
import urllib3.contrib.pyopenssl # pylint: disable=import-error
@@ -279,7 +279,6 @@ class Client(ClientBase):
assert response.status_code == http_client.CREATED
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
return self._regr_from_response(response)
def query_registration(self, regr):
@@ -464,7 +463,6 @@ class Client(ClientBase):
updated[authzr] = updated_authzr
attempts[authzr] += 1
# pylint: disable=no-member
if updated_authzr.body.status not in (
messages.STATUS_VALID, messages.STATUS_INVALID):
if attempts[authzr] < max_attempts:
@@ -605,7 +603,6 @@ class ClientV2(ClientBase):
if response.status_code == 200 and 'Location' in response.headers:
raise errors.ConflictError(response.headers.get('Location'))
# "Instance of 'Field' has no key/contact member" bug:
# pylint: disable=no-member
regr = self._regr_from_response(response)
self.net.account = regr
return regr
@@ -729,7 +726,7 @@ class ClientV2(ClientBase):
for authzr in responses:
if authzr.body.status != messages.STATUS_VALID:
for chall in authzr.body.challenges:
if chall.error != None:
if chall.error is not None:
failed.append(authzr)
if failed:
raise errors.ValidationError(failed)
@@ -779,29 +776,13 @@ class ClientV2(ClientBase):
def _post_as_get(self, *args, **kwargs):
"""
Send GET request using the POST-as-GET protocol if needed.
The request will be first issued using POST-as-GET for ACME v2. If the ACME CA servers do
not support this yet and return an error, request will be retried using GET.
For ACME v1, only GET request will be tried, as POST-as-GET is not supported.
Send GET request using the POST-as-GET protocol.
:param args:
:param kwargs:
:return:
"""
if self.acme_version >= 2:
# We add an empty payload for POST-as-GET requests
new_args = args[:1] + (None,) + args[1:]
try:
return self._post(*new_args, **kwargs)
except messages.Error as error:
if error.code == 'malformed':
logger.debug('Error during a POST-as-GET request, '
'your ACME CA server may not support it:\n%s', error)
logger.debug('Retrying request with GET.')
else: # pragma: no cover
raise
# If POST-as-GET is not supported yet, we use a GET instead.
return self.net.get(*args, **kwargs)
new_args = args[:1] + (None,) + args[1:]
return self._post(*new_args, **kwargs)
class BackwardsCompatibleClientV2(object):
@@ -1124,10 +1105,9 @@ class ClientNetwork(object):
err_regex = r".*host='(\S*)'.*Max retries exceeded with url\: (\/\w*).*(\[Errno \d+\])([A-Za-z ]*)"
m = re.match(err_regex, str(e))
if m is None:
raise # pragma: no cover
else:
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
raise # pragma: no cover
host, path, _err_no, err_msg = m.groups()
raise ValueError("Requesting {0}{1}:{2}".format(host, path, err_msg))
# If content is DER, log the base64 of it instead of raw bytes, to keep
# binary data out of the logs.
@@ -1193,8 +1173,7 @@ class ClientNetwork(object):
if error.code == 'badNonce':
logger.debug('Retrying request after error:\n%s', error)
return self._post_once(*args, **kwargs)
else:
raise
raise
def _post_once(self, url, obj, content_type=JOSE_CONTENT_TYPE,
acme_version=1, **kwargs):

View File

@@ -6,15 +6,15 @@ import os
import re
import socket
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
import josepy as jose
from OpenSSL import crypto
from OpenSSL import SSL # type: ignore # https://github.com/python/typeshed/issues/2052
from acme import errors
# pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Callable, Union, Tuple, Optional
# pylint: enable=unused-import, no-name-in-module
from acme.magic_typing import Callable # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Optional # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Tuple # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
logger = logging.getLogger(__name__)

View File

@@ -29,7 +29,12 @@ class NonceError(ClientError):
class BadNonce(NonceError):
"""Bad nonce error."""
def __init__(self, nonce, error, *args, **kwargs):
super(BadNonce, self).__init__(*args, **kwargs)
# MyPy complains here that there is too many arguments for BaseException constructor.
# This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
# The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
# new types definitions. So we ignore the error until the code base is fixed to match
# with MyPy>=0.740 referential.
super(BadNonce, self).__init__(*args, **kwargs) # type: ignore
self.nonce = nonce
self.error = error
@@ -48,7 +53,8 @@ class MissingNonce(NonceError):
"""
def __init__(self, response, *args, **kwargs):
super(MissingNonce, self).__init__(*args, **kwargs)
# See comment in BadNonce constructor above for an explanation of type: ignore here.
super(MissingNonce, self).__init__(*args, **kwargs) # type: ignore
self.response = response
def __str__(self):
@@ -83,6 +89,7 @@ class PollError(ClientError):
return '{0}(exhausted={1!r}, updated={2!r})'.format(
self.__class__.__name__, self.exhausted, self.updated)
class ValidationError(Error):
"""Error for authorization failures. Contains a list of authorization
resources, each of which is invalid and should have an error field.
@@ -91,9 +98,11 @@ class ValidationError(Error):
self.failed_authzrs = failed_authzrs
super(ValidationError, self).__init__()
class TimeoutError(Error):
class TimeoutError(Error): # pylint: disable=redefined-builtin
"""Error for when polling an authorization or an order times out."""
class IssuanceError(Error):
"""Error sent by the server after requesting issuance of a certificate."""
@@ -105,6 +114,7 @@ class IssuanceError(Error):
self.error = error
super(IssuanceError, self).__init__()
class ConflictError(ClientError):
"""Error for when the server returns a 409 (Conflict) HTTP status.

View File

@@ -4,7 +4,6 @@ import logging
import josepy as jose
import pyrfc3339
logger = logging.getLogger(__name__)

View File

@@ -40,7 +40,7 @@ class Signature(jose.Signature):
class JWS(jose.JWS):
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
signature_cls = Signature
__slots__ = jose.JWS._orig_slots # pylint: disable=no-member
__slots__ = jose.JWS._orig_slots
@classmethod
# pylint: disable=arguments-differ

View File

@@ -1,6 +1,7 @@
"""Shim class to not have to depend on typing module in prod."""
import sys
class TypingClass(object):
"""Ignore import errors by getting anything"""
def __getattr__(self, name):

View File

@@ -1,18 +1,21 @@
"""ACME protocol messages."""
import json
import josepy as jose
import six
from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import util
try:
from collections.abc import Hashable # pylint: disable=no-name-in-module
except ImportError: # pragma: no cover
from collections import Hashable
import josepy as jose
from acme import challenges
from acme import errors
from acme import fields
from acme import util
from acme import jws
OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"
@@ -143,7 +146,7 @@ class _Constant(jose.JSONDeSerializable, Hashable): # type: ignore
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
raise jose.DeserializationError(
'{0} not recognized'.format(cls.__name__))
return cls.POSSIBLE_NAMES[jobj] # pylint: disable=unsubscriptable-object
return cls.POSSIBLE_NAMES[jobj]
def __repr__(self):
return '{0}({1})'.format(self.__class__.__name__, self.name)

View File

@@ -11,8 +11,7 @@ from six.moves import socketserver # type: ignore # pylint: disable=import-err
from acme import challenges
from acme import crypto_util
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
logger = logging.getLogger(__name__)
@@ -45,7 +44,7 @@ class TLSServer(socketserver.TCPServer):
return socketserver.TCPServer.server_bind(self)
class ACMEServerMixin: # pylint: disable=old-style-class
class ACMEServerMixin:
"""ACME server common settings mixin."""
# TODO: c.f. #858
server_version = "ACME client standalone challenge solver"
@@ -106,7 +105,6 @@ class BaseDualNetworkedServers(object):
"""Wraps socketserver.TCPServer.serve_forever"""
for server in self.servers:
thread = threading.Thread(
# pylint: disable=no-member
target=server.serve_forever)
thread.start()
self.threads.append(thread)

View File

@@ -12,10 +12,9 @@
# All configuration values have a default; values that are commented out
# serve to show the default.
import sys
import os
import shlex
import sys
here = os.path.abspath(os.path.dirname(__file__))

View File

@@ -26,8 +26,10 @@ Workflow:
- Deactivate Account
"""
from contextlib import contextmanager
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import rsa
import josepy as jose
import OpenSSL
from acme import challenges
@@ -36,7 +38,6 @@ from acme import crypto_util
from acme import errors
from acme import messages
from acme import standalone
import josepy as jose
# Constants:

View File

@@ -1,9 +1,10 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
version = '1.0.0'
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.1.0.dev0'
# Please update tox.ini when modifying dependency version requirements
install_requires = [

View File

@@ -4,8 +4,7 @@ import unittest
import josepy as jose
import mock
import requests
from six.moves.urllib import parse as urllib_parse # pylint: disable=relative-import
from six.moves.urllib import parse as urllib_parse
import test_util
@@ -19,7 +18,6 @@ class ChallengeTest(unittest.TestCase):
from acme.challenges import Challenge
from acme.challenges import UnrecognizedChallenge
chall = UnrecognizedChallenge({"type": "foo"})
# pylint: disable=no-member
self.assertEqual(chall, Challenge.from_json(chall.jobj))

View File

@@ -5,19 +5,17 @@ import datetime
import json
import unittest
from six.moves import http_client # pylint: disable=import-error
import josepy as jose
import mock
import OpenSSL
import requests
from six.moves import http_client # pylint: disable=import-error
from acme import challenges
from acme import errors
from acme import jws as acme_jws
from acme import messages
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import messages_test
import test_util
@@ -63,7 +61,7 @@ class ClientTestBase(unittest.TestCase):
self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
reg = messages.Registration(
contact=self.contact, key=KEY.public_key())
the_arg = dict(reg) # type: Dict
the_arg = dict(reg) # type: Dict
self.new_reg = messages.NewRegistration(**the_arg)
self.regr = messages.RegistrationResource(
body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1')
@@ -887,19 +885,6 @@ class ClientV2Test(ClientTestBase):
new_nonce_url='https://www.letsencrypt-demo.org/acme/new-nonce')
self.client.net.get.assert_not_called()
class FakeError(messages.Error):
"""Fake error to reproduce a malformed request ACME error"""
def __init__(self): # pylint: disable=super-init-not-called
pass
@property
def code(self):
return 'malformed'
self.client.net.post.side_effect = FakeError()
self.client.poll(self.authzr2) # pylint: disable=protected-access
self.client.net.get.assert_called_once_with(self.authzr2.uri)
class MockJSONDeSerializable(jose.JSONDeSerializable):
# pylint: disable=missing-docstring
@@ -965,8 +950,8 @@ class ClientNetworkTest(unittest.TestCase):
def test_check_response_not_ok_jobj_error(self):
self.response.ok = False
self.response.json.return_value = messages.Error(
detail='foo', typ='serverInternal', title='some title').to_json()
self.response.json.return_value = messages.Error.with_code(
'serverInternal', detail='foo', title='some title').to_json()
# pylint: disable=protected-access
self.assertRaises(
messages.Error, self.net._check_response, self.response)
@@ -991,7 +976,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.side_effect = ValueError
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access,no-value-for-parameter
# pylint: disable=protected-access
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1005,7 +990,7 @@ class ClientNetworkTest(unittest.TestCase):
self.response.json.return_value = {}
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
self.response.headers['Content-Type'] = response_ct
# pylint: disable=protected-access,no-value-for-parameter
# pylint: disable=protected-access
self.assertEqual(
self.response, self.net._check_response(self.response))
@@ -1130,8 +1115,8 @@ class ClientNetworkWithMockedResponseTest(unittest.TestCase):
self.response.headers = {}
self.response.links = {}
self.response.checked = False
self.acmev1_nonce_response = mock.MagicMock(ok=False,
status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response = mock.MagicMock(
ok=False, status_code=http_client.METHOD_NOT_ALLOWED)
self.acmev1_nonce_response.headers = {}
self.obj = mock.MagicMock()
self.wrapped_obj = mock.MagicMock()

View File

@@ -5,17 +5,16 @@ import threading
import time
import unittest
import six
from six.moves import socketserver #type: ignore # pylint: disable=import-error
import josepy as jose
import OpenSSL
import six
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import errors
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
import test_util
class SSLSocketAndProbeSNITest(unittest.TestCase):
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
@@ -39,7 +38,6 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
self.port = self.server.socket.getsockname()[1]
self.server_thread = threading.Thread(
# pylint: disable=no-member
target=self.server.handle_request)
def tearDown(self):
@@ -66,7 +64,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
def test_probe_connection_error(self):
# pylint has a hard time with six
self.server.server_close() # pylint: disable=no-member
self.server.server_close()
original_timeout = socket.getdefaulttimeout()
try:
socket.setdefaulttimeout(1)

View File

@@ -2,6 +2,7 @@
import importlib
import unittest
class JoseTest(unittest.TestCase):
"""Tests for acme.jose shim."""
@@ -20,11 +21,10 @@ class JoseTest(unittest.TestCase):
# We use the imports below with eval, but pylint doesn't
# understand that.
# pylint: disable=eval-used,unused-variable
import acme
import josepy
acme_jose_mod = eval(acme_jose_path)
josepy_mod = eval(josepy_path)
import acme # pylint: disable=unused-import
import josepy # pylint: disable=unused-import
acme_jose_mod = eval(acme_jose_path) # pylint: disable=eval-used
josepy_mod = eval(josepy_path) # pylint: disable=eval-used
self.assertIs(acme_jose_mod, josepy_mod)
self.assertIs(getattr(acme_jose_mod, attribute), getattr(josepy_mod, attribute))

View File

@@ -5,7 +5,6 @@ import josepy as jose
import test_util
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))

View File

@@ -5,8 +5,7 @@ import josepy as jose
import mock
from acme import challenges
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
import test_util
CERT = test_util.load_comparable_cert('cert.der')
@@ -19,8 +18,7 @@ class ErrorTest(unittest.TestCase):
def setUp(self):
from acme.messages import Error, ERROR_PREFIX
self.error = Error(
detail='foo', typ=ERROR_PREFIX + 'malformed', title='title')
self.error = Error.with_code('malformed', detail='foo', title='title')
self.jobj = {
'detail': 'foo',
'title': 'some title',
@@ -28,7 +26,6 @@ class ErrorTest(unittest.TestCase):
}
self.error_custom = Error(typ='custom', detail='bar')
self.empty_error = Error()
self.jobj_custom = {'type': 'custom', 'detail': 'bar'}
def test_default_typ(self):
from acme.messages import Error
@@ -43,8 +40,7 @@ class ErrorTest(unittest.TestCase):
hash(Error.from_json(self.error.to_json()))
def test_description(self):
self.assertEqual(
'The request message was malformed', self.error.description)
self.assertEqual('The request message was malformed', self.error.description)
self.assertTrue(self.error_custom.description is None)
def test_code(self):
@@ -54,17 +50,17 @@ class ErrorTest(unittest.TestCase):
self.assertEqual(None, Error().code)
def test_is_acme_error(self):
from acme.messages import is_acme_error
from acme.messages import is_acme_error, Error
self.assertTrue(is_acme_error(self.error))
self.assertFalse(is_acme_error(self.error_custom))
self.assertFalse(is_acme_error(Error()))
self.assertFalse(is_acme_error(self.empty_error))
self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))
def test_unicode_error(self):
from acme.messages import Error, ERROR_PREFIX, is_acme_error
arabic_error = Error(
detail=u'\u0639\u062f\u0627\u0644\u0629', typ=ERROR_PREFIX + 'malformed',
title='title')
from acme.messages import Error, is_acme_error
arabic_error = Error.with_code(
'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
self.assertTrue(is_acme_error(arabic_error))
def test_with_code(self):
@@ -305,8 +301,7 @@ class ChallengeBodyTest(unittest.TestCase):
from acme.messages import Error
from acme.messages import STATUS_INVALID
self.status = STATUS_INVALID
error = Error(typ='urn:ietf:params:acme:error:serverInternal',
detail='Unable to communicate with DNS server')
error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
self.challb = ChallengeBody(
uri='http://challb', chall=self.chall, status=self.status,
error=error)

View File

@@ -3,18 +3,17 @@ import socket
import threading
import unittest
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
import josepy as jose
import mock
import requests
from six.moves import http_client # pylint: disable=import-error
from six.moves import socketserver # type: ignore # pylint: disable=import-error
from acme import challenges
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
import test_util
class TLSServerTest(unittest.TestCase):
"""Tests for acme.standalone.TLSServer."""

View File

@@ -4,12 +4,12 @@
"""
import os
import pkg_resources
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import josepy as jose
from OpenSSL import crypto
import pkg_resources
def load_vector(*names):
@@ -25,8 +25,7 @@ def _guess_loader(filename, loader_pem, loader_der):
return loader_pem
elif ext.lower() == '.der':
return loader_der
else: # pragma: no cover
raise ValueError("Loader could not be recognized based on extension")
raise ValueError("Loader could not be recognized based on extension") # pragma: no cover
def load_cert(*names):

View File

@@ -1,9 +1,17 @@
""" Utility functions for certbot-apache plugin """
import binascii
import fnmatch
import logging
import re
import subprocess
from certbot import errors
from certbot import util
from certbot.compat import os
logger = logging.getLogger(__name__)
def get_mod_deps(mod_name):
"""Get known module dependencies.
@@ -105,3 +113,131 @@ def parse_define_file(filepath, varname):
def unique_id():
""" Returns an unique id to be used as a VirtualHost identifier"""
return binascii.hexlify(os.urandom(16)).decode("utf-8")
def included_in_paths(filepath, paths):
"""
Returns true if the filepath is included in the list of paths
that may contain full paths or wildcard paths that need to be
expanded.
:param str filepath: Filepath to check
:params list paths: List of paths to check against
:returns: True if included
:rtype: bool
"""
return any([fnmatch.fnmatch(filepath, path) for path in paths])
def parse_defines(apachectl):
"""
Gets Defines from httpd process and returns a dictionary of
the defined variables.
:param str apachectl: Path to apachectl executable
:returns: dictionary of defined variables
:rtype: dict
"""
variables = dict()
define_cmd = [apachectl, "-t", "-D",
"DUMP_RUN_CFG"]
matches = parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
except ValueError:
return {}
for match in matches:
if match.count("=") > 1:
logger.error("Unexpected number of equal signs in "
"runtime config dump.")
raise errors.PluginError(
"Error parsing Apache runtime variables")
parts = match.partition("=")
variables[parts[0]] = parts[2]
return variables
def parse_includes(apachectl):
"""
Gets Include directives from httpd process and returns a list of
their values.
:param str apachectl: Path to apachectl executable
:returns: list of found Include directive values
:rtype: list of str
"""
inc_cmd = [apachectl, "-t", "-D",
"DUMP_INCLUDES"]
return parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
def parse_modules(apachectl):
"""
Get loaded modules from httpd process, and return the list
of loaded module names.
:param str apachectl: Path to apachectl executable
:returns: list of found LoadModule module names
:rtype: list of str
"""
mod_cmd = [apachectl, "-t", "-D",
"DUMP_MODULES"]
return parse_from_subprocess(mod_cmd, r"(.*)_module")
def parse_from_subprocess(command, regexp):
"""Get values from stdout of subprocess command
:param list command: Command to run
:param str regexp: Regexp for parsing
:returns: list parsed from command output
:rtype: list
"""
stdout = _get_runtime_cfg(command)
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(command):
"""
Get runtime configuration info.
:param command: Command to run
:returns: stdout from command
"""
try:
proc = subprocess.Popen(
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True)
stdout, stderr = proc.communicate()
except (OSError, ValueError):
logger.error(
"Error running command %s for runtime parameters!%s",
command, os.linesep)
raise errors.MisconfigurationError(
"Error accessing loaded Apache parameters: {0}".format(
command))
# Small errors that do not impede
if proc.returncode != 0:
logger.warning("Error in checking parameter list: %s", stderr)
raise errors.MisconfigurationError(
"Apache is unable to check whether or not the module is "
"loaded because Apache is misconfigured.")
return stdout

View File

@@ -0,0 +1,242 @@
""" apacheconfig implementation of the ParserNode interfaces """
from functools import partial
from certbot import errors
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parsernode_util as util
class ApacheParserNode(interfaces.ParserNode):
""" apacheconfig implementation of ParserNode interface.
Expects metadata `ac_ast` to be passed in, where `ac_ast` is the AST provided
by parsing the equivalent configuration text using the apacheconfig library.
"""
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self._raw = self.metadata["ac_ast"]
def save(self, msg): # pragma: no cover
pass
def find_ancestors(self, name): # pylint: disable=unused-variable
"""Find ancestor BlockNodes with a given name"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
class ApacheCommentNode(ApacheParserNode):
""" apacheconfig implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(ApacheCommentNode, self).__init__(**kwargs)
self.comment = comment
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata and
self.filepath == other.filepath)
return False
class ApacheDirectiveNode(ApacheParserNode):
""" apacheconfig implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(ApacheDirectiveNode, self).__init__(**kwargs)
self.name = name
self.parameters = parameters
self.enabled = enabled
self.include = None
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, _parameters):
"""Sets the parameters for DirectiveNode"""
return
def _parameters_from_string(text):
text = text.strip()
words = []
word = ""
quote = None
escape = False
for c in text:
if c.isspace() and not quote:
if word:
words.append(word)
word = ""
else:
word += c
if not escape:
if not quote and c in "\"\'":
quote = c
elif c == quote:
words.append(word[1:-1])
word = ""
quote = None
escape = c == "\\"
if word:
words.append(word)
return tuple(words)
class ApacheBlockNode(ApacheDirectiveNode):
""" apacheconfig implementation of BlockNode interface """
def __init__(self, **kwargs):
super(ApacheBlockNode, self).__init__(**kwargs)
self._raw_children = self._raw
children = []
for raw_node in self._raw_children:
metadata = self.metadata.copy()
metadata['ac_ast'] = raw_node
if raw_node.typestring == "comment":
node = ApacheCommentNode(comment=raw_node.name[2:],
metadata=metadata, ancestor=self,
filepath=self.filepath)
elif raw_node.typestring == "block":
parameters = _parameters_from_string(raw_node.arguments)
node = ApacheBlockNode(name=raw_node.tag, parameters=parameters,
metadata=metadata, ancestor=self,
filepath=self.filepath, enabled=self.enabled)
else:
parameters = ()
if raw_node.value:
parameters = _parameters_from_string(raw_node.value)
node = ApacheDirectiveNode(name=raw_node.name, parameters=parameters,
metadata=metadata, ancestor=self,
filepath=self.filepath, enabled=self.enabled)
children.append(node)
self.children = tuple(children)
def __eq__(self, other): # pragma: no cover
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def _add_child_thing(self, raw_string, partial_node, position):
position = len(self._raw_children) if not position else position
# Cap position to length to mimic AugeasNode behavior. TODO: document that this happens
position = min(len(self._raw_children), position)
raw_ast = self._raw_children.add(position, raw_string)
metadata = self.metadata.copy()
metadata['ac_ast'] = raw_ast
new_node = partial_node(ancestor=self, metadata=metadata, filepath=self.filepath)
# Update metadata
children = list(self.children)
children.insert(position, new_node)
self.children = tuple(children)
return new_node
def add_child_block(self, name, parameters=None, position=None):
"""Adds a new BlockNode to the sequence of children"""
parameters_str = " " + " ".join(parameters) if parameters else ""
if not parameters:
parameters = []
partial_block = partial(ApacheBlockNode, name=name,
parameters=tuple(parameters), enabled=self.enabled)
return self._add_child_thing("\n<%s%s>\n</%s>" % (name, parameters_str, name),
partial_block, position)
def add_child_directive(self, name, parameters=None, position=None):
"""Adds a new DirectiveNode to the sequence of children"""
parameters_str = " " + " ".join(parameters) if parameters else ""
if not parameters: # TODO (mona): test
parameters = [] # pragma: no cover
partial_block = partial(ApacheDirectiveNode, name=name,
parameters=tuple(parameters), enabled=self.enabled)
return self._add_child_thing("\n%s%s" % (name, parameters_str), partial_block, position)
def add_child_comment(self, comment="", position=None):
"""Adds a new CommentNode to the sequence of children"""
partial_comment = partial(ApacheCommentNode, comment=comment)
return self._add_child_thing(comment, partial_comment, position)
def find_blocks(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of BlockNodes from the sequence of children"""
return [ApacheBlockNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def find_directives(self, name, exclude=True): # pylint: disable=unused-argument
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheDirectiveNode(name=assertions.PASS,
parameters=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
def find_comments(self, comment, exact=False): # pylint: disable=unused-argument
"""Recursive search of DirectiveNodes from the sequence of children"""
return [ApacheCommentNode(comment=assertions.PASS,
ancestor=self,
filepath=assertions.PASS,
metadata=self.metadata)]
# TODO (mona): test
def delete_child(self, child): # pragma: no cover
"""Deletes a ParserNode from the sequence of children"""
index = -1
i = None
for i, elem in enumerate(self.children):
if elem == child:
index = i
break
if index < 0:
raise errors.PluginError("Could not find child node to delete")
children_list = list(self.children)
thing = children_list.pop(i)
self.children = tuple(children_list)
self._raw_children.remove(i)
return thing
def unsaved_files(self): # pragma: no cover
"""Returns a list of unsaved filepaths"""
return [assertions.PASS]
def parsed_paths(self): # pragma: no cover
"""Returns a list of parsed configuration file paths"""
return [assertions.PASS]
interfaces.CommentNode.register(ApacheCommentNode)
interfaces.DirectiveNode.register(ApacheDirectiveNode)
interfaces.BlockNode.register(ApacheBlockNode)

View File

@@ -0,0 +1,142 @@
"""Dual parser node assertions"""
import fnmatch
from certbot_apache._internal import interfaces
PASS = "CERTBOT_PASS_ASSERT"
def assertEqual(first, second):
""" Equality assertion """
if isinstance(first, interfaces.CommentNode):
assertEqualComment(first, second)
elif isinstance(first, interfaces.DirectiveNode):
assertEqualDirective(first, second)
# Do an extra interface implementation assertion, as the contents were
# already checked for BlockNode in the assertEqualDirective
if isinstance(first, interfaces.BlockNode):
assert isinstance(second, interfaces.BlockNode)
# Skip tests if filepath includes the pass value. This is done
# because filepath is variable of the base ParserNode interface, and
# unless the implementation is actually done, we cannot assume getting
# correct results from boolean assertion for dirty
if not isPass(first.filepath) and not isPass(second.filepath):
assert first.dirty == second.dirty
# We might want to disable this later if testing with two separate
# (but identical) directory structures.
assert first.filepath == second.filepath
def assertEqualComment(first, second): # pragma: no cover
""" Equality assertion for CommentNode """
assert isinstance(first, interfaces.CommentNode)
assert isinstance(second, interfaces.CommentNode)
if not isPass(first.comment) and not isPass(second.comment): # type: ignore
assert first.comment == second.comment # type: ignore
def _assertEqualDirectiveComponents(first, second): # pragma: no cover
""" Handles assertion for instance variables for DirectiveNode and BlockNode"""
# Enabled value cannot be asserted, because Augeas implementation
# is unable to figure that out.
# assert first.enabled == second.enabled
if not isPass(first.name) and not isPass(second.name):
assert first.name == second.name
if not isPass(first.parameters) and not isPass(second.parameters):
assert first.parameters == second.parameters
def assertEqualDirective(first, second):
""" Equality assertion for DirectiveNode """
assert isinstance(first, interfaces.DirectiveNode)
assert isinstance(second, interfaces.DirectiveNode)
_assertEqualDirectiveComponents(first, second)
def isPass(value): # pragma: no cover
"""Checks if the value is set to PASS"""
if isinstance(value, bool):
return True
return PASS in value
def isPassDirective(block):
""" Checks if BlockNode or DirectiveNode should pass the assertion """
if isPass(block.name):
return True
if isPass(block.parameters): # pragma: no cover
return True
if isPass(block.filepath): # pragma: no cover
return True
return False
def isPassComment(comment):
""" Checks if CommentNode should pass the assertion """
if isPass(comment.comment):
return True
if isPass(comment.filepath): # pragma: no cover
return True
return False
def isPassNodeList(nodelist): # pragma: no cover
""" Checks if a ParserNode in the nodelist should pass the assertion,
this function is used for results of find_* methods. Unimplemented find_*
methods should return a sequence containing a single ParserNode instance
with assertion pass string."""
try:
node = nodelist[0]
except IndexError:
node = None
if not node: # pragma: no cover
return False
if isinstance(node, interfaces.DirectiveNode):
return isPassDirective(node)
return isPassComment(node)
def assertEqualSimple(first, second):
""" Simple assertion """
if not isPass(first) and not isPass(second):
assert first == second
def isEqualVirtualHost(first, second):
"""
Checks that two VirtualHost objects are similar. There are some built-in
differences between the implementations: a VirtualHost created by the
ParserNode implementation doesn't have "path" defined, as it was used for
the Augeas path and obviously cannot be used in the future. Similarly, the
legacy version lacks the "node" variable, which holds a reference to the
BlockNode of the VirtualHost.
"""
return (
first.name == second.name and
first.aliases == second.aliases and
first.filep == second.filep and
first.addrs == second.addrs and
first.ssl == second.ssl and
first.enabled == second.enabled and
first.modmacro == second.modmacro and
first.ancestor == second.ancestor
)
def assertEqualPathsList(first, second): # pragma: no cover
"""
Checks that the two lists of file paths match. This assertion allows for wildcard
paths.
"""
if any([isPass(path) for path in first]):
return
if any([isPass(path) for path in second]):
return
for fpath in first:
assert any([fnmatch.fnmatch(fpath, spath) for spath in second])
for spath in second:
assert any([fnmatch.fnmatch(fpath, spath) for fpath in first])
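# A rough illustration of how the PASS sentinel above is intended to behave
# (illustrative values only; not executed as part of this module):
#
#   assertEqualSimple(PASS, "AnyValue")      # comparison skipped, isPass() is True
#   assertEqualSimple("Alias", "Alias")      # compared, passes
#   assertEqualSimple("Alias", "Redirect")   # compared, raises AssertionError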

View File

@@ -0,0 +1,538 @@
"""
Augeas implementation of the ParserNode interfaces.
Augeas works internally by using XPATH notation. The following is a short
example of how this works, to help understand what's going on under the hood.
A configuration file /etc/apache2/apache2.conf with the following content:
# First comment line
# Second comment line
WhateverDirective whatevervalue
<ABlock>
DirectiveInABlock dirvalue
</ABlock>
SomeDirective somedirectivevalue
<ABlock>
AnotherDirectiveInABlock dirvalue
</ABlock>
# Yet another comment
Translates over to Augeas path notation (of immediate children), when calling
for example: aug.match("/files/etc/apache2/apache2.conf/*")
[
"/files/etc/apache2/apache2.conf/#comment[1]",
"/files/etc/apache2/apache2.conf/#comment[2]",
"/files/etc/apache2/apache2.conf/directive[1]",
"/files/etc/apache2/apache2.conf/ABlock[1]",
"/files/etc/apache2/apache2.conf/directive[2]",
"/files/etc/apache2/apache2.conf/ABlock[2]",
"/files/etc/apache2/apache2.conf/#comment[3]"
]
Regardless of a directive's name, its key in the Augeas tree is always
"directive", with an index where needed. Comments work similarly, while blocks
have their own key in the Augeas XPATH notation.
It's important to note that all of the unique keys have their own indices.
Augeas paths are case sensitive, while Apache configuration is case insensitive,
so blocks whose names differ only in case get separate keys and indices.
For example, the following configuration:
<block>
directive value
</block>
<Block>
Directive Value
</Block>
<block>
directive value
</block>
<bLoCk>
DiReCtiVe VaLuE
</bLoCk>
Translates over to:
[
"/files/etc/apache2/apache2.conf/block[1]",
"/files/etc/apache2/apache2.conf/Block[1]",
"/files/etc/apache2/apache2.conf/block[2]",
"/files/etc/apache2/apache2.conf/bLoCk[1]",
]
"""
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import assertions
from certbot_apache._internal import interfaces
from certbot_apache._internal import parser
from certbot_apache._internal import parsernode_util as util
class AugeasParserNode(interfaces.ParserNode):
""" Augeas implementation of ParserNode interface """
def __init__(self, **kwargs):
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasParserNode, self).__init__(**kwargs)
self.ancestor = ancestor
self.filepath = filepath
self.dirty = dirty
self.metadata = metadata
self.parser = self.metadata.get("augeasparser")
try:
if self.metadata["augeaspath"].endswith("/"):
raise errors.PluginError(
"Augeas path: {} has a trailing slash".format(
self.metadata["augeaspath"]
)
)
except KeyError:
raise errors.PluginError("Augeas path is required")
def save(self, msg):
self.parser.save(msg)
def find_ancestors(self, name):
"""
Searches for ancestor BlockNodes with a given name.
:param str name: Name of the BlockNode parent to search for
:returns: List of matching ancestor nodes.
:rtype: list of AugeasBlockNode
"""
ancestors = []
parent = self.metadata["augeaspath"]
while True:
# Get the path of ancestor node
parent = parent.rpartition("/")[0]
# Root of the tree
if not parent or parent == "/files":
break
anc = self._create_blocknode(parent)
if anc.name.lower() == name.lower():
ancestors.append(anc)
return ancestors
def _create_blocknode(self, path):
"""
Helper function to create a BlockNode from an Augeas path. This is used by
AugeasParserNode.find_ancestors and AugeasBlockNode.find_blocks.
"""
name = self._aug_get_name(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasBlockNode(name=name,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_get_name(self, path):
"""
Helper function to get the name of a configuration block or variable from an Augeas path.
"""
# Remove the ending slash if any
if path[-1] == "/": # pragma: no cover
path = path[:-1]
# Get the block name
name = path.split("/")[-1]
# remove [...], it's not allowed in Apache configuration and is used
# for indexing within Augeas
name = name.split("[")[0]
return name
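# For example, a path ending in ".../VirtualHost[2]" resolves to the name
# "VirtualHost", and ".../directive[1]" resolves to "directive".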
class AugeasCommentNode(AugeasParserNode):
""" Augeas implementation of CommentNode interface """
def __init__(self, **kwargs):
comment, kwargs = util.commentnode_kwargs(kwargs) # pylint: disable=unused-variable
super(AugeasCommentNode, self).__init__(**kwargs)
self.comment = comment
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.comment == other.comment and
self.filepath == other.filepath and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
class AugeasDirectiveNode(AugeasParserNode):
""" Augeas implementation of DirectiveNode interface """
def __init__(self, **kwargs):
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
super(AugeasDirectiveNode, self).__init__(**kwargs)
self.name = name
self.enabled = enabled
if parameters:
self.set_parameters(parameters)
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
def set_parameters(self, parameters):
"""
Sets parameters of a DirectiveNode or BlockNode object.
:param list parameters: List of all parameters for the node to set.
"""
orig_params = self._aug_get_params(self.metadata["augeaspath"])
# Clear out old parameters
for _ in orig_params:
# When the first parameter is removed, the indices get updated
param_path = "{}/arg[1]".format(self.metadata["augeaspath"])
self.parser.aug.remove(param_path)
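# e.g. with three existing arguments (arg[1], arg[2], arg[3]), removing
# arg[1] three times clears them all, as Augeas renumbers the remaining
# arguments after each removal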
# Insert new ones
for pi, param in enumerate(parameters):
param_path = "{}/arg[{}]".format(self.metadata["augeaspath"], pi+1)
self.parser.aug.set(param_path, param)
@property
def parameters(self):
"""
Fetches the parameters from the Augeas tree, ensuring that the sequence
always represents the current state.
:returns: Tuple of parameters for this DirectiveNode
:rtype: tuple
"""
return tuple(self._aug_get_params(self.metadata["augeaspath"]))
def _aug_get_params(self, path):
"""Helper function to get parameters for DirectiveNodes and BlockNodes"""
arg_paths = self.parser.aug.match(path + "/arg")
return [self.parser.get_arg(apath) for apath in arg_paths]
class AugeasBlockNode(AugeasDirectiveNode):
""" Augeas implementation of BlockNode interface """
def __init__(self, **kwargs):
super(AugeasBlockNode, self).__init__(**kwargs)
self.children = ()
def __eq__(self, other):
if isinstance(other, self.__class__):
return (self.name == other.name and
self.filepath == other.filepath and
self.parameters == other.parameters and
self.children == other.children and
self.enabled == other.enabled and
self.dirty == other.dirty and
self.ancestor == other.ancestor and
self.metadata == other.metadata)
return False
# pylint: disable=unused-argument
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new BlockNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
name,
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new block
self.parser.aug.insert(insertpath, name, before)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
# Parameters will be set at the initialization of the new object
new_block = AugeasBlockNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_block
# pylint: disable=unused-argument
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Adds a new DirectiveNode to the sequence of children"""
if not parameters:
raise errors.PluginError("Directive requires parameters and none were set.")
insertpath, realpath, before = self._aug_resolve_child_position(
"directive",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new directive
self.parser.aug.insert(insertpath, "directive", before)
# Set the directive key
self.parser.aug.set(realpath, name)
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(realpath)
)
new_dir = AugeasDirectiveNode(name=name,
parameters=parameters,
enabled=enabled,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_dir
def add_child_comment(self, comment="", position=None):
"""Adds a new CommentNode to the sequence of children"""
insertpath, realpath, before = self._aug_resolve_child_position(
"#comment",
position
)
new_metadata = {"augeasparser": self.parser, "augeaspath": realpath}
# Create the new comment
self.parser.aug.insert(insertpath, "#comment", before)
# Set the comment content
self.parser.aug.set(realpath, comment)
new_comment = AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(realpath),
metadata=new_metadata)
return new_comment
def find_blocks(self, name, exclude=True):
"""Recursive search of BlockNodes from the sequence of children"""
nodes = list()
paths = self._aug_find_blocks(name)
if exclude:
paths = self.parser.exclude_dirs(paths)
for path in paths:
nodes.append(self._create_blocknode(path))
return nodes
def find_directives(self, name, exclude=True):
"""Recursive search of DirectiveNodes from the sequence of children"""
nodes = list()
ownpath = self.metadata.get("augeaspath")
directives = self.parser.find_dir(name, start=ownpath, exclude=exclude)
already_parsed = set() # type: Set[str]
for directive in directives:
# Remove the /arg part from the Augeas path
directive = directive.partition("/arg")[0]
# find_dir returns an object for each _parameter_ of a directive
# so we need to filter out duplicates.
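# e.g. "ServerAlias example.com example.org" yields one match per argument;
# stripping the trailing /arg[N] above makes the paths identical, so only
# one DirectiveNode is created for the directive.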
if directive not in already_parsed:
nodes.append(self._create_directivenode(directive))
already_parsed.add(directive)
return nodes
def find_comments(self, comment):
"""
Recursive search of CommentNodes from the sequence of children.
:param str comment: Comment content to search for.
"""
nodes = list()
ownpath = self.metadata.get("augeaspath")
comments = self.parser.find_comments(comment, start=ownpath)
for com in comments:
nodes.append(self._create_commentnode(com))
return nodes
def delete_child(self, child):
"""
Deletes a ParserNode from the sequence of children, and raises an
exception if it's unable to do so.
:param AugeasParserNode child: A node to delete.
"""
if not self.parser.aug.remove(child.metadata["augeaspath"]):
raise errors.PluginError(
("Could not delete child node, the Augeas path: {} doesn't " +
"seem to exist.").format(child.metadata["augeaspath"])
)
def unsaved_files(self):
"""Returns a list of unsaved filepaths"""
return self.parser.unsaved_files()
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""
res_paths = []
paths = self.parser.existing_paths
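# e.g. {"/etc/apache2": ["apache2.conf"],
#       "/etc/apache2/sites-enabled": ["default.conf"]}
# yields ["/etc/apache2/apache2.conf",
#         "/etc/apache2/sites-enabled/default.conf"]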
for directory in paths:
for filename in paths[directory]:
res_paths.append(os.path.join(directory, filename))
return res_paths
def _create_commentnode(self, path):
"""Helper function to create a CommentNode from Augeas path"""
comment = self.parser.aug.get(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Because of the dynamic nature of AugeasParser and the fact that we're
# not populating the complete node tree, the ancestor has a dummy value
return AugeasCommentNode(comment=comment,
ancestor=assertions.PASS,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _create_directivenode(self, path):
"""Helper function to create a DirectiveNode from Augeas path"""
name = self.parser.get_arg(path)
metadata = {"augeasparser": self.parser, "augeaspath": path}
# Check if the file was included from the root config or initial state
enabled = self.parser.parsed_in_original(
apache_util.get_file_path(path)
)
return AugeasDirectiveNode(name=name,
ancestor=assertions.PASS,
enabled=enabled,
filepath=apache_util.get_file_path(path),
metadata=metadata)
def _aug_find_blocks(self, name):
"""Helper function to perform a search to Augeas DOM tree to search
configuration blocks with a given name"""
# The code here is modified from configurator.get_virtual_hosts()
blk_paths = set()
for vhost_path in list(self.parser.parser_paths):
paths = self.parser.aug.match(
("/files%s//*[label()=~regexp('%s')]" %
(vhost_path, parser.case_i(name))))
blk_paths.update([path for path in paths if
name.lower() in os.path.basename(path).lower()])
return blk_paths
def _aug_resolve_child_position(self, name, position):
"""
Helper function that iterates through the immediate children and figures
out the insertion path for a new AugeasParserNode.
Augeas also generalizes indices for directives and comments, simply by
using "directive" or "comment" respectively as their names.
This function iterates over the existing children of the AugeasBlockNode,
returning their insertion path, resulting Augeas path and if the new node
should be inserted before or after the returned insertion path.
Note: while Apache is case insensitive, Augeas is not, and blocks like
Nameofablock and NameOfABlock have different indices.
:param str name: Name of the child node to insert, "directive" for
AugeasDirectiveNode or "#comment" for AugeasCommentNode
:param int position: The position to insert the child AugeasParserNode to
:returns: Tuple of insert path, resulting path and a boolean if the new
node should be inserted before it.
:rtype: tuple of str, str, bool
"""
# Default to appending
before = False
all_children = self.parser.aug.match("{}/*".format(
self.metadata["augeaspath"])
)
# Calculate resulting_path
# Augeas indices start at 1. We use counter to calculate the index to
# be used in resulting_path.
counter = 1
for i, child in enumerate(all_children):
if position is not None and i >= position:
# We're not going to insert the new node at an index past this one
break
childname = self._aug_get_name(child)
if name == childname:
counter += 1
resulting_path = "{}/{}[{}]".format(
self.metadata["augeaspath"],
name,
counter
)
# Form the correct insert_path
# Inserting the only child and appending as the last child work
# similarly in Augeas.
append = not all_children or position is None or position >= len(all_children)
if append:
insert_path = "{}/*[last()]".format(
self.metadata["augeaspath"]
)
elif position == 0:
# Insert as the first child, before the current first one.
insert_path = all_children[0]
before = True
else:
insert_path = "{}/*[{}]".format(
self.metadata["augeaspath"],
position
)
return (insert_path, resulting_path, before)
interfaces.CommentNode.register(AugeasCommentNode)
interfaces.DirectiveNode.register(AugeasDirectiveNode)
interfaces.BlockNode.register(AugeasBlockNode)

View File

@@ -1,5 +1,6 @@
"""Apache Configurator."""
# pylint: disable=too-many-lines
from collections import defaultdict
import copy
import fnmatch
import logging
@@ -7,31 +8,36 @@ import re
import socket
import time
from collections import defaultdict
import pkg_resources
import six
import zope.component
import zope.interface
try:
import apacheconfig
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
from acme import challenges
from acme.magic_typing import DefaultDict, Dict, List, Set, Union # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import DefaultDict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Union # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.achallenges import KeyAuthorizationAnnotatedChallenge # pylint: disable=unused-import
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot.plugins.util import path_surgery
from certbot.plugins.enhancements import AutoHSTSEnhancement
from certbot.plugins.util import path_surgery
from certbot_apache._internal import apache_util
from certbot_apache._internal import assertions
from certbot_apache._internal import constants
from certbot_apache._internal import display_ops
from certbot_apache._internal import dualparser
from certbot_apache._internal import http_01
from certbot_apache._internal import obj
from certbot_apache._internal import parser
@@ -182,6 +188,7 @@ class ApacheConfigurator(common.Installer):
"""
version = kwargs.pop("version", None)
use_parsernode = kwargs.pop("use_parsernode", False)
super(ApacheConfigurator, self).__init__(*args, **kwargs)
# Add name_server association dict
@@ -197,10 +204,15 @@ class ApacheConfigurator(common.Installer):
self._autohsts = {} # type: Dict[str, Dict[str, Union[int, float]]]
# Reverter save notes
self.save_notes = ""
# Should we use ParserNode implementation instead of the old behavior
self.USE_PARSERNODE = use_parsernode
# Saves the list of file paths that were parsed initially, i.e. paths
# that were not added to the parser tree later by e.g. self.conf("vhost-root").
self.parsed_paths = [] # type: List[str]
# These will be set in the prepare function
self._prepared = False
self.parser = None
self.parser_root = None
self.version = version
self.vhosts = None
self.options = copy.deepcopy(self.OS_DEFAULTS)
@@ -250,6 +262,14 @@ class ApacheConfigurator(common.Installer):
# Perform the actual Augeas initialization to be able to react
self.parser = self.get_parser()
# Set up ParserNode root
pn_meta = {"augeasparser": self.parser,
"augeaspath": self.parser.get_root_augpath(),
"ac_ast": None}
if self.USE_PARSERNODE and HAS_APACHECONFIG:
self.parser_root = self.get_parsernode_root(pn_meta)
self.parsed_paths = self.parser_root.parsed_paths()
# Check for errors in parsing files with Augeas
self.parser.check_parsing_errors("httpd.aug")
@@ -345,6 +365,27 @@ class ApacheConfigurator(common.Installer):
self.option("server_root"), self.conf("vhost-root"),
self.version, configurator=self)
def get_parsernode_root(self, metadata):
"""Initializes the ParserNode parser root instance."""
apache_vars = dict()
apache_vars["defines"] = apache_util.parse_defines(self.option("ctl"))
apache_vars["includes"] = apache_util.parse_includes(self.option("ctl"))
apache_vars["modules"] = apache_util.parse_modules(self.option("ctl"))
metadata["apache_vars"] = apache_vars
with open(self.parser.loc["root"]) as f:
with apacheconfig.make_loader(writable=True,
**apacheconfig.flavors.NATIVE_APACHE) as loader:
metadata["ac_ast"] = loader.loads(f.read())
return dualparser.DualBlockNode(
name=assertions.PASS,
ancestor=None,
filepath=self.parser.loc["root"],
metadata=metadata
)
def _wildcard_domain(self, domain):
"""
Checks if domain is a wildcard domain
@@ -450,7 +491,7 @@ class ApacheConfigurator(common.Installer):
filtered_vhosts[name] = vhost
# Only unique VHost objects
dialog_input = set([vhost for vhost in filtered_vhosts.values()])
dialog_input = set(filtered_vhosts.values())
# Ask the user which of names to enable, expect list of names back
dialog_output = display_ops.select_vhost_multiple(list(dialog_input))
@@ -601,9 +642,9 @@ class ApacheConfigurator(common.Installer):
"in the Apache config.",
target_name)
raise errors.PluginError("No vhost selected")
elif temp:
if temp:
return vhost
elif not vhost.ssl:
if not vhost.ssl:
addrs = self._get_proposed_addrs(vhost, "443")
# TODO: Conflicts is too conservative
if not any(vhost.enabled and vhost.conflicts(addrs) for
@@ -869,6 +910,29 @@ class ApacheConfigurator(common.Installer):
return vhost
def get_virtual_hosts(self):
"""
Temporary wrapper for the legacy and ParserNode versions of
get_virtual_hosts. This should be replaced with the ParserNode
implementation when it is ready.
"""
v1_vhosts = self.get_virtual_hosts_v1()
if self.USE_PARSERNODE and HAS_APACHECONFIG:
v2_vhosts = self.get_virtual_hosts_v2()
for v1_vh in v1_vhosts:
found = False
for v2_vh in v2_vhosts:
if assertions.isEqualVirtualHost(v1_vh, v2_vh):
found = True
break
if not found:
raise AssertionError("Equivalent for {} was not found".format(v1_vh.path))
return v2_vhosts
return v1_vhosts
def get_virtual_hosts_v1(self):
"""Returns list of virtual hosts found in the Apache configuration.
:returns: List of :class:`~certbot_apache._internal.obj.VirtualHost`
@@ -921,6 +985,80 @@ class ApacheConfigurator(common.Installer):
vhs.append(new_vhost)
return vhs
def get_virtual_hosts_v2(self):
"""Returns list of virtual hosts found in the Apache configuration using
ParserNode interface.
:returns: List of :class:`~certbot_apache._internal.obj.VirtualHost`
objects found in configuration
:rtype: list
"""
vhs = []
vhosts = self.parser_root.find_blocks("VirtualHost", exclude=False)
for vhblock in vhosts:
vhs.append(self._create_vhost_v2(vhblock))
return vhs
def _create_vhost_v2(self, node):
"""Used by get_virtual_hosts_v2 to create vhost objects using ParserNode
interfaces.
:param interfaces.BlockNode node: The BlockNode object of VirtualHost block
:returns: newly created vhost
:rtype: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
addrs = set()
for param in node.parameters:
addrs.add(obj.Addr.fromstring(param))
is_ssl = False
# Exclusion to match the behavior in get_virtual_hosts_v2
sslengine = node.find_directives("SSLEngine", exclude=False)
if sslengine:
for directive in sslengine:
if directive.parameters[0].lower() == "on":
is_ssl = True
break
# "SSLEngine on" might be set outside of <VirtualHost>
# Treat vhosts with port 443 as ssl vhosts
for addr in addrs:
if addr.get_port() == "443":
is_ssl = True
enabled = apache_util.included_in_paths(node.filepath, self.parsed_paths)
macro = False
# Check if the VirtualHost is contained in a mod_macro block
if node.find_ancestors("Macro"):
macro = True
vhost = obj.VirtualHost(
node.filepath, None, addrs, is_ssl, enabled, modmacro=macro, node=node
)
self._populate_vhost_names_v2(vhost)
return vhost
def _populate_vhost_names_v2(self, vhost):
"""Helper function that populates the VirtualHost names.
:param vhost: In-progress vhost whose names will be added
:type vhost: :class:`~certbot_apache._internal.obj.VirtualHost`
"""
servername_match = vhost.node.find_directives("ServerName",
exclude=False)
serveralias_match = vhost.node.find_directives("ServerAlias",
exclude=False)
servername = None
if servername_match:
servername = servername_match[-1].parameters[-1]
if not vhost.modmacro:
for alias in serveralias_match:
for serveralias in alias.parameters:
vhost.aliases.add(serveralias)
vhost.name = servername
def is_name_vhost(self, target_addr):
"""Returns if vhost is a name based vhost
@@ -952,13 +1090,12 @@ class ApacheConfigurator(common.Installer):
loc = parser.get_aug_path(self.parser.loc["name"])
if addr.get_port() == "443":
path = self.parser.add_dir_to_ifmodssl(
self.parser.add_dir_to_ifmodssl(
loc, "NameVirtualHost", [str(addr)])
else:
path = self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
self.parser.add_dir(loc, "NameVirtualHost", [str(addr)])
msg = ("Setting %s to be NameBasedVirtualHost\n"
"\tDirective added to %s\n" % (addr, path))
msg = "Setting {0} to be NameBasedVirtualHost\n".format(addr)
logger.debug(msg)
self.save_notes += msg
@@ -1366,12 +1503,9 @@ class ApacheConfigurator(common.Installer):
result.append(comment)
sift = True
result.append('\n'.join(
['# ' + l for l in chunk]))
continue
result.append('\n'.join(['# ' + l for l in chunk]))
else:
result.append('\n'.join(chunk))
continue
return result, sift
def _get_vhost_block(self, vhost):
@@ -2514,4 +2648,4 @@ class ApacheConfigurator(common.Installer):
self._autohsts_save_state()
AutoHSTSEnhancement.register(ApacheConfigurator) # pylint: disable=no-member
AutoHSTSEnhancement.register(ApacheConfigurator)

View File

@@ -3,7 +3,6 @@ import pkg_resources
from certbot.compat import os
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
"""Name of the mod_ssl config file as saved in `IConfig.config_dir`."""

View File

@@ -3,10 +3,10 @@ import logging
import zope.component
import certbot.display.util as display_util
from certbot import errors
from certbot import interfaces
from certbot.compat import os
import certbot.display.util as display_util
logger = logging.getLogger(__name__)

View File

@@ -0,0 +1,306 @@
""" Dual ParserNode implementation """
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser
from certbot_apache._internal import apacheparser
class DualNodeBase(object):
""" Dual parser interface for in development testing. This is used as the
base class for dual parser interface classes. This class handles runtime
attribute value assertions."""
def save(self, msg): # pragma: no cover
""" Call save for both parsers """
self.primary.save(msg)
self.secondary.save(msg)
def __getattr__(self, aname):
""" Attribute value assertion """
firstval = getattr(self.primary, aname)
secondval = getattr(self.secondary, aname)
exclusions = [
# Metadata will inherently be different, as ApacheParserNode does
# not have Augeas paths and so on.
aname == "metadata",
callable(firstval)
]
if not any(exclusions):
assertions.assertEqualSimple(firstval, secondval)
return firstval
def find_ancestors(self, name):
""" Traverses the ancestor tree and returns ancestors matching name """
return self._find_helper(DualBlockNode, "find_ancestors", name)
def _find_helper(self, nodeclass, findfunc, search, **kwargs):
"""A helper for find_* functions. The function specific attributes should
be passed as keyword arguments.
:param interfaces.ParserNode nodeclass: The node class for results.
:param str findfunc: Name of the find function to call
:param str search: The search term
"""
primary_res = getattr(self.primary, findfunc)(search, **kwargs)
secondary_res = getattr(self.secondary, findfunc)(search, **kwargs)
# The order of search results for Augeas implementation cannot be
# assured.
pass_primary = assertions.isPassNodeList(primary_res)
pass_secondary = assertions.isPassNodeList(secondary_res)
new_nodes = list()
if pass_primary and pass_secondary:
# Both unimplemented
new_nodes.append(nodeclass(primary=primary_res[0],
secondary=secondary_res[0])) # pragma: no cover
elif pass_primary:
for c in secondary_res:
new_nodes.append(nodeclass(primary=primary_res[0],
secondary=c))
elif pass_secondary:
for c in primary_res:
new_nodes.append(nodeclass(primary=c,
secondary=secondary_res[0]))
else:
assert len(primary_res) == len(secondary_res)
matches = self._create_matching_list(primary_res, secondary_res)
for p, s in matches:
new_nodes.append(nodeclass(primary=p, secondary=s))
return new_nodes
class DualCommentNode(DualNodeBase):
""" Dual parser implementation of CommentNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of CommentNode objects as well as creating a DualCommentNode object
using precreated or fetched CommentNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.CommentNode:
:param CommentNode primary: Primary pre-created CommentNode, mainly
used when creating new DualParser nodes using add_* methods.
:param CommentNode secondary: Secondary pre-created CommentNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasCommentNode(**kwargs)
self.secondary = apacheparser.ApacheCommentNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
class DualDirectiveNode(DualNodeBase):
""" Dual parser implementation of DirectiveNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of DirectiveNode objects as well as creating a DualDirectiveNode object
using precreated or fetched DirectiveNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.DirectiveNode:
:param DirectiveNode primary: Primary pre-created DirectiveNode, mainly
used when creating new DualParser nodes using add_* methods.
:param DirectiveNode secondary: Secondary pre-created DirectiveNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasDirectiveNode(**kwargs)
self.secondary = apacheparser.ApacheDirectiveNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
def set_parameters(self, parameters):
""" Sets parameters and asserts that both implementation successfully
set the parameter sequence """
self.primary.set_parameters(parameters)
self.secondary.set_parameters(parameters)
assertions.assertEqual(self.primary, self.secondary)
class DualBlockNode(DualNodeBase):
""" Dual parser implementation of BlockNode interface """
def __init__(self, **kwargs):
""" This initialization implementation allows ordinary initialization
of BlockNode objects as well as creating a DualBlockNode object
using precreated or fetched BlockNode objects if provided as optional
arguments primary and secondary.
Parameters other than the following are from interfaces.BlockNode:
:param BlockNode primary: Primary pre-created BlockNode, mainly
used when creating new DualParser nodes using add_* methods.
:param BlockNode secondary: Secondary pre-created BlockNode
"""
kwargs.setdefault("primary", None)
kwargs.setdefault("secondary", None)
primary = kwargs.pop("primary")
secondary = kwargs.pop("secondary")
if primary or secondary:
assert primary and secondary
self.primary = primary
self.secondary = secondary
else:
self.primary = augeasparser.AugeasBlockNode(**kwargs)
self.secondary = apacheparser.ApacheBlockNode(**kwargs)
assertions.assertEqual(self.primary, self.secondary)
def add_child_block(self, name, parameters=None, position=None):
""" Creates a new child BlockNode, asserts that both implementations
did it in a similar way, and returns a newly created DualBlockNode object
encapsulating both of the newly created objects """
primary_new = self.primary.add_child_block(name, parameters, position)
secondary_new = self.secondary.add_child_block(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_block = DualBlockNode(primary=primary_new, secondary=secondary_new)
return new_block
def add_child_directive(self, name, parameters=None, position=None):
""" Creates a new child DirectiveNode, asserts that both implementations
did it in a similar way, and returns a newly created DualDirectiveNode
object encapsulating both of the newly created objects """
primary_new = self.primary.add_child_directive(name, parameters, position)
secondary_new = self.secondary.add_child_directive(name, parameters, position)
assertions.assertEqual(primary_new, secondary_new)
new_dir = DualDirectiveNode(primary=primary_new, secondary=secondary_new)
return new_dir
def add_child_comment(self, comment="", position=None):
""" Creates a new child CommentNode, asserts that both implementations
did it in a similar way, and returns a newly created DualCommentNode
object encapsulating both of the newly created objects """
primary_new = self.primary.add_child_comment(comment, position)
secondary_new = self.secondary.add_child_comment(comment, position)
assertions.assertEqual(primary_new, secondary_new)
new_comment = DualCommentNode(primary=primary_new, secondary=secondary_new)
return new_comment
def _create_matching_list(self, primary_list, secondary_list):
""" Matches the list of primary_list to a list of secondary_list and
returns a list of tuples. This is used to create results for find_
methods.
This helper function exists, because we cannot ensure that the list of
search results returned by primary.find_* and secondary.find_* are ordered
in a same way. The function pairs the same search results from both
implementations to a list of tuples.
"""
matched = list()
for p in primary_list:
match = None
for s in secondary_list:
try:
assertions.assertEqual(p, s)
match = s
break
except AssertionError:
continue
if match:
matched.append((p, match))
else:
raise AssertionError("Could not find a matching node.")
return matched
def find_blocks(self, name, exclude=True):
"""
Performs a search for BlockNodes using both implementations and does simple
checks on the results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualBlockNode
instances that encapsulate the pairs of returned BlockNode objects.
"""
return self._find_helper(DualBlockNode, "find_blocks", name,
exclude=exclude)
def find_directives(self, name, exclude=True):
"""
Performs a search for DirectiveNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualDirectiveNode
instances that encapsulate the pairs of returned DirectiveNode objects.
"""
return self._find_helper(DualDirectiveNode, "find_directives", name,
exclude=exclude)
def find_comments(self, comment):
"""
Performs a search for CommentNodes using both implementations and
checks the results. This is built upon the assumption that unimplemented
find_* methods return a list with a single assertion passing object.
After the assertion, it creates a list of newly created DualCommentNode
instances that encapsulate the pairs of returned CommentNode objects.
"""
return self._find_helper(DualCommentNode, "find_comments", comment)
def delete_child(self, child):
"""Deletes a child from the ParserNode implementations. The actual
ParserNode implementations are used here directly in order to be able
to match a child to the list of children."""
self.primary.delete_child(child.primary)
self.secondary.delete_child(child.secondary)
def unsaved_files(self):
""" Fetches the list of unsaved file paths and asserts that the lists
match """
primary_files = self.primary.unsaved_files()
secondary_files = self.secondary.unsaved_files()
assertions.assertEqualSimple(primary_files, secondary_files)
return primary_files
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""
primary_paths = self.primary.parsed_paths()
secondary_paths = self.secondary.parsed_paths()
assertions.assertEqualPathsList(primary_paths, secondary_paths)
return primary_paths
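# A rough usage sketch mirroring how the Configurator builds the dual root node
# (the filepath is illustrative; metadata must carry the keys both underlying
# implementations expect, such as "augeasparser", "augeaspath" and "ac_ast"):
#
#   root = DualBlockNode(name=assertions.PASS,
#                        ancestor=None,
#                        filepath="/etc/apache2/apache2.conf",
#                        metadata=metadata)
#   vhosts = root.find_blocks("VirtualHost", exclude=False)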

View File

@@ -4,13 +4,12 @@
from distutils.version import LooseVersion # pylint: disable=no-name-in-module,import-error
from certbot import util
from certbot_apache._internal import configurator
from certbot_apache._internal import override_arch
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_darwin
from certbot_apache._internal import override_debian
from certbot_apache._internal import override_centos
from certbot_apache._internal import override_fedora
from certbot_apache._internal import override_gentoo
from certbot_apache._internal import override_suse

View File

@@ -1,13 +1,12 @@
"""A class that performs HTTP-01 challenges for Apache"""
import logging
from acme.magic_typing import List, Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot.compat import os
from certbot.compat import filesystem
from certbot.compat import os
from certbot.plugins import common
from certbot_apache._internal.obj import VirtualHost # pylint: disable=unused-import
from certbot_apache._internal.parser import get_aug_path
@@ -195,8 +194,8 @@ class ApacheHttp01(common.ChallengePerformer):
if vhost not in self.moded_vhosts:
logger.debug(
"Adding a temporary challenge validation Include for name: %s " +
"in: %s", vhost.name, vhost.filep)
"Adding a temporary challenge validation Include for name: %s in: %s",
vhost.name, vhost.filep)
self.configurator.parser.add_dir_beginning(
vhost.path, "Include", self.challenge_conf_pre)
self.configurator.parser.add_dir(

View File

@@ -0,0 +1,516 @@
"""ParserNode interface for interacting with configuration tree.
General description
-------------------
The ParserNode interfaces are designed to be able to contain all the parsing logic,
while allowing their users to interact with the configuration tree in a Pythonic
and well structured manner.
The structure allows easy traversal of the tree of ParserNodes. Each ParserNode
stores a reference to its ancestor and immediate children, allowing the user to
traverse the tree using built-in interface methods as well as by accessing the
interface properties directly.
ParserNode interface implementation should stand between the actual underlying
parser functionality and the business logic within Configurator code, interfacing
with both. The ParserNode tree is a result of configuration parsing action.
ParserNode tree will be in charge of maintaining the parser state and hence the
abstract syntax tree (AST). Interactions between ParserNode tree and underlying
parser should involve only parsing the configuration files to this structure, and
writing it back to the filesystem - while preserving the format including whitespaces.
For some implementations (Apache for example) it's important to keep track of and
to use state information while parsing conditional blocks and directives. This
allows the implementation to flag parts of the parsed configuration structure
as not being in effect in the case of an unmatched conditional block. It's
important to store these blocks in the tree as well, in order not to conduct
destructive actions (such as failing to write back parts of the configuration)
while writing the AST back to the filesystem.
The ParserNode tree is in charge of maintaining its own structure, while every
child node fetched with the find_* methods or by iterating its list of children
can be changed in place. When making changes, the affected nodes should be flagged
as "dirty" in order for the parser implementation to figure out which parts of the
configuration need to be written back to disk during the save() operation.
Metadata
--------
The metadata holds all the implementation specific attributes of the ParserNodes -
things like the positional information related to the AST, file paths, whitespacing,
and any other information relevant to the underlying parser engine.
Access to the metadata should be handled by implementation specific methods, allowing
the Configurator functionality to access the underlying information where needed.
For some implementations the node can be initialized using the information carried
in metadata alone. This is useful especially when populating the ParserNode tree
while parsing the configuration.
Apache implementation
---------------------
The Apache implementation of ParserNode interface requires some implementation
specific functionalities that are not described by the interface itself.
Initialization
When the user of a ParserNode class is creating these objects, they must specify
the parameters as described in the documentation for the __init__ methods below.
When these objects are created internally, however, some parameters may not be
needed because (possibly more detailed) information is included in the metadata
parameter. In this case, implementations can deviate from the required parameters
from __init__, however, they should still behave the same when metadata is not
provided.
For consistency internally, if an argument is provided directly in the ParserNode
initialization parameters as well as within metadata it's recommended to establish
clear behavior around this scenario within the implementation.
Conditional blocks
Apache configuration can have conditional blocks, for example: <IfModule ...>,
resulting in the directives and subblocks within them being either enabled or disabled.
While the find_* interface methods allow including the disabled parts of the
configuration tree in searches, special care needs to be taken while parsing the
structure in order to reflect the active state of the configuration.
Whitespaces
Each ParserNode object is responsible for storing the whitespace characters that
precede it, in order to be able to write the AST back to the filesystem exactly as
it was, preserving the format; this applies to the parameters of BlockNode and
DirectiveNode objects as well.
When the parameters of a ParserNode are changed, the pre-existing whitespaces in
the parameter sequence are discarded, as the general reason for storing them is to
maintain the ability to write the configuration back to the filesystem exactly as
it was. This loses its meaning when we have to change the directive's or block's
parameters for other reasons.
Searches and matching
Apache configuration is largely case insensitive, so the Apache implementation of
the ParserNode interface needs to provide the user with a means to match block and
directive names and parameters in a case-insensitive manner. This does not apply to
everything, however; for example, the parameters of a conditional statement may be
case sensitive. For this reason the internal representation of the data should not
ignore case.
"""
import abc
import six
from acme.magic_typing import Any, Dict, Optional, Tuple # pylint: disable=unused-import, no-name-in-module
@six.add_metaclass(abc.ABCMeta)
class ParserNode(object):
"""
ParserNode is the basic building block of the tree of such nodes,
representing the structure of the configuration. It is largely meant to keep
the structure information intact and idiomatically accessible.
The root node as well as the child nodes of it should be instances of ParserNode.
Nodes keep track of their differences to on-disk representation of configuration
by marking modified ParserNodes as dirty to enable partial write-to-disk for
different files in the configuration structure.
While for the most part the usage and the child types are obvious, "Include"
and similar directives are an exception to this rule. This is because of the
nature of include directives, which unroll the contents of another file or
configuration block into their place. While we could unroll the included nodes
into the parent tree, it remains important to keep the context of include nodes
separate in order to write back the original configuration as it was.
For parsers that require the implementation to keep track of whitespacing,
it's the responsibility of each ParserNode object to store the whitespace that
precedes it, in order to be able to reconstruct the complete configuration file
as it was when originally read from disk.
ParserNode objects should have the following attributes:
# Reference to ancestor node, or None if the node is the root node of the
# configuration tree.
ancestor: Optional[ParserNode]
# True if this node has been modified since last save.
dirty: bool
# Filepath of the file where the configuration element for this ParserNode
# object resides. For the root node, the value of filepath is the httpd root
# configuration file. Filepath can be None if a configuration directive is
# defined, for example, on the httpd command line.
filepath: Optional[str]
# Metadata dictionary holds all the implementation specific key-value pairs
# for the ParserNode instance.
metadata: Dict[str, Any]
"""
@abc.abstractmethod
def __init__(self, **kwargs):
"""
Initializes the ParserNode instance, and sets the ParserNode specific
instance variables. This is not meant to be used directly, but through
specific classes implementing ParserNode interface.
:param ancestor: BlockNode ancestor for this CommentNode. Required.
:type ancestor: BlockNode or None
:param filepath: Filesystem path for the file where this CommentNode
does or should exist in the filesystem. Required.
:type filepath: str or None
:param dirty: Boolean flag for denoting if this CommentNode has been
created or changed after the last save. Default: False.
:type dirty: bool
:param metadata: Dictionary of metadata values for this ParserNode object.
Metadata information should be used only internally in the implementation.
Default: {}
:type metadata: dict
"""
@abc.abstractmethod
def save(self, msg):
"""
Save traverses the children, and attempts to write the AST to disk for
all the objects that are marked dirty. The actual operation of course
depends on the underlying implementation. save() shouldn't be called
from the Configurator outside of its designated save() method in order
to ensure that the Reverter checkpoints are created properly.
Note: this approach of keeping internal structure of the configuration
within the ParserNode tree does not represent the file inclusion structure
of actual configuration files that reside in the filesystem. To handle
file writes properly, the file specific temporary trees should be extracted
from the full ParserNode tree where necessary when writing to disk.
:param str msg: Message describing the reason for the save.
"""
@abc.abstractmethod
def find_ancestors(self, name):
"""
Traverses the ancestor tree up, searching for BlockNodes with a specific
name.
:param str name: Name of the ancestor BlockNode to search for
:returns: A list of ancestor BlockNodes that match the name
:rtype: list of BlockNode
"""
# Linter rule exclusion done because of https://github.com/PyCQA/pylint/issues/179
@six.add_metaclass(abc.ABCMeta) # pylint: disable=abstract-method
class CommentNode(ParserNode):
"""
CommentNode class is used for representation of comments within the parsed
configuration structure. Because of the nature of comments, it is not able
to have child nodes and hence it is always treated as a leaf node.
CommentNode stores its contents in the instance variable 'comment' and does not
have a specific name.
CommentNode objects should have the following attributes in addition to
the ones described in ParserNode:
# Contains the contents of the comment without the comment notation
# (typically # or /* ... */).
comment: str
"""
@abc.abstractmethod
def __init__(self, **kwargs):
"""
Initializes the CommentNode instance and sets its instance variables.
:param comment: Contents of the comment. Required.
:type comment: str
:param ancestor: BlockNode ancestor for this CommentNode. Required.
:type ancestor: BlockNode or None
:param filepath: Filesystem path for the file where this CommentNode
does or should exist in the filesystem. Required.
:type filepath: str or None
:param dirty: Boolean flag for denoting if this CommentNode has been
created or changed after the last save. Default: False.
:type dirty: bool
"""
super(CommentNode, self).__init__(ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {})) # pragma: no cover
@six.add_metaclass(abc.ABCMeta)
class DirectiveNode(ParserNode):
"""
DirectiveNode class represents a configuration directive within the configuration.
It can have zero or more parameters attached to it. Because of the nature of
single directives, it is not able to have child nodes and hence it is always
treated as a leaf node.
If this directive was defined on the httpd command line, the ancestor instance
variable for this DirectiveNode should be None, and it should be inserted at the
beginning of the root BlockNode's children sequence.
DirectiveNode objects should have the following attributes in addition to
the ones described in ParserNode:
# True if this DirectiveNode is enabled and False if it is inside of an
# inactive conditional block.
enabled: bool
# Name, or key of the configuration directive. If BlockNode subclass of
# DirectiveNode is the root configuration node, the name should be None.
name: Optional[str]
# Tuple of parameters of this ParserNode object, excluding whitespaces.
parameters: Tuple[str, ...]
"""
@abc.abstractmethod
def __init__(self, **kwargs):
"""
Initializes the DirectiveNode instance and sets its instance variables.
:param name: Name or key of the DirectiveNode object. Required.
:type name: str or None
:param tuple parameters: Tuple of str parameters for this DirectiveNode.
Default: ().
:type parameters: tuple
:param ancestor: BlockNode ancestor for this DirectiveNode, or None for
root configuration node. Required.
:type ancestor: BlockNode or None
:param filepath: Filesystem path for the file where this DirectiveNode
does or should exist in the filesystem, or None for directives introduced
in the httpd command line. Required.
:type filepath: str or None
:param dirty: Boolean flag for denoting if this DirectiveNode has been
created or changed after the last save. Default: False.
:type dirty: bool
:param enabled: True if this DirectiveNode object is parsed in the active
configuration of the httpd. False if the DirectiveNode exists within a
unmatched conditional configuration block. Default: True.
:type enabled: bool
"""
super(DirectiveNode, self).__init__(ancestor=kwargs['ancestor'],
dirty=kwargs.get('dirty', False),
filepath=kwargs['filepath'],
metadata=kwargs.get('metadata', {})) # pragma: no cover
@abc.abstractmethod
def set_parameters(self, parameters):
"""
Sets the sequence of parameters for this ParserNode object without
whitespaces. While the whitespaces for parameters are discarded when using
this method, the whitespace preceding the ParserNode itself should be
kept intact.
:param list parameters: sequence of parameters
"""
@six.add_metaclass(abc.ABCMeta)
class BlockNode(DirectiveNode):
"""
BlockNode class represents a block of nested configuration directives, comments
and other blocks as its children. A BlockNode can have zero or more parameters
attached to it.
Configuration blocks typically consist of one or more child nodes of all possible
types. Because of this, the BlockNode class has various discovery and structure
management methods.
Lists of parameters used as an optional argument for some of the methods should
be lists of strings that are applicable parameters for each specific BlockNode
or DirectiveNode type. As an example, for the following configuration:
<VirtualHost *:80>
...
</VirtualHost>
The node type would be BlockNode, name would be 'VirtualHost' and its parameters
would be: ['*:80'].
While for the following example:
LoadModule alias_module /usr/lib/apache2/modules/mod_alias.so
The node type would be DirectiveNode, name would be 'LoadModule' and its
parameters would be: ['alias_module', '/usr/lib/apache2/modules/mod_alias.so']
The applicable parameters are dependent on the underlying configuration language
and its grammar.
BlockNode objects should have the following attributes in addition to
the ones described in DirectiveNode:
# Tuple of direct children of this BlockNode object. The order of children
# in this tuple retains the order of elements in the parsed configuration
# block.
children: Tuple[ParserNode, ...]
"""
@abc.abstractmethod
def add_child_block(self, name, parameters=None, position=None):
"""
Adds a new BlockNode child node with provided values and marks the callee
BlockNode dirty. This is used to add new children to the AST. The preceding
whitespaces should not be added based on the ancestor or siblings for the
newly created object. This is to match the current behavior of the legacy
parser implementation.
:param str name: The name of the child node to add
:param list parameters: list of parameters for the node
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list similar to list().insert.
:returns: BlockNode instance of the created child block
"""
@abc.abstractmethod
def add_child_directive(self, name, parameters=None, position=None):
"""
Adds a new DirectiveNode child node with provided values and marks the
callee BlockNode dirty. This is used to add new children to the AST. The
preceding whitespaces should not be added based on the ancestor or siblings
for the newly created object. This is to match the current behavior of the
legacy parser implementation.
:param str name: The name of the child node to add
:param list parameters: list of parameters for the node
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list similar to list().insert.
:returns: DirectiveNode instance of the created child directive
"""
@abc.abstractmethod
def add_child_comment(self, comment="", position=None):
"""
Adds a new CommentNode child node with provided value and marks the
callee BlockNode dirty. This is used to add new children to the AST. The
preceding whitespaces should not be added based on the ancestor or siblings
for the newly created object. This is to match the current behavior of the
legacy parser implementation.
:param str comment: Comment contents
:param int position: Position in the list of children to add the new child
node to. Defaults to None, which appends the newly created node to the list.
If an integer is given, the child is inserted before that index in the
list, similarly to list.insert().
:returns: CommentNode instance of the created child comment
"""
@abc.abstractmethod
def find_blocks(self, name, exclude=True):
"""
Find a configuration block by name. This method walks the child tree of
ParserNodes under the instance it was called from. This makes it possible to
search the whole configuration tree when starting from the root node, or to
do a partial search when starting from a specified branch. The lookup should
be case insensitive.
:param str name: The name of the block to search for
:param bool exclude: Whether the search results should exclude the contents of
ParserNode objects that reside within conditional blocks and, because of
the current state, are not enabled.
:returns: A list of found BlockNode objects.
"""
@abc.abstractmethod
def find_directives(self, name, exclude=True):
"""
Find a directive by name. This method walks the child tree of ParserNodes
under the instance it was called from. This makes it possible to search the
whole configuration tree when starting from the root node, or to do a partial
search when starting from a specified branch. The lookup should be case
insensitive.
:param str name: The name of the directive to search for
:param bool exclude: Whether the search results should exclude the contents of
ParserNode objects that reside within conditional blocks and, because of
the current state, are not enabled.
:returns: A list of found DirectiveNode objects.
"""
@abc.abstractmethod
def find_comments(self, comment):
"""
Find comments whose value contains the search term.
This method walks the child tree of ParserNodes under the instance it was
called from. This makes it possible to search the whole configuration tree
when starting from the root node, or to do a partial search when starting
from a specified branch. The lookup should be case sensitive.
:param str comment: The comment content to search for
:returns: A list of found CommentNode objects.
"""
@abc.abstractmethod
def delete_child(self, child):
"""
Remove a specified child node from the list of children of the called
BlockNode object.
:param ParserNode child: Child ParserNode object to remove from the list
of children of the callee.
"""
@abc.abstractmethod
def unsaved_files(self):
"""
Returns a list of file paths that have been changed since the last save
(or the initial configuration parse). The intended use for this method
is to tell the Reverter which files need to be included in a checkpoint.
This is typically called for the root of the ParserNode tree.
:returns: list of file paths of files that have been changed but not yet
saved to disk.
"""
@abc.abstractmethod
def parsed_paths(self):
"""
Returns a list of file paths that have currently been parsed into the parser
tree. The returned list may include paths with wildcard characters, for
example: ['/etc/apache2/conf.d/*.load']
This is typically called on the root node of the ParserNode tree.
:returns: list of file paths of files that have been parsed
"""

View File

@@ -1,7 +1,7 @@
"""Module contains classes used by the Apache Configurator."""
import re
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot.plugins import common
@@ -124,7 +124,7 @@ class VirtualHost(object):
strip_name = re.compile(r"^(?:.+://)?([^ :$]*)")
def __init__(self, filep, path, addrs, ssl, enabled, name=None,
aliases=None, modmacro=False, ancestor=None):
aliases=None, modmacro=False, ancestor=None, node=None):
"""Initialize a VH."""
self.filep = filep
@@ -136,6 +136,7 @@ class VirtualHost(object):
self.enabled = enabled
self.modmacro = modmacro
self.ancestor = ancestor
self.node = node
def get_names(self):
"""Return a set of all names."""

View File

@@ -1,13 +1,12 @@
""" Distribution specific override class for Arch Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class ArchConfigurator(configurator.ApacheConfigurator):
"""Arch Linux specific ApacheConfigurator override class"""

View File

@@ -4,19 +4,16 @@ import logging
import pkg_resources
import zope.interface
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot.errors import MisconfigurationError
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
logger = logging.getLogger(__name__)

View File

@@ -1,13 +1,12 @@
""" Distribution specific override class for macOS """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class DarwinConfigurator(configurator.ApacheConfigurator):
"""macOS specific ApacheConfigurator override class"""

View File

@@ -9,7 +9,6 @@ from certbot import interfaces
from certbot import util
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
@@ -71,15 +70,14 @@ class DebianConfigurator(configurator.ApacheConfigurator):
# Already in shape
vhost.enabled = True
return None
else:
logger.warning(
"Could not symlink %s to %s, got error: %s", enabled_path,
vhost.filep, err.strerror)
errstring = ("Encountered error while trying to enable a " +
"newly created VirtualHost located at {0} by " +
"linking to it from {1}")
raise errors.NotSupportedError(errstring.format(vhost.filep,
enabled_path))
logger.warning(
"Could not symlink %s to %s, got error: %s", enabled_path,
vhost.filep, err.strerror)
errstring = ("Encountered error while trying to enable a " +
"newly created VirtualHost located at {0} by " +
"linking to it from {1}")
raise errors.NotSupportedError(errstring.format(vhost.filep,
enabled_path))
vhost.enabled = True
logger.info("Enabling available site: %s", vhost.filep)
self.save_notes += "Enabled site %s\n" % vhost.filep

View File

@@ -6,7 +6,6 @@ from certbot import errors
from certbot import interfaces
from certbot import util
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser

View File

@@ -1,15 +1,14 @@
""" Distribution specific override class for Gentoo Linux """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import configurator
from certbot_apache._internal import parser
@zope.interface.provider(interfaces.IPluginFactory)
class GentooConfigurator(configurator.ApacheConfigurator):
"""Gentoo specific ApacheConfigurator override class"""
@@ -71,6 +70,6 @@ class GentooParser(parser.ApacheParser):
def update_modules(self):
"""Get loaded modules from httpd process, and add them to DOM"""
mod_cmd = [self.configurator.option("ctl"), "modules"]
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
matches = apache_util.parse_from_subprocess(mod_cmd, r"(.*)_module")
for mod in matches:
self.add_mod(mod.strip())

View File

@@ -1,13 +1,12 @@
""" Distribution specific override class for OpenSUSE """
import pkg_resources
import zope.interface
from certbot import interfaces
from certbot.compat import os
from certbot_apache._internal import configurator
@zope.interface.provider(interfaces.IPluginFactory)
class OpenSUSEConfigurator(configurator.ApacheConfigurator):
"""OpenSUSE specific ApacheConfigurator override class"""

View File

@@ -3,16 +3,16 @@ import copy
import fnmatch
import logging
import re
import subprocess
import sys
import six
from acme.magic_typing import Dict, List, Set # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Dict # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import constants
logger = logging.getLogger(__name__)
@@ -284,38 +284,21 @@ class ApacheParser(object):
mods.add(mod_name)
mods.add(os.path.basename(mod_filename)[:-2] + "c")
else:
logger.debug("Could not read LoadModule directive from " +
"Augeas path: %s", match_name[6:])
logger.debug("Could not read LoadModule directive from Augeas path: %s",
match_name[6:])
self.modules.update(mods)
def update_runtime_variables(self):
"""Update Includes, Defines and Includes from httpd config dump data"""
self.update_defines()
self.update_includes()
self.update_modules()
def update_defines(self):
"""Get Defines from httpd process"""
"""Updates the dictionary of known variables in the configuration"""
variables = dict()
define_cmd = [self.configurator.option("ctl"), "-t", "-D",
"DUMP_RUN_CFG"]
matches = self.parse_from_subprocess(define_cmd, r"Define: ([^ \n]*)")
try:
matches.remove("DUMP_RUN_CFG")
except ValueError:
return
for match in matches:
if match.count("=") > 1:
logger.error("Unexpected number of equal signs in "
"runtime config dump.")
raise errors.PluginError(
"Error parsing Apache runtime variables")
parts = match.partition("=")
variables[parts[0]] = parts[2]
self.variables = variables
self.variables = apache_util.parse_defines(self.configurator.option("ctl"))
def update_includes(self):
"""Get includes from httpd process, and add them to DOM if needed"""
@@ -325,9 +308,7 @@ class ApacheParser(object):
# configuration files
_ = self.find_dir("Include")
inc_cmd = [self.configurator.option("ctl"), "-t", "-D",
"DUMP_INCLUDES"]
matches = self.parse_from_subprocess(inc_cmd, r"\(.*\) (.*)")
matches = apache_util.parse_includes(self.configurator.option("ctl"))
if matches:
for i in matches:
if not self.parsed_in_current(i):
@@ -336,56 +317,10 @@ class ApacheParser(object):
def update_modules(self):
"""Get loaded modules from httpd process, and add them to DOM"""
mod_cmd = [self.configurator.option("ctl"), "-t", "-D",
"DUMP_MODULES"]
matches = self.parse_from_subprocess(mod_cmd, r"(.*)_module")
matches = apache_util.parse_modules(self.configurator.option("ctl"))
for mod in matches:
self.add_mod(mod.strip())
def parse_from_subprocess(self, command, regexp):
"""Get values from stdout of subprocess command
:param list command: Command to run
:param str regexp: Regexp for parsing
:returns: list parsed from command output
:rtype: list
"""
stdout = self._get_runtime_cfg(command)
return re.compile(regexp).findall(stdout)
def _get_runtime_cfg(self, command): # pylint: disable=no-self-use
"""Get runtime configuration info.
:param command: Command to run
:returns: stdout from command
"""
try:
proc = subprocess.Popen(
command,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
universal_newlines=True)
stdout, stderr = proc.communicate()
except (OSError, ValueError):
logger.error(
"Error running command %s for runtime parameters!%s",
command, os.linesep)
raise errors.MisconfigurationError(
"Error accessing loaded Apache parameters: {0}".format(
command))
# Small errors that do not impede
if proc.returncode != 0:
logger.warning("Error in checking parameter list: %s", stderr)
raise errors.MisconfigurationError(
"Apache is unable to check whether or not the module is "
"loaded because Apache is misconfigured.")
return stdout
def filter_args_num(self, matches, args): # pylint: disable=no-self-use
"""Filter out directives with specific number of arguments.
@@ -612,7 +547,7 @@ class ApacheParser(object):
"%s//*[self::directive=~regexp('%s')]" % (start, regex))
if exclude:
matches = self._exclude_dirs(matches)
matches = self.exclude_dirs(matches)
if arg is None:
arg_suffix = "/arg"
@@ -625,7 +560,7 @@ class ApacheParser(object):
# https://httpd.apache.org/docs/2.4/mod/core.html#include
for match in matches:
dir_ = self.aug.get(match).lower()
if dir_ == "include" or dir_ == "includeoptional":
if dir_ in ("include", "includeoptional"):
ordered_matches.extend(self.find_dir(
directive, arg,
self._get_include_path(self.get_arg(match + "/arg")),
@@ -665,8 +600,7 @@ class ApacheParser(object):
# e.g. strip now, not later
if not value:
return None
else:
value = value.strip("'\"")
value = value.strip("'\"")
variables = ApacheParser.arg_var_interpreter.findall(value)
@@ -679,7 +613,13 @@ class ApacheParser(object):
return value
def _exclude_dirs(self, matches):
def get_root_augpath(self):
"""
Returns the Augeas path of root configuration.
"""
return get_aug_path(self.loc["root"])
def exclude_dirs(self, matches):
"""Exclude directives that are not loaded into the configuration."""
filters = [("ifmodule", self.modules), ("ifdefine", self.variables)]
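For context: the parse_from_subprocess and _get_runtime_cfg methods removed from
ApacheParser above are moved to module level in certbot_apache._internal.apache_util,
which the new call sites (apache_util.parse_defines, parse_includes and parse_modules)
build on. Below is a rough sketch of the moved helpers, reconstructed from the deleted
parser code; the actual apache_util module may differ in its details:

    import logging
    import re
    import subprocess

    from certbot import errors
    from certbot.compat import os

    logger = logging.getLogger(__name__)


    def parse_from_subprocess(command, regexp):
        """Get values from stdout of a subprocess command.

        :param list command: Command to run
        :param str regexp: Regexp for parsing
        :returns: list parsed from command output
        """
        stdout = _get_runtime_cfg(command)
        return re.compile(regexp).findall(stdout)


    def _get_runtime_cfg(command):
        """Get runtime configuration info (stdout of the given command)."""
        try:
            proc = subprocess.Popen(
                command,
                stdout=subprocess.PIPE,
                stderr=subprocess.PIPE,
                universal_newlines=True)
            stdout, stderr = proc.communicate()
        except (OSError, ValueError):
            logger.error(
                "Error running command %s for runtime parameters!%s",
                command, os.linesep)
            raise errors.MisconfigurationError(
                "Error accessing loaded Apache parameters: {0}".format(command))
        # Small errors that do not impede
        if proc.returncode != 0:
            logger.warning("Error in checking parameter list: %s", stderr)
            raise errors.MisconfigurationError(
                "Apache is unable to check whether or not the module is "
                "loaded because Apache is misconfigured.")
        return stdout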

View File

@@ -0,0 +1,129 @@
"""ParserNode utils"""
def validate_kwargs(kwargs, required_names):
"""
Ensures that the kwargs dict has all the expected values. This function modifies
the kwargs dictionary, and hence the caller function should use the returned
dictionary instead of the original kwargs.
:param dict kwargs: Dictionary of keyword arguments to validate.
:param list required_names: List of required parameter names.
:returns: Dictionary of validated keyword arguments.
"""
validated_kwargs = dict()
for name in required_names:
try:
validated_kwargs[name] = kwargs.pop(name)
except KeyError:
raise TypeError("Required keyword argument: {} undefined.".format(name))
# Raise exception if unknown key word arguments are found.
if kwargs:
unknown = ", ".join(kwargs.keys())
raise TypeError("Unknown keyword argument(s): {}".format(unknown))
return validated_kwargs
def parsernode_kwargs(kwargs):
"""
Validates keyword arguments for ParserNode. This function modifies the kwargs
dictionary, and hence the caller function should use the returned dictionary
instead of the original kwargs.
If metadata is provided, the otherwise required argument "filepath" may be
omitted if the implementation is able to extract its value from the metadata.
This use case is handled within this function. Filepath defaults to None.
:param dict kwargs: Keyword argument dictionary to validate.
:returns: Tuple of validated and prepared arguments.
"""
# As many values of ParserNode instances can be derived from the metadata
# (ancestor being a common exception here), make sure we permit it here as well.
if "metadata" in kwargs:
# Filepath can be derived from the metadata in Augeas implementation.
# Default is None, as in this case the responsibility of populating this
# variable lies on the implementation.
kwargs.setdefault("filepath", None)
kwargs.setdefault("dirty", False)
kwargs.setdefault("metadata", {})
kwargs = validate_kwargs(kwargs, ["ancestor", "dirty", "filepath", "metadata"])
return kwargs["ancestor"], kwargs["dirty"], kwargs["filepath"], kwargs["metadata"]
def commentnode_kwargs(kwargs):
"""
Validates keyword arguments for CommentNode and sets the default values for
optional kwargs. This function modifies the kwargs dictionary, and hence the
caller function should use the returned dictionary instead of the original
kwargs.
If metadata is provided, the otherwise required argument "comment" may be
omitted if the implementation is able to extract its value from the metadata.
This use case is handled within this function.
:param dict kwargs: Keyword argument dictionary to validate.
:returns: Tuple of validated and prepared arguments and ParserNode kwargs.
"""
# As many values of ParserNode instances can be derived from the metadata
# (ancestor being a common exception here), make sure we permit it here as well.
if "metadata" in kwargs:
kwargs.setdefault("comment", None)
# Filepath can be derived from the metadata in Augeas implementation.
# Default is None, as in this case the responsibility of populating this
# variable lies on the implementation.
kwargs.setdefault("filepath", None)
kwargs.setdefault("dirty", False)
kwargs.setdefault("metadata", {})
kwargs = validate_kwargs(kwargs, ["ancestor", "dirty", "filepath", "comment",
"metadata"])
comment = kwargs.pop("comment")
return comment, kwargs
def directivenode_kwargs(kwargs):
"""
Validates keyword arguments for DirectiveNode and BlockNode and sets the
default values for optional kwargs. This function modifies the kwargs
dictionary, and hence the caller function should use the returned dictionary
instead of the original kwargs.
If metadata is provided, the otherwise required argument "name" may be
omitted if the implementation is able to extract its value from the metadata.
This use case is handled within this function.
:param dict kwargs: Keyword argument dictionary to validate.
:returns: Tuple of validated and prepared arguments and ParserNode kwargs.
"""
# As many values of ParserNode instances can be derived from the metadata
# (ancestor being a common exception here), make sure we permit it here as well.
if "metadata" in kwargs:
kwargs.setdefault("name", None)
# Filepath can be derived from the metadata in Augeas implementation.
# Default is None, as in this case the responsibility of populating this
# variable lies on the implementation.
kwargs.setdefault("filepath", None)
kwargs.setdefault("dirty", False)
kwargs.setdefault("enabled", True)
kwargs.setdefault("parameters", ())
kwargs.setdefault("metadata", {})
kwargs = validate_kwargs(kwargs, ["ancestor", "dirty", "filepath", "name",
"parameters", "enabled", "metadata"])
name = kwargs.pop("name")
parameters = kwargs.pop("parameters")
enabled = kwargs.pop("enabled")
return name, parameters, enabled, kwargs
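A short usage illustration of the helpers above, assuming a DirectiveNode-style
constructor has collected its keyword arguments into a dict (compare the Dummy*Node
test doubles added later in this changeset):

    from certbot_apache._internal import parsernode_util

    kwargs = {"ancestor": None, "filepath": "/etc/apache2/apache2.conf",
              "name": "ServerName", "parameters": ["example.com"], "metadata": {}}
    # "dirty" and "enabled" fall back to their defaults (False and True).
    name, parameters, enabled, node_kwargs = parsernode_util.directivenode_kwargs(kwargs)
    assert (name, parameters, enabled) == ("ServerName", ["example.com"], True)
    assert sorted(node_kwargs) == ["ancestor", "dirty", "filepath", "metadata"]

    # Missing required arguments and unknown keyword arguments both raise TypeError,
    # e.g. parsernode_util.validate_kwargs({"unexpected": 1}, ["ancestor"]).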

View File

@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
certbot[dev]==0.39.0
-e certbot[dev]

View File

@@ -1,16 +1,16 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.0.0'
version = '1.1.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.29.0',
'certbot>=0.39.0',
'certbot>=1.0.0.dev0',
'mock',
'python-augeas',
'setuptools',
@@ -18,6 +18,9 @@ install_requires = [
'zope.interface',
]
dev_extras = [
'apacheconfig>=0.3.2',
]
class PyTest(TestCommand):
user_options = []
@@ -69,6 +72,9 @@ setup(
packages=find_packages(),
include_package_data=True,
install_requires=install_requires,
extras_require={
'dev': dev_extras,
},
entry_points={
'certbot.plugins': [
'apache = certbot_apache._internal.entrypoint:ENTRYPOINT',

View File

@@ -0,0 +1,327 @@
"""Tests for AugeasParserNode classes"""
import mock
import unittest
import util
try:
import apacheconfig # pylint: disable=import-error,unused-import
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import errors
from certbot_apache._internal import assertions
@unittest.skipIf(not HAS_APACHECONFIG, reason='Tests require apacheconfig dependency')
class AugeasParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-methods
"""Test AugeasParserNode using available test configurations"""
def setUp(self): # pylint: disable=arguments-differ
super(AugeasParserNodeTest, self).setUp()
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir, self.work_dir, use_parsernode=True)
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
def test_save(self):
with mock.patch('certbot_apache._internal.parser.ApacheParser.save') as mock_save:
self.config.parser_root.save("A save message")
self.assertTrue(mock_save.called)
self.assertEqual(mock_save.call_args[0][0], "A save message")
def test_unsaved_files(self):
with mock.patch('certbot_apache._internal.parser.ApacheParser.unsaved_files') as mock_uf:
mock_uf.return_value = ["first", "second"]
files = self.config.parser_root.unsaved_files()
self.assertEqual(files, ["first", "second"])
def test_get_block_node_name(self):
from certbot_apache._internal.augeasparser import AugeasBlockNode
block = AugeasBlockNode(
name=assertions.PASS,
ancestor=None,
filepath=assertions.PASS,
metadata={"augeasparser": mock.Mock(), "augeaspath": "/files/anything"}
)
testcases = {
"/some/path/FirstNode/SecondNode": "SecondNode",
"/some/path/FirstNode/SecondNode/": "SecondNode",
"OnlyPathItem": "OnlyPathItem",
"/files/etc/apache2/apache2.conf/VirtualHost": "VirtualHost",
"/Anything": "Anything",
}
for test in testcases:
self.assertEqual(block._aug_get_name(test), testcases[test]) # pylint: disable=protected-access
def test_find_blocks(self):
blocks = self.config.parser_root.find_blocks("VirtualHost", exclude=False)
self.assertEqual(len(blocks), 12)
def test_find_blocks_case_insensitive(self):
vhs = self.config.parser_root.find_blocks("VirtualHost")
vhs2 = self.config.parser_root.find_blocks("viRtuAlHoST")
self.assertEqual(len(vhs), len(vhs2))
def test_find_directive_found(self):
directives = self.config.parser_root.find_directives("Listen")
self.assertEqual(len(directives), 1)
self.assertTrue(directives[0].filepath.endswith("/apache2/ports.conf"))
self.assertEqual(directives[0].parameters, (u'80',))
def test_find_directive_notfound(self):
directives = self.config.parser_root.find_directives("Nonexistent")
self.assertEqual(len(directives), 0)
def test_find_directive_from_block(self):
blocks = self.config.parser_root.find_blocks("virtualhost")
found = False
for vh in blocks:
if vh.filepath.endswith("sites-enabled/certbot.conf"):
servername = vh.find_directives("servername")
self.assertEqual(servername[0].parameters[0], "certbot.demo")
found = True
self.assertTrue(found)
def test_find_comments(self):
rootcomment = self.config.parser_root.find_comments(
"This is the main Apache server configuration file. "
)
self.assertEqual(len(rootcomment), 1)
self.assertTrue(rootcomment[0].filepath.endswith(
"debian_apache_2_4/multiple_vhosts/apache2/apache2.conf"
))
def test_set_parameters(self):
servernames = self.config.parser_root.find_directives("servername")
names = [] # type: List[str]
for servername in servernames:
names += servername.parameters
self.assertFalse("going_to_set_this" in names)
servernames[0].set_parameters(["something", "going_to_set_this"])
servernames = self.config.parser_root.find_directives("servername")
names = []
for servername in servernames:
names += servername.parameters
self.assertTrue("going_to_set_this" in names)
def test_set_parameters_atinit(self):
from certbot_apache._internal.augeasparser import AugeasDirectiveNode
servernames = self.config.parser_root.find_directives("servername")
setparam = "certbot_apache._internal.augeasparser.AugeasDirectiveNode.set_parameters"
with mock.patch(setparam) as mock_set:
AugeasDirectiveNode(
name=servernames[0].name,
parameters=["test", "setting", "these"],
ancestor=assertions.PASS,
metadata=servernames[0].primary.metadata
)
self.assertTrue(mock_set.called)
self.assertEqual(
mock_set.call_args_list[0][0][0],
["test", "setting", "these"]
)
def test_set_parameters_delete(self):
# Set params
servername = self.config.parser_root.find_directives("servername")[0]
servername.set_parameters(["thisshouldnotexistpreviously", "another",
"third"])
# Delete params
servernames = self.config.parser_root.find_directives("servername")
found = False
for servername in servernames:
if "thisshouldnotexistpreviously" in servername.parameters:
self.assertEqual(len(servername.parameters), 3)
servername.set_parameters(["thisshouldnotexistpreviously"])
found = True
self.assertTrue(found)
# Verify params
servernames = self.config.parser_root.find_directives("servername")
found = False
for servername in servernames:
if "thisshouldnotexistpreviously" in servername.parameters:
self.assertEqual(len(servername.parameters), 1)
servername.set_parameters(["thisshouldnotexistpreviously"])
found = True
self.assertTrue(found)
def test_add_child_comment(self):
newc = self.config.parser_root.add_child_comment("The content")
comments = self.config.parser_root.find_comments("The content")
self.assertEqual(len(comments), 1)
self.assertEqual(
newc.metadata["augeaspath"],
comments[0].primary.metadata["augeaspath"]
)
self.assertEqual(newc.comment, comments[0].comment)
def test_delete_child(self):
listens = self.config.parser_root.primary.find_directives("Listen")
self.assertEqual(len(listens), 1)
self.config.parser_root.primary.delete_child(listens[0])
listens = self.config.parser_root.primary.find_directives("Listen")
self.assertEqual(len(listens), 0)
def test_delete_child_not_found(self):
listen = self.config.parser_root.find_directives("Listen")[0]
listen.primary.metadata["augeaspath"] = "/files/something/nonexistent"
self.assertRaises(
errors.PluginError,
self.config.parser_root.delete_child,
listen
)
def test_add_child_block(self):
nb = self.config.parser_root.add_child_block(
"NewBlock",
["first", "second"]
)
rpath, _, directive = nb.primary.metadata["augeaspath"].rpartition("/")
self.assertEqual(
rpath,
self.config.parser_root.primary.metadata["augeaspath"]
)
self.assertTrue(directive.startswith("NewBlock"))
def test_add_child_block_beginning(self):
self.config.parser_root.add_child_block(
"Beginning",
position=0
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
# Get first child
first = parser.aug.match("{}/*[1]".format(root_path))
self.assertTrue(first[0].endswith("Beginning"))
def test_add_child_block_append(self):
self.config.parser_root.add_child_block(
"VeryLast",
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLast"))
def test_add_child_block_append_alt(self):
self.config.parser_root.add_child_block(
"VeryLastAlt",
position=99999
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
# Get last child
last = parser.aug.match("{}/*[last()]".format(root_path))
self.assertTrue(last[0].endswith("VeryLastAlt"))
def test_add_child_block_middle(self):
self.config.parser_root.add_child_block(
"Middle",
position=5
)
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
# Augeas indices start at 1 :(
middle = parser.aug.match("{}/*[6]".format(root_path))
self.assertTrue(middle[0].endswith("Middle"))
def test_add_child_block_existing_name(self):
parser = self.config.parser_root.primary.parser
root_path = self.config.parser_root.primary.metadata["augeaspath"]
# There already exists a single VirtualHost in the base config
new_block = parser.aug.match("{}/VirtualHost[2]".format(root_path))
self.assertEqual(len(new_block), 0)
vh = self.config.parser_root.add_child_block(
"VirtualHost",
)
new_block = parser.aug.match("{}/VirtualHost[2]".format(root_path))
self.assertEqual(len(new_block), 1)
self.assertTrue(vh.primary.metadata["augeaspath"].endswith("VirtualHost[2]"))
def test_node_init_error_bad_augeaspath(self):
from certbot_apache._internal.augeasparser import AugeasBlockNode
parameters = {
"name": assertions.PASS,
"ancestor": None,
"filepath": assertions.PASS,
"metadata": {
"augeasparser": mock.Mock(),
"augeaspath": "/files/path/endswith/slash/"
}
}
self.assertRaises(
errors.PluginError,
AugeasBlockNode,
**parameters
)
def test_node_init_error_missing_augeaspath(self):
from certbot_apache._internal.augeasparser import AugeasBlockNode
parameters = {
"name": assertions.PASS,
"ancestor": None,
"filepath": assertions.PASS,
"metadata": {
"augeasparser": mock.Mock(),
}
}
self.assertRaises(
errors.PluginError,
AugeasBlockNode,
**parameters
)
def test_add_child_directive(self):
self.config.parser_root.add_child_directive(
"ThisWasAdded",
["with", "parameters"],
position=0
)
dirs = self.config.parser_root.find_directives("ThisWasAdded")
self.assertEqual(len(dirs), 1)
self.assertEqual(dirs[0].parameters, ("with", "parameters"))
# The new directive was added to the very first line of the config
self.assertTrue(dirs[0].primary.metadata["augeaspath"].endswith("[1]"))
def test_add_child_directive_exception(self):
self.assertRaises(
errors.PluginError,
self.config.parser_root.add_child_directive,
"ThisRaisesErrorBecauseMissingParameters"
)
def test_parsed_paths(self):
paths = self.config.parser_root.parsed_paths()
self.assertEqual(len(paths), 6)
def test_find_ancestors(self):
vhsblocks = self.config.parser_root.find_blocks("VirtualHost")
macro_test = False
nonmacro_test = False
for vh in vhsblocks:
if "/macro/" in vh.metadata["augeaspath"].lower():
ancs = vh.find_ancestors("Macro")
self.assertEqual(len(ancs), 1)
macro_test = True
else:
ancs = vh.find_ancestors("Macro")
self.assertEqual(len(ancs), 0)
nonmacro_test = True
self.assertTrue(macro_test)
self.assertTrue(nonmacro_test)
def test_find_ancestors_bad_path(self):
self.config.parser_root.primary.metadata["augeaspath"] = ""
ancs = self.config.parser_root.primary.find_ancestors("Anything")
self.assertEqual(len(ancs), 0)

View File

@@ -2,13 +2,12 @@
"""Test for certbot_apache._internal.configurator AutoHSTS functionality"""
import re
import unittest
import mock
# six is used in mock.patch()
import six # pylint: disable=unused-import
import six # pylint: disable=unused-import # six is used in mock.patch()
from certbot import errors
from certbot_apache._internal import constants
import util

View File

@@ -3,11 +3,9 @@ import unittest
from certbot.compat import os
from certbot.errors import MisconfigurationError
from certbot_apache._internal import obj
from certbot_apache._internal import override_centos
from certbot_apache._internal import parser
import util

View File

@@ -6,10 +6,8 @@ import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import obj
from certbot_apache._internal import override_centos
import util
@@ -108,7 +106,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
def test_get_parser(self):
self.assertIsInstance(self.config.parser, override_centos.CentOSParser)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
define_val = (
'Define: TEST1\n'
@@ -157,7 +155,7 @@ class MultipleVhostsTestCentOS(util.ApacheTest):
raise Exception("Missed: %s" % vhost) # pragma: no cover
self.assertEqual(found, 2)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_get_sysconfig_vars(self, mock_cfg):
"""Make sure we read the sysconfig OPTIONS variable correctly"""
# Return nothing for the process calls

View File

@@ -4,7 +4,6 @@ import unittest
from certbot import errors
from certbot.compat import os
import util

View File

@@ -5,7 +5,6 @@ import unittest
import mock
from certbot import errors
import util

View File

@@ -7,24 +7,20 @@ import tempfile
import unittest
import mock
# six is used in mock.patch()
import six # pylint: disable=unused-import
import six # pylint: disable=unused-import # six is used in mock.patch()
from acme import challenges
from certbot import achallenges
from certbot import crypto_util
from certbot import errors
from certbot.compat import os
from certbot.compat import filesystem
from certbot.compat import os
from certbot.tests import acme_util
from certbot.tests import util as certbot_util
from certbot_apache._internal import apache_util
from certbot_apache._internal import constants
from certbot_apache._internal import obj
from certbot_apache._internal import parser
import util
@@ -79,7 +75,8 @@ class MultipleVhostsTest(util.ApacheTest):
@mock.patch("certbot_apache._internal.parser.ApacheParser")
@mock.patch("certbot_apache._internal.configurator.util.exe_exists")
def _test_prepare_locked(self, unused_parser, unused_exe_exists):
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.get_parsernode_root")
def _test_prepare_locked(self, _node, _exists, _parser):
try:
self.config.prepare()
except errors.PluginError as err:
@@ -803,7 +800,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertEqual(mock_restart.call_count, 1)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_cleanup(self, mock_cfg, mock_restart):
mock_cfg.return_value = ""
_, achalls = self.get_key_and_achalls()
@@ -819,7 +816,7 @@ class MultipleVhostsTest(util.ApacheTest):
self.assertFalse(mock_restart.called)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.restart")
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_cleanup_no_errors(self, mock_cfg, mock_restart):
mock_cfg.return_value = ""
_, achalls = self.get_key_and_achalls()

View File

@@ -6,10 +6,8 @@ import mock
from certbot import errors
from certbot.compat import os
from certbot_apache._internal import apache_util
from certbot_apache._internal import obj
import util
@@ -48,7 +46,7 @@ class MultipleVhostsTestDebian(util.ApacheTest):
@mock.patch("certbot.util.run_script")
@mock.patch("certbot.util.exe_exists")
@mock.patch("certbot_apache._internal.parser.subprocess.Popen")
@mock.patch("certbot_apache._internal.apache_util.subprocess.Popen")
def test_enable_mod(self, mock_popen, mock_exe_exists, mock_run_script):
mock_popen().communicate.return_value = ("Define: DUMP_RUN_CFG", "")
mock_popen().returncode = 0

View File

@@ -4,15 +4,10 @@ import unittest
import mock
from certbot import errors
from certbot.display import util as display_util
from certbot.tests import util as certbot_util
from certbot_apache._internal import obj
from certbot_apache._internal.display_ops import select_vhost_multiple
import util

View File

@@ -0,0 +1,443 @@
"""Tests for DualParserNode implementation"""
import unittest
import mock
from certbot_apache._internal import assertions
from certbot_apache._internal import augeasparser
from certbot_apache._internal import dualparser
class DualParserNodeTest(unittest.TestCase): # pylint: disable=too-many-public-methods
"""DualParserNode tests"""
def setUp(self): # pylint: disable=arguments-differ
parser_mock = mock.MagicMock()
parser_mock.aug.match.return_value = []
parser_mock.get_arg.return_value = []
ast_mock = mock.MagicMock()
self.metadata = {"augeasparser": parser_mock, "augeaspath": "/invalid", "ac_ast": ast_mock}
self.block = dualparser.DualBlockNode(name="block",
ancestor=None,
filepath="/tmp/something",
metadata=self.metadata)
self.block_two = dualparser.DualBlockNode(name="block",
ancestor=self.block,
filepath="/tmp/something",
metadata=self.metadata)
self.directive = dualparser.DualDirectiveNode(name="directive",
ancestor=self.block,
filepath="/tmp/something",
metadata=self.metadata)
self.comment = dualparser.DualCommentNode(comment="comment",
ancestor=self.block,
filepath="/tmp/something",
metadata=self.metadata)
def test_create_with_precreated(self):
cnode = dualparser.DualCommentNode(comment="comment",
ancestor=self.block,
filepath="/tmp/something",
primary=self.comment.secondary,
secondary=self.comment.primary)
dnode = dualparser.DualDirectiveNode(name="directive",
ancestor=self.block,
filepath="/tmp/something",
primary=self.directive.secondary,
secondary=self.directive.primary)
bnode = dualparser.DualBlockNode(name="block",
ancestor=self.block,
filepath="/tmp/something",
primary=self.block.secondary,
secondary=self.block.primary)
# Switched around
self.assertTrue(cnode.primary is self.comment.secondary)
self.assertTrue(cnode.secondary is self.comment.primary)
self.assertTrue(dnode.primary is self.directive.secondary)
self.assertTrue(dnode.secondary is self.directive.primary)
self.assertTrue(bnode.primary is self.block.secondary)
self.assertTrue(bnode.secondary is self.block.primary)
def test_set_params(self):
params = ("first", "second")
self.directive.primary.set_parameters = mock.Mock()
self.directive.secondary.set_parameters = mock.Mock()
self.directive.set_parameters(params)
self.assertTrue(self.directive.primary.set_parameters.called)
self.assertTrue(self.directive.secondary.set_parameters.called)
def test_set_parameters(self):
pparams = mock.MagicMock()
sparams = mock.MagicMock()
pparams.parameters = ("a", "b")
sparams.parameters = ("a", "b")
self.directive.primary.set_parameters = pparams
self.directive.secondary.set_parameters = sparams
self.directive.set_parameters(("param", "seq"))
self.assertTrue(pparams.called)
self.assertTrue(sparams.called)
def test_delete_child(self):
pdel = mock.MagicMock()
sdel = mock.MagicMock()
self.block.primary.delete_child = pdel
self.block.secondary.delete_child = sdel
self.block.delete_child(self.comment)
self.assertTrue(pdel.called)
self.assertTrue(sdel.called)
def test_unsaved_files(self):
puns = mock.MagicMock()
suns = mock.MagicMock()
puns.return_value = assertions.PASS
suns.return_value = assertions.PASS
self.block.primary.unsaved_files = puns
self.block.secondary.unsaved_files = suns
self.block.unsaved_files()
self.assertTrue(puns.called)
self.assertTrue(suns.called)
def test_getattr_equality(self):
self.directive.primary.variableexception = "value"
self.directive.secondary.variableexception = "not_value"
with self.assertRaises(AssertionError):
_ = self.directive.variableexception
self.directive.primary.variable = "value"
self.directive.secondary.variable = "value"
try:
self.directive.variable
except AssertionError: # pragma: no cover
self.fail("getattr check raised an AssertionError where it shouldn't have")
def test_parsernode_dirty_assert(self):
# disable assertion pass
self.comment.primary.comment = "value"
self.comment.secondary.comment = "value"
self.comment.primary.filepath = "x"
self.comment.secondary.filepath = "x"
self.comment.primary.dirty = False
self.comment.secondary.dirty = True
with self.assertRaises(AssertionError):
assertions.assertEqual(self.comment.primary, self.comment.secondary)
def test_parsernode_filepath_assert(self):
# disable assertion pass
self.comment.primary.comment = "value"
self.comment.secondary.comment = "value"
self.comment.primary.filepath = "first"
self.comment.secondary.filepath = "second"
with self.assertRaises(AssertionError):
assertions.assertEqual(self.comment.primary, self.comment.secondary)
def test_add_child_block(self):
mock_first = mock.MagicMock(return_value=self.block.primary)
mock_second = mock.MagicMock(return_value=self.block.secondary)
self.block.primary.add_child_block = mock_first
self.block.secondary.add_child_block = mock_second
self.block.add_child_block("Block")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
def test_add_child_directive(self):
mock_first = mock.MagicMock(return_value=self.directive.primary)
mock_second = mock.MagicMock(return_value=self.directive.secondary)
self.block.primary.add_child_directive = mock_first
self.block.secondary.add_child_directive = mock_second
self.block.add_child_directive("Directive")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
def test_add_child_comment(self):
mock_first = mock.MagicMock(return_value=self.comment.primary)
mock_second = mock.MagicMock(return_value=self.comment.secondary)
self.block.primary.add_child_comment = mock_first
self.block.secondary.add_child_comment = mock_second
self.block.add_child_comment("Comment")
self.assertTrue(mock_first.called)
self.assertTrue(mock_second.called)
def test_find_comments(self):
pri_comments = [augeasparser.AugeasCommentNode(comment="some comment",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
sec_comments = [augeasparser.AugeasCommentNode(comment=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_coms_primary = mock.MagicMock(return_value=pri_comments)
find_coms_secondary = mock.MagicMock(return_value=sec_comments)
self.block.primary.find_comments = find_coms_primary
self.block.secondary.find_comments = find_coms_secondary
dcoms = self.block.find_comments("comment")
p_dcoms = [d.primary for d in dcoms]
s_dcoms = [d.secondary for d in dcoms]
p_coms = self.block.primary.find_comments("comment")
s_coms = self.block.secondary.find_comments("comment")
# Check that every comment response is represented in the list of
# DualParserNode instances.
for p in p_dcoms:
self.assertTrue(p in p_coms)
for s in s_dcoms:
self.assertTrue(s in s_coms)
def test_find_blocks_first_passing(self):
youshallnotpass = [augeasparser.AugeasBlockNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
youshallpass = [augeasparser.AugeasBlockNode(name=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_blocks_primary = mock.MagicMock(return_value=youshallpass)
find_blocks_secondary = mock.MagicMock(return_value=youshallnotpass)
self.block.primary.find_blocks = find_blocks_primary
self.block.secondary.find_blocks = find_blocks_secondary
blocks = self.block.find_blocks("something")
for block in blocks:
try:
assertions.assertEqual(block.primary, block.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassDirective(block.primary))
self.assertFalse(assertions.isPassDirective(block.secondary))
def test_find_blocks_second_passing(self):
youshallnotpass = [augeasparser.AugeasBlockNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
youshallpass = [augeasparser.AugeasBlockNode(name=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_blocks_primary = mock.MagicMock(return_value=youshallnotpass)
find_blocks_secondary = mock.MagicMock(return_value=youshallpass)
self.block.primary.find_blocks = find_blocks_primary
self.block.secondary.find_blocks = find_blocks_secondary
blocks = self.block.find_blocks("something")
for block in blocks:
try:
assertions.assertEqual(block.primary, block.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassDirective(block.primary))
self.assertTrue(assertions.isPassDirective(block.secondary))
def test_find_dirs_first_passing(self):
notpassing = [augeasparser.AugeasDirectiveNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
passing = [augeasparser.AugeasDirectiveNode(name=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_dirs_primary = mock.MagicMock(return_value=passing)
find_dirs_secondary = mock.MagicMock(return_value=notpassing)
self.block.primary.find_directives = find_dirs_primary
self.block.secondary.find_directives = find_dirs_secondary
directives = self.block.find_directives("something")
for directive in directives:
try:
assertions.assertEqual(directive.primary, directive.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassDirective(directive.primary))
self.assertFalse(assertions.isPassDirective(directive.secondary))
def test_find_dirs_second_passing(self):
notpassing = [augeasparser.AugeasDirectiveNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
passing = [augeasparser.AugeasDirectiveNode(name=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_dirs_primary = mock.MagicMock(return_value=notpassing)
find_dirs_secondary = mock.MagicMock(return_value=passing)
self.block.primary.find_directives = find_dirs_primary
self.block.secondary.find_directives = find_dirs_secondary
directives = self.block.find_directives("something")
for directive in directives:
try:
assertions.assertEqual(directive.primary, directive.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassDirective(directive.primary))
self.assertTrue(assertions.isPassDirective(directive.secondary))
def test_find_coms_first_passing(self):
notpassing = [augeasparser.AugeasCommentNode(comment="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
passing = [augeasparser.AugeasCommentNode(comment=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_coms_primary = mock.MagicMock(return_value=passing)
find_coms_secondary = mock.MagicMock(return_value=notpassing)
self.block.primary.find_comments = find_coms_primary
self.block.secondary.find_comments = find_coms_secondary
comments = self.block.find_comments("something")
for comment in comments:
try:
assertions.assertEqual(comment.primary, comment.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertTrue(assertions.isPassComment(comment.primary))
self.assertFalse(assertions.isPassComment(comment.secondary))
def test_find_coms_second_passing(self):
notpassing = [augeasparser.AugeasCommentNode(comment="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
passing = [augeasparser.AugeasCommentNode(comment=assertions.PASS,
ancestor=self.block,
filepath=assertions.PASS,
metadata=self.metadata)]
find_coms_primary = mock.MagicMock(return_value=notpassing)
find_coms_secondary = mock.MagicMock(return_value=passing)
self.block.primary.find_comments = find_coms_primary
self.block.secondary.find_comments = find_coms_secondary
comments = self.block.find_comments("something")
for comment in comments:
try:
assertions.assertEqual(comment.primary, comment.secondary)
except AssertionError: # pragma: no cover
self.fail("Assertion should have passed")
self.assertFalse(assertions.isPassComment(comment.primary))
self.assertTrue(assertions.isPassComment(comment.secondary))
def test_find_blocks_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasBlockNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
notpassing2 = [augeasparser.AugeasBlockNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
find_blocks_primary = mock.MagicMock(return_value=notpassing1)
find_blocks_secondary = mock.MagicMock(return_value=notpassing2)
self.block.primary.find_blocks = find_blocks_primary
self.block.secondary.find_blocks = find_blocks_secondary
blocks = self.block.find_blocks("anything")
for block in blocks:
self.assertEqual(block.primary, block.secondary)
self.assertTrue(block.primary is not block.secondary)
def test_find_dirs_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasDirectiveNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
notpassing2 = [augeasparser.AugeasDirectiveNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
find_dirs_primary = mock.MagicMock(return_value=notpassing1)
find_dirs_secondary = mock.MagicMock(return_value=notpassing2)
self.block.primary.find_directives = find_dirs_primary
self.block.secondary.find_directives = find_dirs_secondary
directives = self.block.find_directives("anything")
for directive in directives:
self.assertEqual(directive.primary, directive.secondary)
self.assertTrue(directive.primary is not directive.secondary)
def test_find_comments_no_pass_equal(self):
notpassing1 = [augeasparser.AugeasCommentNode(comment="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
notpassing2 = [augeasparser.AugeasCommentNode(comment="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
find_coms_primary = mock.MagicMock(return_value=notpassing1)
find_coms_secondary = mock.MagicMock(return_value=notpassing2)
self.block.primary.find_comments = find_coms_primary
self.block.secondary.find_comments = find_coms_secondary
comments = self.block.find_comments("anything")
for comment in comments:
self.assertEqual(comment.primary, comment.secondary)
self.assertTrue(comment.primary is not comment.secondary)
def test_find_blocks_no_pass_notequal(self):
notpassing1 = [augeasparser.AugeasBlockNode(name="notpassing",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
notpassing2 = [augeasparser.AugeasBlockNode(name="different",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)]
find_blocks_primary = mock.MagicMock(return_value=notpassing1)
find_blocks_secondary = mock.MagicMock(return_value=notpassing2)
self.block.primary.find_blocks = find_blocks_primary
self.block.secondary.find_blocks = find_blocks_secondary
with self.assertRaises(AssertionError):
_ = self.block.find_blocks("anything")
def test_parsernode_notequal(self):
ne_block = augeasparser.AugeasBlockNode(name="different",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)
ne_directive = augeasparser.AugeasDirectiveNode(name="different",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)
ne_comment = augeasparser.AugeasCommentNode(comment="different",
ancestor=self.block,
filepath="/path/to/whatever",
metadata=self.metadata)
self.assertFalse(self.block == ne_block)
self.assertFalse(self.directive == ne_directive)
self.assertFalse(self.comment == ne_comment)
def test_parsed_paths(self):
mock_p = mock.MagicMock(return_value=['/path/file.conf',
'/another/path',
'/path/other.conf'])
mock_s = mock.MagicMock(return_value=['/path/*.conf', '/another/path'])
self.block.primary.parsed_paths = mock_p
self.block.secondary.parsed_paths = mock_s
self.block.parsed_paths()
self.assertTrue(mock_p.called)
self.assertTrue(mock_s.called)
def test_parsed_paths_error(self):
mock_p = mock.MagicMock(return_value=['/path/file.conf'])
mock_s = mock.MagicMock(return_value=['/path/*.conf', '/another/path'])
self.block.primary.parsed_paths = mock_p
self.block.secondary.parsed_paths = mock_s
with self.assertRaises(AssertionError):
self.block.parsed_paths()
def test_find_ancestors(self):
primarymock = mock.MagicMock(return_value=[])
secondarymock = mock.MagicMock(return_value=[])
self.block.primary.find_ancestors = primarymock
self.block.secondary.find_ancestors = secondarymock
self.block.find_ancestors("anything")
self.assertTrue(primarymock.called)
self.assertTrue(secondarymock.called)

View File

@@ -6,10 +6,8 @@ import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import obj
from certbot_apache._internal import override_fedora
import util
@@ -102,7 +100,7 @@ class MultipleVhostsTestFedora(util.ApacheTest):
def test_get_parser(self):
self.assertIsInstance(self.config.parser, override_fedora.FedoraParser)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
define_val = (
'Define: TEST1\n'
@@ -157,7 +155,7 @@ class MultipleVhostsTestFedora(util.ApacheTest):
raise Exception("Missed: %s" % vhost) # pragma: no cover
self.assertEqual(found, 2)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_get_sysconfig_vars(self, mock_cfg):
"""Make sure we read the sysconfig OPTIONS variable correctly"""
# Return nothing for the process calls

View File

@@ -6,10 +6,8 @@ import mock
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot_apache._internal import obj
from certbot_apache._internal import override_gentoo
import util
@@ -92,7 +90,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
for define in defines:
self.assertTrue(define in self.config.parser.variables.keys())
@mock.patch("certbot_apache._internal.parser.ApacheParser.parse_from_subprocess")
@mock.patch("certbot_apache._internal.apache_util.parse_from_subprocess")
def test_no_binary_configdump(self, mock_subprocess):
"""Make sure we don't call binary dumps other than modules from Apache
as this is not supported in Gentoo currently"""
@@ -106,7 +104,7 @@ class MultipleVhostsTestGentoo(util.ApacheTest):
self.config.parser.reset_modules()
self.assertTrue(mock_subprocess.called)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_opportunistic_httpd_runtime_parsing(self, mock_get):
mod_val = (
'Loaded Modules:\n'

View File

@@ -1,21 +1,18 @@
"""Test for certbot_apache._internal.http_01."""
import unittest
import mock
from acme import challenges
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from certbot import achallenges
from certbot import errors
from certbot.compat import filesystem
from certbot.compat import os
from certbot.tests import acme_util
from certbot_apache._internal.parser import get_aug_path
import util
NUM_ACHALLS = 3

View File

@@ -6,7 +6,6 @@ import mock
from certbot import errors
from certbot.compat import os
import util
@@ -166,7 +165,7 @@ class BasicParserTest(util.ParserTest):
self.assertTrue(mock_logger.debug.called)
@mock.patch("certbot_apache._internal.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_update_runtime_variables(self, mock_cfg, _):
define_val = (
'ServerRoot: "/etc/apache2"\n'
@@ -272,7 +271,7 @@ class BasicParserTest(util.ParserTest):
self.assertEqual(mock_parse.call_count, 25)
@mock.patch("certbot_apache._internal.parser.ApacheParser.find_dir")
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_update_runtime_variables_alt_values(self, mock_cfg, _):
inc_val = (
'Included configuration files:\n'
@@ -294,7 +293,7 @@ class BasicParserTest(util.ParserTest):
# path derived from root configuration Include statements
self.assertEqual(mock_parse.call_count, 1)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_update_runtime_vars_bad_output(self, mock_cfg):
mock_cfg.return_value = "Define: TLS=443=24"
self.parser.update_runtime_variables()
@@ -304,7 +303,7 @@ class BasicParserTest(util.ParserTest):
errors.PluginError, self.parser.update_runtime_variables)
@mock.patch("certbot_apache._internal.configurator.ApacheConfigurator.option")
@mock.patch("certbot_apache._internal.parser.subprocess.Popen")
@mock.patch("certbot_apache._internal.apache_util.subprocess.Popen")
def test_update_runtime_vars_bad_ctl(self, mock_popen, mock_opt):
mock_popen.side_effect = OSError
mock_opt.return_value = "nonexistent"
@@ -312,7 +311,7 @@ class BasicParserTest(util.ParserTest):
errors.MisconfigurationError,
self.parser.update_runtime_variables)
@mock.patch("certbot_apache._internal.parser.subprocess.Popen")
@mock.patch("certbot_apache._internal.apache_util.subprocess.Popen")
def test_update_runtime_vars_bad_exit(self, mock_popen):
mock_popen().communicate.return_value = ("", "")
mock_popen.returncode = -1
@@ -356,7 +355,7 @@ class ParserInitTest(util.ApacheTest):
ApacheParser, os.path.relpath(self.config_path),
"/dummy/vhostpath", version=(2, 4, 22), configurator=self.config)
@mock.patch("certbot_apache._internal.parser.ApacheParser._get_runtime_cfg")
@mock.patch("certbot_apache._internal.apache_util._get_runtime_cfg")
def test_unparseable(self, mock_cfg):
from certbot_apache._internal.parser import ApacheParser
mock_cfg.return_value = ('Define: TEST')

View File

@@ -0,0 +1,44 @@
"""Tests for ApacheConfigurator for AugeasParserNode classes"""
import unittest
import mock
import util
try:
import apacheconfig
HAS_APACHECONFIG = True
except ImportError: # pragma: no cover
HAS_APACHECONFIG = False
@unittest.skipIf(not HAS_APACHECONFIG, reason='Tests require apacheconfig dependency')
class ConfiguratorParserNodeTest(util.ApacheTest): # pylint: disable=too-many-public-methods
"""Test AugeasParserNode using available test configurations"""
def setUp(self): # pylint: disable=arguments-differ
super(ConfiguratorParserNodeTest, self).setUp()
self.config = util.get_apache_configurator(
self.config_path, self.vhost_path, self.config_dir,
self.work_dir, use_parsernode=True)
self.vh_truth = util.get_vh_truth(
self.temp_dir, "debian_apache_2_4/multiple_vhosts")
def test_parsernode_get_vhosts(self):
self.config.USE_PARSERNODE = True
vhosts = self.config.get_virtual_hosts()
# Legacy get_virtual_hosts() does not set the node
self.assertTrue(vhosts[0].node is not None)
def test_parsernode_get_vhosts_mismatch(self):
vhosts = self.config.get_virtual_hosts_v2()
# One of the returned VirtualHost objects differs
vhosts[0].name = "IdidntExpectThat"
self.config.get_virtual_hosts_v2 = mock.MagicMock(return_value=vhosts)
with self.assertRaises(AssertionError):
_ = self.config.get_virtual_hosts()
if __name__ == "__main__":
unittest.main() # pragma: no cover
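
The mismatch test above implies that get_virtual_hosts() cross-checks the legacy result against the ParserNode-based get_virtual_hosts_v2() result and raises an AssertionError when they diverge. A rough sketch of that kind of cross-check; every helper and attribute name other than get_virtual_hosts_v2 is an assumption for illustration, not the actual configurator code:

def crosscheck_virtual_hosts(legacy_vhosts, parsernode_vhosts):
    """Hypothetical sketch: pair each legacy vhost with its ParserNode twin."""
    by_name = {vhost.name: vhost for vhost in parsernode_vhosts}
    for vhost in legacy_vhosts:
        # Renaming a v2 vhost, as the mismatch test does, trips this assertion.
        assert vhost.name in by_name, "ParserNode vhost mismatch: %s" % vhost.name
        vhost.node = by_name[vhost.name].node  # the node the first test checks for
    return legacy_vhosts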


@@ -0,0 +1,128 @@
""" Tests for ParserNode interface """
import unittest
from certbot_apache._internal import interfaces
from certbot_apache._internal import parsernode_util as util
class DummyParserNode(interfaces.ParserNode):
""" A dummy class implementing ParserNode interface """
def __init__(self, **kwargs):
"""
Initializes the ParserNode instance.
"""
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(kwargs)
self.ancestor = ancestor
self.dirty = dirty
self.filepath = filepath
self.metadata = metadata
super(DummyParserNode, self).__init__(**kwargs)
def save(self, msg): # pragma: no cover
"""Save"""
pass
def find_ancestors(self, name): # pragma: no cover
""" Find ancestors """
return []
class DummyCommentNode(DummyParserNode):
""" A dummy class implementing CommentNode interface """
def __init__(self, **kwargs):
"""
Initializes the CommentNode instance and sets its instance variables.
"""
comment, kwargs = util.commentnode_kwargs(kwargs)
self.comment = comment
super(DummyCommentNode, self).__init__(**kwargs)
class DummyDirectiveNode(DummyParserNode):
""" A dummy class implementing DirectiveNode interface """
# pylint: disable=too-many-arguments
def __init__(self, **kwargs):
"""
Initializes the DirectiveNode instance and sets its instance variables.
"""
name, parameters, enabled, kwargs = util.directivenode_kwargs(kwargs)
self.name = name
self.parameters = parameters
self.enabled = enabled
super(DummyDirectiveNode, self).__init__(**kwargs)
def set_parameters(self, parameters): # pragma: no cover
"""Set parameters"""
pass
class DummyBlockNode(DummyDirectiveNode):
""" A dummy class implementing BlockNode interface """
def add_child_block(self, name, parameters=None, position=None): # pragma: no cover
"""Add child block"""
pass
def add_child_directive(self, name, parameters=None, position=None): # pragma: no cover
"""Add child directive"""
pass
def add_child_comment(self, comment="", position=None): # pragma: no cover
"""Add child comment"""
pass
def find_blocks(self, name, exclude=True): # pragma: no cover
"""Find blocks"""
pass
def find_directives(self, name, exclude=True): # pragma: no cover
"""Find directives"""
pass
def find_comments(self, comment, exact=False): # pragma: no cover
"""Find comments"""
pass
def delete_child(self, child): # pragma: no cover
"""Delete child"""
pass
def unsaved_files(self): # pragma: no cover
"""Unsaved files"""
pass
interfaces.CommentNode.register(DummyCommentNode)
interfaces.DirectiveNode.register(DummyDirectiveNode)
interfaces.BlockNode.register(DummyBlockNode)
class ParserNodeTest(unittest.TestCase):
"""Dummy placeholder test case for ParserNode interfaces"""
def test_dummy(self):
dummyblock = DummyBlockNode(
name="None",
parameters=(),
ancestor=None,
dirty=False,
filepath="/some/random/path"
)
dummydirective = DummyDirectiveNode(
name="Name",
ancestor=None,
filepath="/another/path"
)
dummycomment = DummyCommentNode(
comment="Comment",
ancestor=dummyblock,
filepath="/some/file"
)
if __name__ == "__main__":
unittest.main() # pragma: no cover
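
The register() calls above rely on abc virtual subclass registration: the dummy classes pass isinstance()/issubclass() checks against the ParserNode interfaces without inheriting from them. A minimal standalone illustration of that Python mechanism (the class names here are made up for the example):

import abc

class Node(abc.ABC):
    @abc.abstractmethod
    def save(self, msg):
        """Persist pending changes."""

class PlainNode(object):  # note: does not inherit from Node
    def save(self, msg):
        print("saved: " + msg)

Node.register(PlainNode)  # virtual subclass registration, as used above
assert issubclass(PlainNode, Node)
assert isinstance(PlainNode(), Node)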


@@ -0,0 +1,115 @@
""" Tests for ParserNode utils """
import unittest
from certbot_apache._internal import parsernode_util as util
class ParserNodeUtilTest(unittest.TestCase):
"""Tests for ParserNode utils"""
def _setup_parsernode(self):
""" Sets up kwargs dict for ParserNode """
return {
"ancestor": None,
"dirty": False,
"filepath": "/tmp",
}
def _setup_commentnode(self):
""" Sets up kwargs dict for CommentNode """
pn = self._setup_parsernode()
pn["comment"] = "x"
return pn
def _setup_directivenode(self):
""" Sets up kwargs dict for DirectiveNode """
pn = self._setup_parsernode()
pn["name"] = "Name"
pn["parameters"] = ("first",)
pn["enabled"] = True
return pn
def test_unknown_parameter(self):
params = self._setup_parsernode()
params["unknown"] = "unknown"
self.assertRaises(TypeError, util.parsernode_kwargs, params)
params = self._setup_commentnode()
params["unknown"] = "unknown"
self.assertRaises(TypeError, util.commentnode_kwargs, params)
params = self._setup_directivenode()
params["unknown"] = "unknown"
self.assertRaises(TypeError, util.directivenode_kwargs, params)
def test_parsernode(self):
params = self._setup_parsernode()
ctrl = self._setup_parsernode()
ancestor, dirty, filepath, metadata = util.parsernode_kwargs(params)
self.assertEqual(ancestor, ctrl["ancestor"])
self.assertEqual(dirty, ctrl["dirty"])
self.assertEqual(filepath, ctrl["filepath"])
self.assertEqual(metadata, {})
def test_parsernode_from_metadata(self):
params = self._setup_parsernode()
params.pop("filepath")
md = {"some": "value"}
params["metadata"] = md
# Just testing that no error is raised despite the missing required parameter
_, _, _, metadata = util.parsernode_kwargs(params)
self.assertEqual(metadata, md)
def test_commentnode(self):
params = self._setup_commentnode()
ctrl = self._setup_commentnode()
comment, _ = util.commentnode_kwargs(params)
self.assertEqual(comment, ctrl["comment"])
def test_commentnode_from_metadata(self):
params = self._setup_commentnode()
params.pop("comment")
params["metadata"] = {}
# Just testing that no error is raised despite the missing required parameter
util.commentnode_kwargs(params)
def test_directivenode(self):
params = self._setup_directivenode()
ctrl = self._setup_directivenode()
name, parameters, enabled, _ = util.directivenode_kwargs(params)
self.assertEqual(name, ctrl["name"])
self.assertEqual(parameters, ctrl["parameters"])
self.assertEqual(enabled, ctrl["enabled"])
def test_directivenode_from_metadata(self):
params = self._setup_directivenode()
params.pop("filepath")
params.pop("name")
params["metadata"] = {"irrelevant": "value"}
# Just testing that no error is raised despite the missing required parameters
util.directivenode_kwargs(params)
def test_missing_required(self):
c_params = self._setup_commentnode()
c_params.pop("comment")
self.assertRaises(TypeError, util.commentnode_kwargs, c_params)
d_params = self._setup_directivenode()
d_params.pop("ancestor")
self.assertRaises(TypeError, util.directivenode_kwargs, d_params)
p_params = self._setup_parsernode()
p_params.pop("filepath")
self.assertRaises(TypeError, util.parsernode_kwargs, p_params)
if __name__ == "__main__":
unittest.main() # pragma: no cover
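
Taken together, the assertions above pin down the helper contract: unknown keys raise TypeError, missing required keys raise TypeError, metadata defaults to an empty dict, and passing metadata relaxes the otherwise-required filepath. A hypothetical validator that would satisfy these tests could look roughly like the sketch below; it is written for illustration and is not the certbot_apache parsernode_util implementation.

def parsernode_kwargs_sketch(kwargs):
    """Illustrative kwargs validator consistent with the tests above."""
    metadata = kwargs.pop("metadata", {})
    if metadata:
        # With metadata supplied, a missing "filepath" is tolerated.
        kwargs.setdefault("filepath", None)
    try:
        ancestor = kwargs.pop("ancestor")
        dirty = kwargs.pop("dirty")
        filepath = kwargs.pop("filepath")
    except KeyError as err:
        raise TypeError("Required parameter is missing: {0}".format(err))
    if kwargs:
        raise TypeError("Unknown parameter(s): {0}".format(", ".join(kwargs)))
    return ancestor, dirty, filepath, metadata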


@@ -12,7 +12,6 @@ from certbot.compat import os
from certbot.display import util as display_util
from certbot.plugins import common
from certbot.tests import util as test_util
from certbot_apache._internal import configurator
from certbot_apache._internal import entrypoint
from certbot_apache._internal import obj
@@ -85,7 +84,8 @@ def get_apache_configurator(
config_path, vhost_path,
config_dir, work_dir, version=(2, 4, 7),
os_info="generic",
conf_vhost_path=None):
conf_vhost_path=None,
use_parsernode=False):
"""Create an Apache Configurator with the specified options.
:param conf: Function that returns binary paths. self.conf in Configurator
@@ -111,19 +111,21 @@ def get_apache_configurator(
mock_exe_exists.return_value = True
with mock.patch("certbot_apache._internal.parser.ApacheParser."
"update_runtime_variables"):
try:
config_class = entrypoint.OVERRIDE_CLASSES[os_info]
except KeyError:
config_class = configurator.ApacheConfigurator
config = config_class(config=mock_le_config, name="apache",
version=version)
if not conf_vhost_path:
config_class.OS_DEFAULTS["vhost_root"] = vhost_path
else:
# Custom virtualhost path was requested
config.config.apache_vhost_root = conf_vhost_path
config.config.apache_ctl = config_class.OS_DEFAULTS["ctl"]
config.prepare()
with mock.patch("certbot_apache._internal.apache_util.parse_from_subprocess") as mock_sp:
mock_sp.return_value = []
try:
config_class = entrypoint.OVERRIDE_CLASSES[os_info]
except KeyError:
config_class = configurator.ApacheConfigurator
config = config_class(config=mock_le_config, name="apache",
version=version, use_parsernode=use_parsernode)
if not conf_vhost_path:
config_class.OS_DEFAULTS["vhost_root"] = vhost_path
else:
# Custom virtualhost path was requested
config.config.apache_vhost_root = conf_vhost_path
config.config.apache_ctl = config_class.OS_DEFAULTS["ctl"]
config.prepare()
return config
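
The added wrapper above stubs out certbot_apache._internal.apache_util.parse_from_subprocess for the duration of configurator setup, so creating the test configurator does not invoke the underlying subprocess call. As a reminder of the mock.patch context-manager pattern it uses (the patched function below is a stand-in defined only for this example):

import mock  # unittest.mock behaves the same on Python 3

def fetch_runtime_defines():
    raise RuntimeError("would call a subprocess; must not run in tests")

with mock.patch(__name__ + ".fetch_runtime_defines") as mock_fetch:
    mock_fetch.return_value = []            # stub the subprocess-backed call
    assert fetch_runtime_defines() == []    # the mock answers instead
    assert mock_fetch.called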


@@ -1,6 +1,6 @@
#!/usr/bin/env python
import sys
import os
import sys
hook_script_type = os.path.basename(os.path.dirname(sys.argv[1]))
if hook_script_type == 'deploy' and ('RENEWED_DOMAINS' not in os.environ or 'RENEWED_LINEAGE' not in os.environ):


@@ -1,5 +1,6 @@
"""This module contains advanced assertions for the certbot integration tests."""
import os
try:
import grp
POSIX_MODE = True


@@ -2,21 +2,25 @@
from __future__ import print_function
import os
from os.path import exists
from os.path import join
import re
import shutil
import subprocess
import time
from os.path import join, exists
import pytest
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.certbot_tests.assertions import (
assert_hook_execution, assert_saved_renew_hook,
assert_cert_count_for_lineage,
assert_world_no_permissions, assert_world_read_permissions,
assert_equals_group_owner, assert_equals_group_permissions, assert_equals_world_read_permissions,
EVERYBODY_SID
)
from certbot_integration_tests.certbot_tests.assertions import assert_cert_count_for_lineage
from certbot_integration_tests.certbot_tests.assertions import assert_equals_group_owner
from certbot_integration_tests.certbot_tests.assertions import assert_equals_group_permissions
from certbot_integration_tests.certbot_tests.assertions import assert_equals_world_read_permissions
from certbot_integration_tests.certbot_tests.assertions import assert_hook_execution
from certbot_integration_tests.certbot_tests.assertions import assert_saved_renew_hook
from certbot_integration_tests.certbot_tests.assertions import assert_world_no_permissions
from certbot_integration_tests.certbot_tests.assertions import assert_world_read_permissions
from certbot_integration_tests.certbot_tests.assertions import EVERYBODY_SID
from certbot_integration_tests.utils import misc


@@ -6,9 +6,10 @@ for a directory a specific configuration using built-in pytest hooks.
See https://docs.pytest.org/en/latest/reference.html#hook-reference
"""
from __future__ import print_function
import contextlib
import sys
import subprocess
import sys
from certbot_integration_tests.utils import acme_server as acme_lib


@@ -2,8 +2,9 @@ import os
import subprocess
from certbot_integration_tests.certbot_tests import context as certbot_context
from certbot_integration_tests.utils import misc, certbot_call
from certbot_integration_tests.nginx_tests import nginx_config as config
from certbot_integration_tests.utils import certbot_call
from certbot_integration_tests.utils import misc
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):


@@ -1,19 +1,22 @@
#!/usr/bin/env python
"""Module to setup an ACME CA server environment able to run multiple tests in parallel"""
from __future__ import print_function
import errno
import json
import os
from os.path import join
import shutil
import subprocess
import sys
import tempfile
import time
import os
import subprocess
import shutil
import sys
from os.path import join
import requests
from certbot_integration_tests.utils import misc, proxy, pebble_artifacts
from certbot_integration_tests.utils import misc
from certbot_integration_tests.utils import pebble_artifacts
from certbot_integration_tests.utils import proxy
from certbot_integration_tests.utils.constants import *


@@ -1,10 +1,11 @@
#!/usr/bin/env python
"""Module to call certbot in test mode"""
from __future__ import absolute_import
from distutils.version import LooseVersion
import os
import subprocess
import sys
import os
import certbot_integration_tests
from certbot_integration_tests.utils.constants import *


@@ -3,27 +3,27 @@ Misc module contains stateless functions that could be used during pytest execut
or outside during setup/teardown of the integration tests environment.
"""
import contextlib
import logging
import errno
import multiprocessing
import os
import re
import shutil
import stat
import subprocess
import sys
import tempfile
import time
import warnings
from distutils.version import LooseVersion
import pkg_resources
import requests
from OpenSSL import crypto
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import Encoding, PrivateFormat, NoEncryption
from six.moves import socketserver, SimpleHTTPServer
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.hazmat.primitives.serialization import NoEncryption
from cryptography.hazmat.primitives.serialization import PrivateFormat
from OpenSSL import crypto
import pkg_resources
import requests
from six.moves import SimpleHTTPServer
from six.moves import socketserver
RSA_KEY_TYPE = 'rsa'
ECDSA_KEY_TYPE = 'ecdsa'
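
The reshuffled imports above pull in the cryptography primitives these helpers presumably use to generate test keys. For reference, a minimal standalone example of that API, independent of the certbot helper functions (which are not shown in this hunk):

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.hazmat.primitives.serialization import NoEncryption
from cryptography.hazmat.primitives.serialization import PrivateFormat

# Generate a P-256 key and serialize it as unencrypted PEM.
key = ec.generate_private_key(ec.SECP256R1(), default_backend())
pem = key.private_bytes(Encoding.PEM, PrivateFormat.TraditionalOpenSSL, NoEncryption())
print(pem.decode())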


@@ -6,17 +6,18 @@ to serve a mock OCSP responder during integration tests against Pebble.
import datetime
import re
import requests
from dateutil import parser
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization, hashes
from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives import serialization
from cryptography.x509 import ocsp
from dateutil import parser
import requests
from six.moves import BaseHTTPServer
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT
from certbot_integration_tests.utils.constants import PEBBLE_MANAGEMENT_URL
from certbot_integration_tests.utils.misc import GracefulTCPServer
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT, PEBBLE_MANAGEMENT_URL
class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):


@@ -1,7 +1,7 @@
#!/usr/bin/env python
import json
import sys
import re
import sys
import requests
from six.moves import BaseHTTPServer


@@ -1,14 +1,16 @@
from distutils.version import StrictVersion
import sys
from distutils.version import StrictVersion
from setuptools import setup, find_packages, __version__ as setuptools_version
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup
version = '0.32.0.dev0'
install_requires = [
'coverage',
'cryptography',
'docker-compose',
'pyopenssl',
'pytest',
'pytest-cov',


@@ -6,9 +6,9 @@ import subprocess
import mock
import zope.interface
from certbot._internal import configuration
from certbot import errors as le_errors
from certbot import util as certbot_util
from certbot._internal import configuration
from certbot_apache._internal import entrypoint
from certbot_compatibility_test import errors
from certbot_compatibility_test import interfaces


@@ -8,7 +8,6 @@ from certbot._internal import constants
from certbot_compatibility_test import errors
from certbot_compatibility_test import util
logger = logging.getLogger(__name__)
@@ -44,8 +43,7 @@ class Proxy(object):
method = getattr(self._configurator, name, None)
if callable(method):
return method
else:
raise AttributeError()
raise AttributeError()
def has_more_configs(self):
"""Returns true if there are more configs to test"""
@@ -83,8 +81,7 @@ class Proxy(object):
"""Returns the set of domain names that the plugin should find"""
if self._all_names:
return self._all_names
else:
raise errors.Error("No configuration file loaded")
raise errors.Error("No configuration file loaded")
def get_testable_domain_names(self):
"""Returns the set of domain names that can be tested against"""


@@ -6,14 +6,13 @@ import subprocess
import zope.interface
from acme.magic_typing import Set # pylint: disable=unused-import, no-name-in-module
from certbot._internal import configuration
from certbot_nginx._internal import configurator
from certbot_nginx._internal import constants
from certbot_compatibility_test import errors
from certbot_compatibility_test import interfaces
from certbot_compatibility_test import util
from certbot_compatibility_test.configurators import common as configurators_common
from certbot_nginx._internal import configurator
from certbot_nginx._internal import constants
@zope.interface.implementer(interfaces.IConfiguratorProxy)


@@ -5,31 +5,27 @@ import filecmp
import logging
import os
import shutil
import sys
import tempfile
import time
import sys
from urllib3.util import connection
import OpenSSL
from six.moves import xrange # pylint: disable=import-error,redefined-builtin
from urllib3.util import connection
from acme import challenges
from acme import crypto_util
from acme import messages
from acme.magic_typing import List, Tuple # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import List # pylint: disable=unused-import, no-name-in-module
from acme.magic_typing import Tuple # pylint: disable=unused-import, no-name-in-module
from certbot import achallenges
from certbot import errors as le_errors
from certbot.tests import acme_util
from certbot_compatibility_test import errors
from certbot_compatibility_test import util
from certbot_compatibility_test import validator
from certbot_compatibility_test.configurators.apache import common as a_common
from certbot_compatibility_test.configurators.nginx import common as n_common
DESCRIPTION = """
Tests Certbot plugins against different server configurations. It is
assumed that Docker is already installed. If no test type is specified, all
@@ -60,26 +56,27 @@ def test_authenticator(plugin, config, temp_dir):
return False
success = True
for i in xrange(len(responses)):
if not responses[i]:
for i, response in enumerate(responses):
achall = achalls[i]
if not response:
logger.error(
"Plugin failed to complete %s for %s in %s",
type(achalls[i]), achalls[i].domain, config)
type(achall), achall.domain, config)
success = False
elif isinstance(responses[i], challenges.HTTP01Response):
elif isinstance(response, challenges.HTTP01Response):
# We fake the DNS resolution to ensure that any domain is resolved
# to the local HTTP server set up for the compatibility tests
with _fake_dns_resolution("127.0.0.1"):
verified = responses[i].simple_verify(
achalls[i].chall, achalls[i].domain,
verified = response.simple_verify(
achall.chall, achall.domain,
util.JWK.public_key(), port=plugin.http_port)
if verified:
logger.info(
"http-01 verification for %s succeeded", achalls[i].domain)
"http-01 verification for %s succeeded", achall.domain)
else:
logger.error(
"**** http-01 verification for %s in %s failed",
achalls[i].domain, config)
achall.domain, config)
success = False
if success:
@@ -92,8 +89,7 @@ def test_authenticator(plugin, config, temp_dir):
if _dirs_are_unequal(config, backup):
logger.error("Challenge cleanup failed for %s", config)
return False
else:
logger.info("Challenge cleanup succeeded")
logger.info("Challenge cleanup succeeded")
return success
@@ -308,7 +304,7 @@ def get_args():
"-e", "--enhance", action="store_true", help="tests the enhancements "
"the plugin supports (implicitly includes installer tests)")
for plugin in PLUGINS.itervalues():
for plugin in PLUGINS.values():
plugin.add_parser_arguments(parser)
args = parser.parse_args()
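
Two of the changes in this file are plain Python 3 compatibility fixes: index loops over xrange(len(...)) become enumerate(), and dict.itervalues(), which no longer exists on Python 3, becomes dict.values(). A minimal illustration of both idioms:

responses = ["ok", None, "ok"]
for i, response in enumerate(responses):   # replaces: for i in xrange(len(responses))
    if response is None:
        print("challenge %d failed" % i)

plugins = {"apache": "plugin-a", "nginx": "plugin-b"}
for plugin in plugins.values():            # .itervalues() is Python-2-only
    print(plugin)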


@@ -8,12 +8,10 @@ import tarfile
import josepy as jose
from certbot.tests import util as test_util
from certbot._internal import constants
from certbot.tests import util as test_util
from certbot_compatibility_test import errors
_KEY_BASE = "rsa2048_key.pem"
KEY_PATH = test_util.vector_path(_KEY_BASE)
KEY = test_util.load_pyopenssl_private_key(_KEY_BASE)


@@ -1,15 +1,14 @@
"""Validators to determine the current webserver configuration"""
import logging
import socket
import requests
import requests
import six
from six.moves import xrange # pylint: disable=import-error,redefined-builtin
from six.moves import xrange # pylint: disable=import-error, redefined-builtin
from acme import crypto_util
from acme import errors as acme_errors
logger = logging.getLogger(__name__)


@@ -1,9 +1,9 @@
"""Tests for certbot_compatibility_test.validator."""
import unittest
import requests
import mock
import OpenSSL
import requests
from acme import errors as acme_errors
from certbot_compatibility_test import validator


@@ -5,6 +5,7 @@ import sys
from certbot_nginx._internal import nginxparser
def roundtrip(stuff):
success = True
for t in stuff:


@@ -1,10 +1,9 @@
import sys
from setuptools import setup
from setuptools import find_packages
from setuptools import setup
version = '1.0.0'
version = '1.1.0.dev0'
install_requires = [
'certbot',


@@ -17,6 +17,7 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
certbot[dev]==0.39.0
-e certbot[dev]


@@ -1,16 +1,16 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.0.0'
version = '1.1.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.29.0',
'certbot>=0.39.0',
'certbot>=1.0.0.dev0',
'cloudflare>=1.5.1',
'mock',
'setuptools',
@@ -46,7 +46,7 @@ setup(
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Development Status :: 3 - Alpha',
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',


@@ -1,8 +1,8 @@
"""DNS Authenticator for CloudXNS DNS."""
import logging
import zope.interface
from lexicon.providers import cloudxns
import zope.interface
from certbot import errors
from certbot import interfaces


@@ -17,6 +17,7 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.31.0
certbot[dev]==0.39.0
-e certbot[dev]


@@ -1,16 +1,16 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.0.0'
version = '1.1.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.31.0',
'certbot>=0.39.0',
'certbot>=1.0.0.dev0',
'dns-lexicon>=2.2.1', # Support for >1 TXT record per name
'mock',
'setuptools',
@@ -46,7 +46,7 @@ setup(
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Development Status :: 3 - Alpha',
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',


@@ -3,7 +3,8 @@
import unittest
import mock
from requests.exceptions import HTTPError, RequestException
from requests.exceptions import HTTPError
from requests.exceptions import RequestException
from certbot.compat import os
from certbot.plugins import dns_test_common


@@ -17,6 +17,7 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))


@@ -1,3 +1,3 @@
# Remember to update setup.py to match the package versions below.
acme[dev]==0.29.0
certbot[dev]==0.39.0
-e certbot[dev]


@@ -1,16 +1,16 @@
from setuptools import setup
from setuptools import find_packages
from setuptools.command.test import test as TestCommand
import sys
from setuptools import find_packages
from setuptools import setup
from setuptools.command.test import test as TestCommand
version = '1.0.0'
version = '1.1.0.dev0'
# Remember to update local-oldest-requirements.txt when changing the minimum
# acme/certbot version.
install_requires = [
'acme>=0.29.0',
'certbot>=0.39.0',
'certbot>=1.0.0.dev0',
'mock',
'python-digitalocean>=1.11',
'setuptools',
@@ -47,7 +47,7 @@ setup(
license='Apache License 2.0',
python_requires='>=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*',
classifiers=[
'Development Status :: 3 - Alpha',
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
'Intended Audience :: System Administrators',
'License :: OSI Approved :: Apache Software License',


@@ -1,8 +1,8 @@
"""DNS Authenticator for DNSimple DNS."""
import logging
import zope.interface
from lexicon.providers import dnsimple
import zope.interface
from certbot import errors
from certbot import interfaces


@@ -17,6 +17,7 @@
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import sys
# sys.path.insert(0, os.path.abspath('.'))

Some files were not shown because too many files have changed in this diff.