Compare commits
test-apach...v0.9.3 (3 commits)

| Author | SHA1 | Date |
|---|---|---|
| | ce4e00569e | |
| | 4fcc3c8d0c | |
| | 342e0d1184 | |
@@ -1,119 +0,0 @@

# Configuring Azure Pipelines with Certbot

Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:

* `.azure-pipelines/main.yml` is the main one, executed on PRs targeting master and on pushes to master,
* `.azure-pipelines/advanced.yml` adds installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and the nightly run for master.

Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common job configuration that can be reused in several pipelines.

Unlike Travis, where Codecov works without any action required, Codecov supports Azure Pipelines
(using the coverage-bash utility, not python-coverage for now) only if you provide the Codecov repository token
through the `CODECOV_TOKEN` environment variable. So `CODECOV_TOKEN` needs to be set as a secured
environment variable to allow the main pipeline to publish coverage reports to Codecov.
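These pipelines also make the coverage upload non-fatal: each upload command is chained with `|| echo ...` so that a Codecov outage cannot fail the build. A standalone sketch of that shell pattern, with `false` standing in for a failing uploader:

```shell
# `false` simulates a failing upload command; the `|| echo` fallback
# swallows the non-zero exit code so a CI step running this still succeeds.
false || echo "Codecov did not collect coverage reports"
# The exit status of the previous line is that of `echo`, i.e. 0.
echo "final status: $?"
```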

This INSTALL.md file explains how to configure Azure Pipelines with Certbot in order to execute the CI/CD logic defined in the `.azure-pipelines` folder.
During this installation, warnings describing user access and legal commitments will be displayed like this:

```
!!! ACCESS REQUIRED !!!
```

This document assumes that the Azure DevOps organization is named _certbot_, and the Azure DevOps project is also _certbot_.

## Useful links

* https://docs.microsoft.com/en-us/azure/devops/pipelines/yaml-schema?view=azure-devops&tabs=schema
* https://www.azuredevopslabs.com/labs/azuredevops/github-integration/
* https://docs.microsoft.com/en-us/azure/devops/pipelines/ecosystems/python?view=azure-devops

## Prerequisites

### Having a GitHub account

Use your GitHub user for a normal GitHub account, or a user that has administrative rights to the GitHub organization if relevant.

### Having an Azure DevOps account

- Go to https://dev.azure.com/, click "Start free with GitHub"
- Log in to GitHub

```
!!! ACCESS REQUIRED !!!
Personal user data (email + profile info, read-only)
```

- Microsoft will create a Live account using the email referenced by the GitHub account. This account is also linked to the GitHub account (meaning you can log in to it using GitHub authentication).
- Proceed with account registration (birth date, country), and add details about your name and contact email.

```
!!! ACCESS REQUIRED !!!
Microsoft proposes to send commercial links to this email
Azure DevOps terms of service need to be accepted
```

_Logged in to Azure DevOps, the account is ready._

### Installing Azure Pipelines to GitHub

- On GitHub, go to the Marketplace
- Select Azure Pipelines, and "Set up a plan"
- Select Free, then "Install it for free"
- Click "Complete order and begin installation"

```
!!! ACCESS !!!
Azure Pipelines needs RW on code, RO on metadata, RW on checks, commit statuses, deployments, issues, and pull requests.
RW access here is required to allow updating the pipeline YAML files from the Azure DevOps interface, and to
update the status of builds and PRs on the GitHub side when Azure Pipelines are triggered.
Note however that no admin access is granted here: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the security context around this on GitHub.
Access can be granted for all or only selected repositories, which is nice.
```

- Redirected to Azure DevOps, select the account created in the _Having an Azure DevOps account_ section.
- Select the organization, and click "Create a new project" (let's name it the same as the targeted GitHub repo)
- The visibility is public, to benefit from 10 parallel jobs

```
!!! ACCESS !!!
Azure Pipelines needs access to the GitHub account (in terms of being able to check it is valid), and to the resources shared between the GitHub account and Azure Pipelines.
```

_Done. We can move on to pipeline configuration._

## Import an existing pipeline from the `.azure-pipelines` folder

- On Azure DevOps, go to your organization (eg. _certbot_) then your project (eg. _certbot_)
- Click the "Pipelines" tab
- Click "New pipeline"
- Where is your code?: select "__Use the classic editor__"

__Warning: Do not choose the GitHub option in the "Where is your code?" section. Indeed, this option will trigger an OAuth
permission grant from Azure Pipelines to GitHub in order to set up a GitHub OAuth application. The permissions asked
for there are far too broad (admin level on almost everything), while the classic approach does not add any more
permissions, and works perfectly well.__

- Select GitHub in the "Select your repository" section, choose certbot/certbot as the Repository and master as the default branch.
- Click the YAML option for "Select a template"
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
  "YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click the "Save and run" button.

_Done. The pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._

## Add a secret variable to a pipeline (like `CODECOV_TOKEN`)

__NB: The following steps suppose that you have already set up the YAML pipeline file to
consume the secret variable that these steps will create as an environment variable.
For an environment variable named `CODECOV_TOKEN` consuming the secret variable `codecov_token`,
this setup would take the following form in the YAML file:__

```
steps:
- script: ./do_something_that_consumes_CODECOV_TOKEN # Eg. `codecov -F windows`
  env:
    CODECOV_TOKEN: $(codecov_token)
```

To set up a variable that is shared between pipelines, follow the instructions at
https://docs.microsoft.com/en-us/azure/devops/pipelines/library/variable-groups.
When adding variables to a group, don't forget to tick "Keep this value secret"
if it shouldn't be shared publicly.
@@ -1,20 +0,0 @@
# Advanced pipeline for isolated checks and release purpose
trigger:
  - test-*
  - '*.x'
pr:
  - test-*
# This pipeline is also nightly run on master
schedules:
  - cron: "0 4 * * *"
    displayName: Nightly build
    branches:
      include:
        - master
    always: true

jobs:
  # Any addition here should be reflected in the release pipeline.
  # It is advised to declare all jobs here as templates to improve maintainability.
  - template: templates/tests-suite.yml
  - template: templates/installer-tests.yml
@@ -1,12 +0,0 @@
trigger:
  # apache-parser-v2 is a temporary branch for doing work related to
  # rewriting the parser in the Apache plugin.
  - apache-parser-v2
  - master
pr:
  - apache-parser-v2
  - master
  - '*.x'

jobs:
  - template: templates/tests-suite.yml
@@ -1,13 +0,0 @@
# Release pipeline to build and deploy Certbot for Windows for GitHub release tags
trigger:
  tags:
    include:
      - v*
pr: none

jobs:
  # Any addition here should be reflected in the advanced pipeline.
  # It is advised to declare all jobs here as templates to improve maintainability.
  - template: templates/tests-suite.yml
  - template: templates/installer-tests.yml
  - template: templates/changelog.yml
@@ -1,14 +0,0 @@
jobs:
  - job: changelog
    pool:
      vmImage: vs2017-win2016
    steps:
      - bash: |
          CERTBOT_VERSION="$(cd certbot && python -c "import certbot; print(certbot.__version__)" && cd ~-)"
          "${BUILD_REPOSITORY_LOCALPATH}\tools\extract_changelog.py" "${CERTBOT_VERSION}" >> "${BUILD_ARTIFACTSTAGINGDIRECTORY}/release_notes.md"
        displayName: Prepare changelog
      - task: PublishPipelineArtifact@1
        inputs:
          path: $(Build.ArtifactStagingDirectory)
          artifact: changelog
        displayName: Publish changelog
@@ -1,54 +0,0 @@
jobs:
  - job: installer_build
    pool:
      vmImage: vs2017-win2016
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: 3.7
          architecture: x86
          addToPath: true
      - script: python windows-installer/construct.py
        displayName: Build Certbot installer
      - task: CopyFiles@2
        inputs:
          sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
          contents: '*.exe'
          targetFolder: $(Build.ArtifactStagingDirectory)
      - task: PublishPipelineArtifact@1
        inputs:
          path: $(Build.ArtifactStagingDirectory)
          artifact: windows-installer
        displayName: Publish Windows installer
  - job: installer_run
    dependsOn: installer_build
    strategy:
      matrix:
        win2019:
          imageName: windows-2019
        win2016:
          imageName: vs2017-win2016
        win2012r2:
          imageName: vs2015-win2012r2
    pool:
      vmImage: $(imageName)
    steps:
      - task: DownloadPipelineArtifact@2
        inputs:
          artifact: windows-installer
          path: $(Build.SourcesDirectory)/bin
        displayName: Retrieve Windows installer
      - script: $(Build.SourcesDirectory)\bin\certbot-beta-installer-win32.exe /S
        displayName: Install Certbot
      - powershell: Invoke-WebRequest https://www.python.org/ftp/python/3.8.0/python-3.8.0-amd64-webinstall.exe -OutFile C:\py3-setup.exe
        displayName: Get Python
      - script: C:\py3-setup.exe /quiet PrependPath=1 InstallAllUsers=1 Include_launcher=1 InstallLauncherAllUsers=1 Include_test=0 Include_doc=0 Include_dev=1 Include_debug=0 Include_tcltk=0 TargetDir=C:\py3
        displayName: Install Python
      - script: |
          py -3 -m venv venv
          venv\Scripts\python tools\pip_install.py -e certbot-ci
        displayName: Prepare Certbot-CI
      - script: |
          set PATH=%ProgramFiles(x86)%\Certbot\bin;%PATH%
          venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
        displayName: Run integration tests
@@ -1,38 +0,0 @@
jobs:
  - job: test
    pool:
      vmImage: vs2017-win2016
    strategy:
      matrix:
        py35:
          PYTHON_VERSION: 3.5
          TOXENV: py35
        py37-cover:
          PYTHON_VERSION: 3.7
          TOXENV: py37-cover
        integration-certbot:
          PYTHON_VERSION: 3.7
          TOXENV: integration-certbot
          PYTEST_ADDOPTS: --numprocesses 4
    variables:
      - group: certbot-common
    steps:
      - task: UsePythonVersion@0
        inputs:
          versionSpec: $(PYTHON_VERSION)
          addToPath: true
      - script: python tools/pip_install.py -U tox coverage
        displayName: Install dependencies
      - script: python -m tox
        displayName: Run tox
      # We do not require the codecov report upload to succeed. So to avoid breaking the pipeline if
      # something goes wrong, each command is suffixed with a command that hides any non-zero exit
      # code and echoes an informative message instead.
      - bash: |
          curl -s https://codecov.io/bash -o codecov-bash || echo "Failed to download codecov-bash"
          chmod +x codecov-bash || echo "Failed to apply execute permissions on codecov-bash"
          ./codecov-bash -F windows || echo "Codecov did not collect coverage reports"
        condition: in(variables['TOXENV'], 'py37-cover', 'integration-certbot')
        env:
          CODECOV_TOKEN: $(codecov_token)
        displayName: Publish coverage
18
.codecov.yml
@@ -1,18 +0,0 @@
coverage:
  status:
    project:
      default: off
      linux:
        flags: linux
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
      windows:
        flags: windows
        # Fixed target instead of auto set by #7173, can
        # be removed when flags in Codecov are added back.
        target: 97.4
        threshold: 0.1
        base: auto
@@ -1,5 +1,3 @@
[run]
omit = */setup.py

[report]
omit = */setup.py
# show lines missing coverage in output
show_missing = True
35
.github/stale.yml
vendored
@@ -1,35 +0,0 @@
# Configuration for https://github.com/marketplace/stale

# Number of days of inactivity before an Issue or Pull Request becomes stale
daysUntilStale: 365

# Number of days of inactivity before an Issue or Pull Request with the stale label is closed.
# Set to false to disable. If disabled, issues still need to be closed manually, but will remain marked as stale.
# When changing this value, be sure to also update markComment below.
daysUntilClose: 30

# Ignore issues with an assignee (defaults to false)
exemptAssignees: true

# Label to use when marking as stale
staleLabel: needs-update

# Comment to post when marking as stale. Set to `false` to disable
markComment: >
  We've made a lot of changes to Certbot since this issue was opened. If you
  still have this issue with an up-to-date version of Certbot, can you please
  add a comment letting us know? This helps us to better see what issues are
  still affecting our users. If there is no activity in the next 30 days, this
  issue will be automatically closed.

# Comment to post when closing a stale Issue or Pull Request.
closeComment: >
  This issue has been closed due to lack of activity, but if you think it
  should be reopened, please open a new issue with a link to this one and we'll
  take a look.

# Limit the number of actions per hour, from 1-30. Default is 30
limitPerRun: 1

# Don't mark pull requests as stale.
only: issues
18
.gitignore
vendored
@@ -6,8 +6,7 @@ dist*/
/venv*/
/kgs/
/.tox/
/releases*/
/log*
/releases/
letsencrypt.log
certbot.log
letsencrypt-auto-source/letsencrypt-auto.sig.lzma.base64
@@ -34,18 +33,3 @@ tags
tests/letstest/letest-*/
tests/letstest/*.pem
tests/letstest/venv/

.venv

# pytest cache
.cache
.mypy_cache/
.pytest_cache/

# docker files
.docker

# certbot tests
.certbot_test_workspace
**/assets/pebble*
**/assets/challtestsrv*
@@ -1,7 +0,0 @@
[settings]
skip_glob=venv*
skip=letsencrypt-auto-source
force_sort_within_sections=True
force_single_line=True
order_by_type=False
line_length=400
4
.pep8
Normal file
@@ -0,0 +1,4 @@
[pep8]
# E265 block comment should start with '# '
# E501 line too long (X > 79 characters)
ignore = E265,E501
56
.pylintrc
@@ -1,8 +1,5 @@
[MASTER]

# use as many jobs as there are cores
jobs=0

# Specify a configuration file.
#rcfile=
@@ -24,11 +21,6 @@ persistent=yes
# usually to register additional checkers.
load-plugins=linter_plugin

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-whitelist=pywintypes,win32api,win32file,win32security


[MESSAGES CONTROL]

@@ -46,14 +38,10 @@ extension-pkg-whitelist=pywintypes,win32api,win32file,win32security
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W"
# CERTBOT COMMENT
# 1) Once certbot codebase is claimed to be compatible exclusively with Python 3,
# the useless-object-inheritance check can be enabled again, and code fixed accordingly.
# 2) Check unsubscriptable-object tends to create a lot of false positives. Let's disable it.
# See https://github.com/PyCQA/pylint/issues/1498.
# 3) Same as point 2 for no-value-for-parameter.
# See https://github.com/PyCQA/pylint/issues/2820.
disable=fixme,locally-disabled,locally-enabled,bad-continuation,no-self-use,invalid-name,cyclic-import,duplicate-code,design,import-outside-toplevel,useless-object-inheritance,unsubscriptable-object,no-value-for-parameter,no-else-return,no-else-raise,no-else-break,no-else-continue
disable=fixme,locally-disabled,abstract-class-not-used,abstract-class-little-used,bad-continuation,too-few-public-methods,no-self-use,invalid-name,too-many-instance-attributes,cyclic-import
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1), same for abstract-class-little-used


[REPORTS]

@@ -260,7 +248,7 @@ ignored-modules=pkg_resources,confargparse,argparse,six.moves,six.moves.urllib

# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
ignored-classes=Field,Header,JWS,closing
ignored-classes=SQLObject

# When zope mode is activated, add a predefined set of Zope acquired attributes
# to generated-members.
@@ -306,6 +294,40 @@ valid-classmethod-first-arg=cls
valid-metaclass-classmethod-first-arg=mcs


[DESIGN]

# Maximum number of arguments for function / method
max-args=6

# Argument names that match this expression will be ignored. Default to name
# with leading underscore
ignored-argument-names=_.*

# Maximum number of locals for function / method body
max-locals=15

# Maximum number of return / yield for function / method body
max-returns=6

# Maximum number of branch for function / method body
max-branches=12

# Maximum number of statements in function / method body
max-statements=50

# Maximum number of parents for a class (see R0901).
max-parents=12

# Maximum number of attributes for a class (see R0902).
max-attributes=7

# Minimum number of public methods for a class (see R0903).
min-public-methods=2

# Maximum number of public methods for a class (see R0904).
max-public-methods=20


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
336
.travis.yml
@@ -1,278 +1,109 @@
language: python
dist: xenial

cache:
  directories:
    - $HOME/.cache/pip

before_script:
  - 'if [[ "$TRAVIS_OS_NAME" == "osx" ]]; then ulimit -n 1024 ; fi'
  # On Travis, the fastest parallelization for integration tests has proved to be 4.
  - 'if [[ "$TOXENV" == *"integration"* ]]; then export PYTEST_ADDOPTS="--numprocesses 4"; fi'
  # Use Travis retry feature for farm tests since they are flaky
  - 'if [[ "$TOXENV" == "travis-test-farm"* ]]; then export TRAVIS_RETRY=travis_retry; fi'
  - export TOX_TESTENV_PASSENV=TRAVIS
# This makes sure we get a host with docker-compose present.
dist: trusty

# Only build pushes to the master branch, PRs, and branches beginning with
# `test-` or of the form `digit(s).digit(s).x`. This reduces the number of
# simultaneous Travis runs, which speeds turnaround time on review since there
# is a cap on the number of simultaneous runs.
branches:
  only:
    # apache-parser-v2 is a temporary branch for doing work related to
    # rewriting the parser in the Apache plugin.
    - apache-parser-v2
    - master
    - /^\d+\.\d+\.x$/
    - /^test-.*$/
before_install:
  - 'dpkg -s libaugeas0'

# Jobs for the main test suite are always executed (including on PRs) except for pushes on master.
not-on-master: &not-on-master
  if: NOT (type = push AND branch = master)

# Jobs for the extended test suite are executed for cron jobs and pushes to
# non-development branches. See the explanation for apache-parser-v2 above.
extended-test-suite: &extended-test-suite
  if: type = cron OR (type = push AND branch NOT IN (apache-parser-v2, master))
# using separate envs with different TOXENVs creates 4x1 Travis build
# matrix, which allows us to clearly distinguish which component under
# test has failed
env:
  global:
    - BOULDERPATH=$PWD/boulder/

matrix:
  include:
    # Main test suite
    - python: "2.7"
      env: ACME_SERVER=pebble TOXENV=integration
      <<: *not-on-master

    # This job is always executed, including on master
      env: TOXENV=cover BOULDER_INTEGRATION=1
      sudo: true
      after_failure:
        - sudo cat /var/log/mysql/error.log
        - ps aux | grep mysql
    - python: "2.7"
      env: TOXENV=py27-cover FYI="py27 tests + code coverage"

    - python: "3.7"
      env: TOXENV=lint
      <<: *not-on-master
    - python: "3.5"
      env: TOXENV=mypy
      <<: *not-on-master
    - python: "2.7"
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      env: TOXENV='py27-{acme,apache,apache-v2,certbot,dns,nginx}-oldest'
      <<: *not-on-master
    - python: "3.4"
      env: TOXENV=py34
      <<: *not-on-master
    - python: "3.7"
      env: TOXENV=py37
      <<: *not-on-master
    - python: "3.8"
      env: TOXENV=py38
      <<: *not-on-master
    - sudo: required
      env: TOXENV=apache_compat
      services: docker
      before_install:
      addons:
      <<: *not-on-master
    - sudo: required
      env: TOXENV=le_auto_xenial
      services: docker
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=apacheconftest-with-pebble
      <<: *not-on-master
    - python: "2.7"
      env: TOXENV=nginxroundtrip
      <<: *not-on-master

    # Extended test suite on cron jobs and pushes to tested branches other than master
      env: TOXENV=py27-oldest BOULDER_INTEGRATION=1
      sudo: true
      after_failure:
        - sudo cat /var/log/mysql/error.log
        - ps aux | grep mysql
    - python: "2.6"
      env: TOXENV=py26 BOULDER_INTEGRATION=1
      sudo: true
      after_failure:
        - sudo cat /var/log/mysql/error.log
        - ps aux | grep mysql
    - python: "2.6"
      env: TOXENV=py26-oldest BOULDER_INTEGRATION=1
      sudo: true
      after_failure:
        - sudo cat /var/log/mysql/error.log
        - ps aux | grep mysql
    - sudo: required
      env: TOXENV=nginx_compat
      services: docker
      before_install:
      addons:
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-apache2
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-leauto-upgrades
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      git:
        depth: false # This is needed to have the history to checkout old versions of certbot-auto.
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-certonly-standalone
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "2.7"
      env:
        - TOXENV=travis-test-farm-sdists
        - secure: "f+j/Lj9s1lcuKo5sEFrlRd1kIAMnIJI4z0MTI7QF8jl9Fkmbx7KECGzw31TNgzrOSzxSapHbcueFYvNCLKST+kE/8ogMZBbwqXfEDuKpyF6BY3uYoJn+wPVE5pIb8Hhe08xPte8TTDSMIyHI3EyTfcAKrIreauoArePvh/cRvSw="
      <<: *extended-test-suite
    - python: "3.7"
      env: TOXENV=py37 CERTBOT_NO_PIN=1
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
    - sudo: required
      env: TOXENV=le_auto
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      before_install:
      addons:
    - sudo: required
      env: TOXENV=apache_compat
      services: docker
      <<: *extended-test-suite
      before_install:
      addons:
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-certbot-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      env: TOXENV=apacheconftest
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-certbot-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration-nginx-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "2.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration-nginx-oldest
      # Ubuntu Trusty or older must be used because the oldest version of
      # cryptography we support cannot be compiled against the version of
      # OpenSSL in Xenial or newer.
      dist: trusty
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.3"
      env: TOXENV=py33
    - python: "3.4"
      env: TOXENV=py34
      <<: *extended-test-suite
    - python: "3.5"
      env: TOXENV=py35
      <<: *extended-test-suite
    - python: "3.6"
      env: TOXENV=py36
      <<: *extended-test-suite
    - python: "3.7"
      env: TOXENV=py37
      <<: *extended-test-suite
    - python: "3.8"
      env: TOXENV=py38
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.4"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.5"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.6"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.7"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      sudo: required
      services: docker
      <<: *extended-test-suite
    - python: "3.8"
      env: ACME_SERVER=boulder-v1 TOXENV=integration
      <<: *extended-test-suite
    - python: "3.8"
      env: ACME_SERVER=boulder-v2 TOXENV=integration
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_jessie
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=le_auto_centos6
      services: docker
      <<: *extended-test-suite
    - sudo: required
      env: TOXENV=docker_dev
      services: docker
      addons:
        apt:
          packages: # don't install nginx and apache
            - libaugeas0
      <<: *extended-test-suite
    - language: generic
      env: TOXENV=py27
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
            - augeas
            - python2
      <<: *extended-test-suite
    - language: generic
      env: TOXENV=py3
      os: osx
      # Using this osx_image is a workaround for
      # https://travis-ci.community/t/xcode-8-3-homebrew-outdated-error/3798.
      osx_image: xcode10.2
      addons:
        homebrew:
          packages:
            - augeas
            - python3
      <<: *extended-test-suite
    - python: "2.7"
      env: TOXENV=nginxroundtrip

# Only build pushes to the master branch, PRs, and branches beginning with
# `test-`. This reduces the number of simultaneous Travis runs, which speeds
# turnaround time on review since there is a cap of 5 simultaneous runs.
branches:
  only:
    - master
    - /^test-.*$/

# container-based infrastructure
sudo: false

addons:
  # Custom /etc/hosts required for simple verification of http-01
  # and tls-sni-01, and for certbot_test_nginx
  hosts:
    - le.wtf
    - le1.wtf
    - le2.wtf
    - le3.wtf
    - nginx.wtf
    - boulder
    - boulder-mysql
    - boulder-rabbitmq
  apt:
    sources:
      - augeas
    packages: # Keep in sync with letsencrypt-auto-source/pieces/bootstrappers/deb_common.sh and Boulder.
      - python-dev
      - python-virtualenv
      - gcc
      - dialog
      - libaugeas0
      - libssl-dev
      - libffi-dev
@@ -280,30 +111,23 @@ addons:
|
||||
# For certbot-nginx integration testing
|
||||
- nginx-light
|
||||
- openssl
|
||||
# for apacheconftest
|
||||
- apache2
|
||||
- libapache2-mod-wsgi
|
||||
- libapache2-mod-macro
|
||||
- sudo
|
||||
|
||||
# tools/pip_install.py is used to pin packages to a known working version
|
||||
# except in tests where the environment variable CERTBOT_NO_PIN is set.
|
||||
# virtualenv is listed here explicitly to make sure it is upgraded when
|
||||
# CERTBOT_NO_PIN is set to work around failures we've seen when using an older
|
||||
# version of virtualenv.
|
||||
install: 'tools/pip_install.py -U codecov tox virtualenv'
|
||||
# Most of the time TRAVIS_RETRY is an empty string, and has no effect on the
|
||||
# script command. It is set only to `travis_retry` during farm tests, in
|
||||
# order to trigger the Travis retry feature, and compensate the inherent
|
||||
# flakiness of these specific tests.
|
||||
script: '$TRAVIS_RETRY tox'
|
||||
install: "travis_retry pip install tox coveralls"
|
||||
script: 'travis_retry tox && ([ "xxx$BOULDER_INTEGRATION" = "xxx" ] || ./tests/travis-integration.sh)'
|
||||
|
||||
after_success: '[ "$TOXENV" == "py27-cover" ] && codecov -F linux'
|
||||
after_success: '[ "$TOXENV" == "cover" ] && coveralls'
|
||||
|
||||
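The second `script` line guards the integration run with `[ "xxx$BOULDER_INTEGRATION" = "xxx" ]`, a portable empty-or-unset check: prefixing both sides with `xxx` keeps `[` from misparsing values that could look like operators. A small sketch of the idiom (the variable values here are illustrative):

```shell
#!/bin/sh
# Run the integration step only when BOULDER_INTEGRATION is non-empty,
# mirroring: [ "xxx$BOULDER_INTEGRATION" = "xxx" ] || ./tests/travis-integration.sh

BOULDER_INTEGRATION=""
# Condition is true (variable empty), so the right-hand command is skipped.
[ "xxx$BOULDER_INTEGRATION" = "xxx" ] || echo "would run integration tests"

BOULDER_INTEGRATION=1
# Condition is false (variable set), so the right-hand command runs.
[ "xxx$BOULDER_INTEGRATION" = "xxx" ] || echo "would run integration tests"
```

The prefix matters because a bare `[ "$VAR" = "" ]` can break if `$VAR` expands to something like `-f` on older shells.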
notifications:
  email: false
  irc:
    channels:
      # This is set to a secure variable to prevent forks from sending
      # notifications. This value was created by installing
      # https://github.com/travis-ci/travis.rb and running
      # `travis encrypt "chat.freenode.net#certbot-devel"`.
      - secure: "EWW66E2+KVPZyIPR8ViENZwfcup4Gx3/dlimmAZE0WuLwxDCshBBOd3O8Rf6pBokEoZlXM5eDT6XdyJj8n0DLslgjO62pExdunXpbcMwdY7l1ELxX2/UbnDTE6UnPYa09qVBHNG7156Z6yE0x2lH4M9Ykvp0G0cubjPQHylAwo0="
    on_cancel: never
      - "chat.freenode.net#letsencrypt"
    on_success: never
    on_failure: always
    use_notice: true
    skip_join: true
267
AUTHORS.md
@@ -1,267 +0,0 @@

Authors
=======
* [Aaron Zirbes](https://github.com/aaronzirbes)
* Aaron Zuehlke
* Ada Lovelace
* [Adam Woodbeck](https://github.com/awoodbeck)
* [Adrien Ferrand](https://github.com/adferrand)
* [Aidin Gharibnavaz](https://github.com/aidin36)
* [AJ ONeal](https://github.com/coolaj86)
* [Alcaro](https://github.com/Alcaro)
* [Alexander Mankuta](https://github.com/pointlessone)
* [Alex Bowers](https://github.com/alexbowers)
* [Alex Conlin](https://github.com/alexconlin)
* [Alex Gaynor](https://github.com/alex)
* [Alex Halderman](https://github.com/jhalderm)
* [Alex Jordan](https://github.com/strugee)
* [Alex Zorin](https://github.com/alexzorin)
* [Amjad Mashaal](https://github.com/TheNavigat)
* [Andrew Murray](https://github.com/radarhere)
* [Andrzej Górski](https://github.com/andrzej3393)
* [Anselm Levskaya](https://github.com/levskaya)
* [Antoine Jacoutot](https://github.com/ajacoutot)
* [asaph](https://github.com/asaph)
* [Axel Beckert](https://github.com/xtaran)
* [Bas](https://github.com/Mechazawa)
* [benbankes](https://github.com/benbankes)
* [Ben Irving](https://github.com/benileo)
* [Benjamin Kerensa](https://github.com/bkerensa)
* [Benjamin Neff](https://github.com/SuperTux88)
* [Benjamin Piouffle](https://github.com/Betree)
* [Ben Ubois](https://github.com/benubois)
* [Ben Wolfe](https://github.com/bwolfe)
* [Bigfish](https://github.com/bwolfe)
* [Blake Griffith](https://github.com/cowlicks)
* [Brad Warren](https://github.com/bmw)
* [Brandon Kraft](https://github.com/kraftbj)
* [Brandon Kreisel](https://github.com/kraftbj)
* [Ceesjan Luiten](https://github.com/quinox)
* [Chad Whitacre](https://github.com/whit537)
* [Chhatoi Pritam Baral](https://github.com/pritambaral)
* [Chris Johns](https://github.com/ter0)
* [Chris Lamb](https://github.com/lamby)
* [chrismarget](https://github.com/chrismarget)
* [Christian Gärtner](https://github.com/ChristianGaertner)
* [Christian Rosentreter](https://github.com/the-real-tokai)
* [Christopher Brown](https://github.com/chbrown)
* [Christopher Manning](https://github.com/christophermanning)
* [Christoph Kisfeld](https://github.com/chk1)
* [Clif Houck](https://github.com/ClifHouck)
* [Cooper Quintin](https://github.com/cooperq)
* [Corey Farwell](https://github.com/frewsxcv)
* [Craig Smith](https://github.com/dashaxiong)
* [Damian Poddebniak](https://github.com/duesee)
* [Damien Nozay](https://github.com/dnozay)
* [Damien Tournoud](https://github.com/damz)
* [DanCld](https://github.com/DanCld)
* [Daniel Albers](https://github.com/AID)
* [Daniel Aleksandersen](https://github.com/da2x)
* [Daniel Convissor](https://github.com/convissor)
* [Daniel Huang](https://github.com/dhuang)
* [Dave Guarino](https://github.com/daguar)
* [David cz](https://github.com/dave-cz)
* [David Dworken](https://github.com/ddworken)
* [David Kreitschmann](https://kreitschmann.de)
* [David Xia](https://github.com/davidxia)
* [Devin Howard](https://github.com/devvmh)
* [dokazaki](https://github.com/dokazaki)
* [Dominic Cleal](https://github.com/domcleal)
* [Dominic Lüchinger](https://github.com/dol)
* [Douglas José](https://github.com/douglasjose)
* [Erica Portnoy](https://github.com/ohemorange)
* [Eric Engestrom](https://github.com/1ace)
* [Eric Rescorla](https://github.com/ekr)
* [Eric Wustrow](https://github.com/ewust)
* [Erik Rose](https://github.com/erikrose)
* [Eugene Kazakov](https://github.com/xgin)
* [Fabian](https://github.com/faerbit)
* [Faidon Liambotis](https://github.com/paravoid)
* [Fan Jiang](https://github.com/tcz001)
* [Felix Lechner](https://github.com/lechner)
* [Felix Schwarz](https://github.com/FelixSchwarz)
* [Felix Yan](https://github.com/felixonmars)
* [Filip Ochnik](https://github.com/filipochnik)
* [Francois Marier](https://github.com/fmarier)
* [Frank](https://github.com/Frankkkkk)
* [Frederic BLANC](https://github.com/fblanc)
* [Garrett Robinson](https://github.com/garrettr)
* [Gene Wood](https://github.com/gene1wood)
* [Geoffroy Doucet](https://www.geoffroydoucet.com)
* [Gian Carlo Pace](https://github.com/gicappa)
* [Gilles Pietri](https://github.com/gilou)
* [Giovanni Pellerano](https://github.com/evilaliv3)
* [Giovanni Toraldo](https://github.com/gionn)
* [Gordin](https://github.com/Gordin)
* [Gregor Dschung](https://github.com/chkpnt)
* [Gregory L. Dietsche](https://github.com/farmergreg)
* [Greg Osuri](https://github.com/gosuri)
* [Guillaume Boudreau](https://github.com/gboudreau)
* [Harlan Lieberman-Berg](https://github.com/hlieberman)
* [Henri Salo](https://github.com/fgeek)
* [Henry Chen](https://github.com/henrychen95)
* [Ingolf Becker](https://github.com/watercrossing)
* [Jaap Eldering](https://github.com/eldering)
* [Jacob Hoffman-Andrews](https://github.com/jsha)
* [Jacob Sachs](https://github.com/jsachs)
* [Jairo Llopis](https://github.com/Yajo)
* [Jakub Warmuz](https://github.com/kuba)
* [James Kasten](https://github.com/jdkasten)
* [Jason Grinblat](https://github.com/ptychomancer)
* [Jay Faulkner](https://github.com/jayofdoom)
* [J.C. Jones](https://github.com/jcjones)
* [Jeff Hodges](https://github.com/jmhodges)
* [Jeremy Gillula](https://github.com/jgillula)
* [Jeroen Ketelaar](https://github.com/JKetelaar)
* [Jeroen Pluimers](https://github.com/jpluimers)
* [j](https://github.com/bit)
* [Jim Tittsler](https://github.com/jimt)
* [Joe Ranweiler](https://github.com/ranweiler)
* [Joerg Sonnenberger](https://github.com/jsonn)
* [John Leach](https://github.com/johnl)
* [John Reed](https://github.com/leerspace)
* [Jonas Berlin](https://github.com/xkr47)
* [Jonathan Herlin](https://github.com/Jonher937)
* [Jon Walsh](https://github.com/code-tree)
* [Joona Hoikkala](https://github.com/joohoi)
* [Josh Soref](https://github.com/jsoref)
* [Joubin Jabbari](https://github.com/joubin)
* [Juho Juopperi](https://github.com/jkjuopperi)
* [Kane York](https://github.com/riking)
* [Kenichi Maehashi](https://github.com/kmaehashi)
* [Kenneth Skovhede](https://github.com/kenkendk)
* [Kevin Burke](https://github.com/kevinburke)
* [Kevin London](https://github.com/kevinlondon)
* [Kubilay Kocak](https://github.com/koobs)
* [LeCoyote](https://github.com/LeCoyote)
* [Lee Watson](https://github.com/TheReverend403)
* [Leo Famulari](https://github.com/lfam)
* [lf](https://github.com/lf-)
* [Liam Marshall](https://github.com/liamim)
* [Lior Sabag](https://github.com/liorsbg)
* [Lipis](https://github.com/lipis)
* [lord63](https://github.com/lord63)
* [Luca Beltrame](https://github.com/lbeltrame)
* [Luca Ebach](https://github.com/lucebac)
* [Luca Olivetti](https://github.com/olivluca)
* [Luke Rogers](https://github.com/lukeroge)
* [Maarten](https://github.com/mrtndwrd)
* [Maikel Martens](https://github.com/krukas)
* [Malte Janduda](https://github.com/MalteJ)
* [Mantas Mikulėnas](https://github.com/grawity)
* [Marcel Krüger](https://github.com/zauguin)
* [Marcos Caceres](https://github.com/marcoscaceres)
* [Marek Viger](https://github.com/freezy-sk)
* [Mario Villaplana](https://github.com/supermari0)
* [Marius Gedminas](https://github.com/mgedmin)
* [Martey Dodoo](https://github.com/martey)
* [Martijn Bastiaan](https://github.com/martijnbastiaan)
* [Martijn Braam](https://github.com/MartijnBraam)
* [Martin Brugger](https://github.com/mbrugger)
* [Mathieu Leduc-Hamel](https://github.com/mlhamel)
* [Matt Bostock](https://github.com/mattbostock)
* [Matthew Ames](https://github.com/SuperMatt)
* [Michael Schumacher](https://github.com/schumaml)
* [Michael Strache](https://github.com/Jarodiv)
* [Michael Sverdlin](https://github.com/sveder)
* [Michael Watters](https://github.com/blackknight36)
* [Michal Moravec](https://github.com/Majkl578)
* [Michal Papis](https://github.com/mpapis)
* [Mickaël Schoentgen](https://github.com/BoboTiG)
* [Minn Soe](https://github.com/MinnSoe)
* [Min RK](https://github.com/minrk)
* [Miquel Ruiz](https://github.com/miquelruiz)
* [Môshe van der Sterre](https://github.com/moshevds)
* [mrstanwell](https://github.com/mrstanwell)
* [Nav Aulakh](https://github.com/Nav)
* [Nelson Elhage](https://github.com/nelhage)
* [Nick Fong](https://github.com/nickfong)
* [Nick Le Mouton](https://github.com/NoodlesNZ)
* [Nikos Roussos](https://github.com/comzeradd)
* [Noah Swartz](https://github.com/swartzcr)
* [Ola Bini](https://github.com/olabini)
* [Ondřej Súkup](https://github.com/mimi1vx)
* [Ondřej Surý](https://github.com/oerdnj)
* [osirisinferi](https://github.com/osirisinferi)
* Patrick Figel
* [Patrick Heppler](https://github.com/PatrickHeppler)
* [Paul Feitzinger](https://github.com/pfeyz)
* [Pavan Gupta](https://github.com/pavgup)
* [Pavel Pavlov](https://github.com/ghost355)
* [Peter Conrad](https://github.com/pconrad-fb)
* [Peter Eckersley](https://github.com/pde)
* [Peter Mosmans](https://github.com/PeterMosmans)
* [Philippe Langlois](https://github.com/langloisjp)
* [Philipp Spitzer](https://github.com/spitza)
* [Piero Steinger](https://github.com/Jadaw1n)
* [Pierre Jaury](https://github.com/kaiyou)
* [Piotr Kasprzyk](https://github.com/kwadrat)
* [Prayag Verma](https://github.com/pra85)
* [Reinaldo de Souza Jr](https://github.com/juniorz)
* [Remi Rampin](https://github.com/remram44)
* [Rémy HUBSCHER](https://github.com/Natim)
* [Rémy Léone](https://github.com/sieben)
* [Richard Barnes](https://github.com/r-barnes)
* [Richard Panek](https://github.com/kernelpanek)
* [Robert Buchholz](https://github.com/rbu)
* [Robert Habermann](https://github.com/frennkie)
* [Robert Xiao](https://github.com/nneonneo)
* [Roland Shoemaker](https://github.com/rolandshoemaker)
* [Roy Wellington Ⅳ](https://github.com/thanatos)
* [rugk](https://github.com/rugk)
* [Sachi King](https://github.com/nakato)
* [Sagi Kedmi](https://github.com/sagi)
* [Sam Lanning](https://github.com/samlanning)
* [sapics](https://github.com/sapics)
* [Scott Barr](https://github.com/scottjbarr)
* [Scott Merrill](https://github.com/skpy)
* [Sebastian Bögl](https://github.com/TheBoegl)
* [Sebastian Wagner](https://github.com/sebix)
* [sedrubal](https://github.com/sedrubal)
* [Seppe Stas](https://github.com/seppestas)
* [Sergey Nuzdhin](https://github.com/lwolf)
* [Seth Schoen](https://github.com/schoen)
* [Sharif Nassar](https://github.com/mrwacky42)
* [Shaun Cummiskey](https://github.com/ampersign)
* [Shiloh Heurich](https://github.com/sheurich)
* [silverwind](https://github.com/silverwind)
* [Sorvani](https://github.com/sorvani)
* [Spencer Bliven](https://github.com/sbliven)
* [Stacey Sheldon](https://github.com/solidgoldbomb)
* [Stavros Korokithakis](https://github.com/skorokithakis)
* [Stefan Weil](https://github.com/stweil)
* [Steve Desmond](https://github.com/stevedesmond-ca)
* [sydneyli](https://github.com/sydneyli)
* [Tan Jay Jun](https://github.com/jayjun)
* [Tapple Gao](https://github.com/tapple)
* [Telepenin Nikolay](https://github.com/telepenin)
* [Thomas Cottier](https://github.com/tcottier-enalean)
* [Thomas Mayer](https://github.com/thomaszbz)
* [Thomas Waldmann](https://github.com/ThomasWaldmann)
* [Thom Wiggers](https://github.com/thomwiggers)
* [Till Maas](https://github.com/tyll)
* [Timothy Guan-tin Chien](https://github.com/timdream)
* [Torsten Bögershausen](https://github.com/tboegi)
* [Travis Raines](https://github.com/rainest)
* [Trung Ngo](https://github.com/Ngo-The-Trung)
* [Valentin](https://github.com/e00E)
* [venyii](https://github.com/venyii)
* [Viktor Szakats](https://github.com/vszakats)
* [Ville Skyttä](https://github.com/scop)
* [Vinney Cavallo](https://github.com/vcavallo)
* [Vladimir Rutsky](https://github.com/rutsky)
* [Wang Yu](https://github.com/wyhitcs)
* [Ward Vandewege](https://github.com/cure)
* [Whyfoo](https://github.com/whyfoo)
* [Wilfried Teiken](https://github.com/wteiken)
* [Willem Fibbe](https://github.com/fibbers)
* [William Budington](https://github.com/Hainish)
* [Will Newby](https://github.com/willnewby)
* [Will Oller](https://github.com/willoller)
* [Yan](https://github.com/diracdeltas)
* [Yen Chi Hsuan](https://github.com/yan12125)
* [Yomna](https://github.com/ynasser)
* [Yoni Jah](https://github.com/yonjah)
* [YourDaddyIsHere](https://github.com/YourDaddyIsHere)
* [Zach Shepherd](https://github.com/zjs)
* [陈三](https://github.com/chenxsan)
@@ -1 +0,0 @@
certbot/CHANGELOG.md
8
CHANGES.rst
Normal file
@@ -0,0 +1,8 @@

ChangeLog
=========

To see the changes in a given release, view the issues closed in a given
release's GitHub milestone:

- `Past releases <https://github.com/certbot/certbot/milestones?state=closed>`_
- `Upcoming releases <https://github.com/certbot/certbot/milestones>`_
@@ -1 +0,0 @@
This project is governed by [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode).

@@ -33,5 +33,3 @@ started. In particular, we recommend you read these sections

- [Finding issues to work on](https://certbot.eff.org/docs/contributing.html#find-issues-to-work-on)
- [Coding style](https://certbot.eff.org/docs/contributing.html#coding-style)
- [Submitting a pull request](https://certbot.eff.org/docs/contributing.html#submitting-a-pull-request)
- [EFF's Public Projects Code of Conduct](https://www.eff.org/pages/eppcode)
70
Dockerfile
Normal file
@@ -0,0 +1,70 @@

# https://github.com/letsencrypt/letsencrypt/pull/431#issuecomment-103659297
# it is more likely developers will already have ubuntu:trusty rather
# than e.g. debian:jessie and image size differences are negligible
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>

# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443

# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt

WORKDIR /opt/certbot

# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.

ENV DEBIAN_FRONTEND=noninteractive

COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* \
           /tmp/* \
           /var/tmp/*

# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible

COPY setup.py README.rst CHANGES.rst MANIFEST.in letsencrypt-auto-source/pieces/pipstrap.py /opt/certbot/src/

# all above files are necessary for setup.py and venv setup, however,
# the package source code directory has to be copied separately to a
# subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters: three files are far
# more likely to be cached than the whole project directory

COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/

RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv

# PATH is set now so pipstrap upgrades the correct (v)env
ENV PATH /opt/certbot/venv/bin:$PATH
RUN /opt/certbot/venv/bin/python /opt/certbot/src/pipstrap.py && \
    /opt/certbot/venv/bin/pip install \
        -e /opt/certbot/src/acme \
        -e /opt/certbot/src \
        -e /opt/certbot/src/certbot-apache \
        -e /opt/certbot/src/certbot-nginx

# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it stays in the underlying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.

ENTRYPOINT [ "certbot" ]
@@ -1,20 +1,68 @@
# This Dockerfile builds an image for development.
FROM debian:buster
FROM ubuntu:trusty
MAINTAINER Jakub Warmuz <jakub@warmuz.org>
MAINTAINER William Budington <bill@eff.org>
MAINTAINER Yan <yan@eff.org>

# Note: this only exposes the port to other docker containers.
EXPOSE 80 443
# Note: this only exposes the port to other docker containers. You
# still have to bind to 443@host at runtime, as per the ACME spec.
EXPOSE 443

WORKDIR /opt/certbot/src
# TODO: make sure --config-dir and --work-dir cannot be changed
# through the CLI (certbot-docker wrapper that uses standalone
# authenticator and text mode only?)
VOLUME /etc/letsencrypt /var/lib/letsencrypt

COPY . .
RUN apt-get update && \
    apt-get install apache2 git python3-dev python3-venv gcc libaugeas0 \
        libssl-dev libffi-dev ca-certificates openssl nginx-light -y && \
WORKDIR /opt/certbot

# no need to mkdir anything:
# https://docs.docker.com/reference/builder/#copy
# If <dest> doesn't exist, it is created along with all missing
# directories in its path.

# TODO: Install non-default Python versions for tox.
# TODO: Install Apache/Nginx for plugin development.
COPY letsencrypt-auto-source/letsencrypt-auto /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto
RUN /opt/certbot/src/letsencrypt-auto-source/letsencrypt-auto --os-packages-only && \
    apt-get clean && \
    rm -rf /var/lib/apt/lists/* \
           /tmp/* \
           /var/tmp/*

RUN VENV_NAME="../venv3" python3 tools/venv3.py
# the above is not likely to change, so by putting it further up the
# Dockerfile we make sure we cache as much as possible

ENV PATH /opt/certbot/venv3/bin:$PATH
COPY setup.py README.rst CHANGES.rst MANIFEST.in linter_plugin.py tox.cover.sh tox.ini pep8.travis.sh .pep8 .pylintrc /opt/certbot/src/

# all above files are necessary for setup.py, however, the package source
# code directory has to be copied separately to a subdirectory...
# https://docs.docker.com/reference/builder/#copy: "If <src> is a
# directory, the entire contents of the directory are copied,
# including filesystem metadata. Note: The directory itself is not
# copied, just its contents." Order again matters: three files are far
# more likely to be cached than the whole project directory

COPY certbot /opt/certbot/src/certbot/
COPY acme /opt/certbot/src/acme/
COPY certbot-apache /opt/certbot/src/certbot-apache/
COPY certbot-nginx /opt/certbot/src/certbot-nginx/
COPY letshelp-certbot /opt/certbot/src/letshelp-certbot/
COPY certbot-compatibility-test /opt/certbot/src/certbot-compatibility-test/
COPY tests /opt/certbot/src/tests/

RUN virtualenv --no-site-packages -p python2 /opt/certbot/venv && \
    /opt/certbot/venv/bin/pip install \
        -e /opt/certbot/src/acme \
        -e /opt/certbot/src \
        -e /opt/certbot/src/certbot-apache \
        -e /opt/certbot/src/certbot-nginx \
        -e /opt/certbot/src/letshelp-certbot \
        -e /opt/certbot/src/certbot-compatibility-test \
        -e /opt/certbot/src[dev,docs]

# install in editable mode (-e) to save space: it's not possible to
# "rm -rf /opt/certbot/src" (it stays in the underlying image);
# this might also help in debugging: you can "docker run --entrypoint
# bash" and investigate, apply patches, etc.

ENV PATH /opt/certbot/venv/bin:$PATH
@@ -1,22 +0,0 @@
If you're having trouble using Certbot and aren't sure you've found a bug or
request for a new feature, please first try asking for help at
https://community.letsencrypt.org/. There is a much larger community there of
people familiar with the project who will be able to more quickly answer your
questions.

## My operating system is (include version):

## I installed Certbot with (certbot-auto, OS package manager, pip, etc):

## I ran this command and it produced this output:

## Certbot's behavior differed from what I expected because:

## Here is a Certbot log showing the issue (if available):
###### Logs are stored in `/var/log/letsencrypt` by default. Feel free to redact domains, e-mail and IP addresses as you see fit.

## Here is the relevant nginx server block or Apache virtualhost for the domain I am configuring:
@@ -1,10 +1,8 @@
include README.rst
include CHANGELOG.md
include CHANGES.rst
include CONTRIBUTING.md
include LICENSE.txt
include linter_plugin.py
recursive-include docs *
recursive-include examples *
recursive-include certbot/tests/testdata *
recursive-include tests *.py
include certbot/ssl-dhparams.pem
global-exclude __pycache__
global-exclude *.py[cod]
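The directives above follow sdist manifest semantics: `include` matches exact paths, `recursive-include` matches patterns under a directory, and `global-exclude` removes matching basenames anywhere in the tree. A rough sketch of those matching rules (simplified; the real logic lives in distutils/setuptools' `FileList`, and the helper name here is illustrative):

```python
import fnmatch
import posixpath

def matches(path, directive, *args):
    """Simplified check of whether a path is selected by a manifest directive."""
    if directive == "include":
        # Exact path match.
        return path in args
    if directive == "recursive-include":
        # Pattern match for files anywhere under the named directory.
        dirname, pattern = args
        return path.startswith(dirname + "/") and fnmatch.fnmatch(
            posixpath.basename(path), pattern)
    if directive == "global-exclude":
        # Basename pattern match anywhere in the tree.
        (pattern,) = args
        return fnmatch.fnmatch(posixpath.basename(path), pattern)
    raise ValueError("unknown directive: " + directive)
```

For example, `global-exclude *.py[cod]` drops compiled Python files (`.pyc`, `.pyo`, `.pyd`) wherever they appear, while `recursive-include certbot/tests/testdata *` keeps every test fixture.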
@@ -1 +0,0 @@
certbot/README.rst

177
README.rst
Normal file
@@ -0,0 +1,177 @@
.. This file contains a series of comments that are used to include sections of this README in other files. Do not modify these comments unless you know what you are doing. tag:intro-begin
|
||||
|
||||
Certbot is part of EFF’s effort to encrypt the entire Internet. Secure communication over the Web relies on HTTPS, which requires the use of a digital certificate that lets browsers verify the identify of web servers (e.g., is that really google.com?). Web servers obtain their certificates from trusted third parties called certificate authorities (CAs). Certbot is an easy-to-use client that fetches a certificate from Let’s Encrypt—an open certificate authority launched by the EFF, Mozilla, and others—and deploys it to a web server.
|
||||
|
||||
Anyone who has gone through the trouble of setting up a secure website knows what a hassle getting and maintaining a certificate is. Certbot and Let’s Encrypt can automate away the pain and let you turn on and manage HTTPS with simple commands. Using Certbot and Let's Encrypt is free, so there’s no need to arrange payment.
|
||||
|
||||
How you use Certbot depends on the configuration of your web server. The best way to get started is to use our `interactive guide <https://certbot.eff.org>`_. It generates instructions based on your configuration settings. In most cases, you’ll need `root or administrator access <https://certbot.eff.org/faq/#does-certbot-require-root-privileges>`_ to your web server to run Certbot.
|
||||
|
||||
If you’re using a hosted service and don’t have direct access to your web server, you might not be able to use Certbot. Check with your hosting provider for documentation about uploading certificates or using certificates issues by Let’s Encrypt.
|
||||
|
||||
Certbot is a fully-featured, extensible client for the Let's
|
||||
Encrypt CA (or any other CA that speaks the `ACME
|
||||
<https://github.com/ietf-wg-acme/acme/blob/master/draft-ietf-acme-acme.md>`_
|
||||
protocol) that can automate the tasks of obtaining certificates and
|
||||
configuring webservers to use them. This client runs on Unix-based operating
|
||||
systems.
|
||||
|
||||
Until May 2016, Certbot was named simply ``letsencrypt`` or ``letsencrypt-auto``,
|
||||
depending on install method. Instructions on the Internet, and some pieces of the
|
||||
software, may still refer to this older name.
|
||||
|
||||
Contributing
|
||||
------------
|
||||
|
||||
If you'd like to contribute to this project please read `Developer Guide
|
||||
<https://certbot.eff.org/docs/contributing.html>`_.
|
||||
|
||||
.. _installation:
|
||||
|
||||
Installation
|
||||
------------
|
||||
|
||||
The easiest way to install Certbot is by visiting `certbot.eff.org`_, where you can
|
||||
find the correct installation instructions for many web server and OS combinations.
|
||||
For more information, see the `User Guide <https://certbot.eff.org/docs/using.html#getting-certbot>`_.
|
||||
|
||||
.. _certbot.eff.org: https://certbot.eff.org/
|
||||
|
||||
How to run the client
|
||||
---------------------
|
||||
|
||||
In many cases, you can just run ``certbot-auto`` or ``certbot``, and the
|
||||
client will guide you through the process of obtaining and installing certs
|
||||
interactively.
|
||||
|
||||
For full command line help, you can type::
|
||||
|
||||
./certbot-auto --help all
|
||||
|
||||
|
||||
You can also tell it exactly what you want it to do from the command line.
|
||||
For instance, if you want to obtain a cert for ``example.com``,
|
||||
``www.example.com``, and ``other.example.net``, using the Apache plugin to both
|
||||
obtain and install the certs, you could do this::
|
||||
|
||||
./certbot-auto --apache -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
(The first time you run the command, it will make an account, and ask for an
|
||||
email and agreement to the Let's Encrypt Subscriber Agreement; you can
|
||||
automate those with ``--email`` and ``--agree-tos``)
|
||||
|
||||
If you want to use a webserver that doesn't have full plugin support yet, you
|
||||
can still use "standalone" or "webroot" plugins to obtain a certificate::
|
||||
|
||||
./certbot-auto certonly --standalone --email admin@example.com -d example.com -d www.example.com -d other.example.net
|
||||
|
||||
|
||||
Understanding the client in more depth
|
||||
--------------------------------------
|
||||
|
||||
To understand what the client is doing in detail, it's important to
understand the way it uses plugins. Please see the `explanation of
plugins <https://certbot.eff.org/docs/using.html#plugins>`_ in
the User Guide.

Links
=====

.. Do not modify this comment unless you know what you're doing. tag:links-begin

Documentation: https://certbot.eff.org/docs

Software project: https://github.com/certbot/certbot

Notes for developers: https://certbot.eff.org/docs/contributing.html

Main Website: https://certbot.eff.org

Let's Encrypt Website: https://letsencrypt.org

IRC Channel: #letsencrypt on `Freenode`_ or #certbot on `OFTC`_

Community: https://community.letsencrypt.org

ACME spec: http://ietf-wg-acme.github.io/acme/

ACME working area in github: https://github.com/ietf-wg-acme/acme

Mailing list: `client-dev`_ (to subscribe without a Google account, send an
email to client-dev+subscribe@letsencrypt.org)

|build-status| |coverage| |docs| |container|

.. _Freenode: https://webchat.freenode.net?channels=%23letsencrypt

.. _OFTC: https://webchat.oftc.net?channels=%23certbot

.. _client-dev: https://groups.google.com/a/letsencrypt.org/forum/#!forum/client-dev

.. |build-status| image:: https://travis-ci.org/certbot/certbot.svg?branch=master
   :target: https://travis-ci.org/certbot/certbot
   :alt: Travis CI status

.. |coverage| image:: https://coveralls.io/repos/certbot/certbot/badge.svg?branch=master
   :target: https://coveralls.io/r/certbot/certbot
   :alt: Coverage status

.. |docs| image:: https://readthedocs.org/projects/letsencrypt/badge/
   :target: https://readthedocs.org/projects/letsencrypt/
   :alt: Documentation status

.. |container| image:: https://quay.io/repository/letsencrypt/letsencrypt/status
   :target: https://quay.io/repository/letsencrypt/letsencrypt
   :alt: Docker Repository on Quay.io

.. Do not modify this comment unless you know what you're doing. tag:links-end

System Requirements
===================

The Let's Encrypt Client presently only runs on Unix-ish OSes that include
Python 2.6 or 2.7; Python 3.x support will hopefully be added in the future.
The client requires root access in order to write to ``/etc/letsencrypt``,
``/var/log/letsencrypt``, and ``/var/lib/letsencrypt``; to bind to ports 80
and 443 (if you use the ``standalone`` plugin); and to read and modify
webserver configurations (if you use the ``apache`` or ``nginx`` plugins). If
none of these apply to you, it is theoretically possible to run without root
privileges, but for most users who want to avoid running an ACME client as
root, either `letsencrypt-nosudo <https://github.com/diafygi/letsencrypt-nosudo>`_
or `simp_le <https://github.com/kuba/simp_le>`_ is a more appropriate choice.

The Apache plugin currently requires a Debian-based OS with Augeas version
1.0; this includes Ubuntu 12.04+ and Debian 7+.

.. Do not modify this comment unless you know what you're doing. tag:intro-end

.. Do not modify this comment unless you know what you're doing. tag:features-begin

Current Features
================

* Supports multiple web servers:

  - apache/2.x (working on Debian 8+ and Ubuntu 12.04+)
  - standalone (runs its own simple webserver to prove you control a domain)
  - webroot (adds files to webroot directories in order to prove control of
    domains and obtain certs)
  - nginx/0.8.48+ (highly experimental, not included in certbot-auto)

* The private key is generated locally on your system.
* Can talk to the Let's Encrypt CA or, optionally, to other ACME-compliant
  services.
* Can get domain-validated (DV) certificates.
* Can revoke certificates.
* Adjustable RSA key bit length (2048 (default), 4096, ...).
* Can optionally install an http -> https redirect, so your site effectively
  runs https only (Apache only).
* Fully automated.
* Configuration changes are logged and can be reverted.
* Supports an ncurses and text (-t) UI, or can be driven entirely from the
  command line.
* Free and Open Source Software, made with Python.

.. Do not modify this comment unless you know what you're doing. tag:features-end

For extensive documentation on using and contributing to Certbot, go to https://certbot.eff.org/docs. If you would like to contribute to the project or run the latest code from git, you should read our `developer guide <https://certbot.eff.org/docs/contributing.html>`_.

Vagrantfile (vendored, new file, 40 lines)

@@ -0,0 +1,40 @@
# -*- mode: ruby -*-
# vi: set ft=ruby :

# Vagrantfile API/syntax version. Don't touch unless you know what you're doing!
VAGRANTFILE_API_VERSION = "2"

# Setup instructions from docs/contributing.rst
# Script installs dependencies for tox and boulder integration
$ubuntu_setup_script = <<SETUP_SCRIPT
cd /vagrant
./letsencrypt-auto-source/letsencrypt-auto --os-packages-only
./tools/venv.sh
wget https://storage.googleapis.com/golang/go1.5.3.linux-amd64.tar.gz -P /tmp/
sudo tar -C /usr/local -xzf /tmp/go1.5.3.linux-amd64.tar.gz
if ! grep -Fxq "export GOROOT=/usr/local/go" /home/vagrant/.profile ; then echo "export GOROOT=/usr/local/go" >> /home/vagrant/.profile; fi
if ! grep -Fxq "export PATH=\\$GOROOT/bin:\\$PATH" /home/vagrant/.profile ; then echo "export PATH=\\$GOROOT/bin:\\$PATH" >> /home/vagrant/.profile; fi
if ! grep -Fxq "export GOPATH=\\$HOME/go" /home/vagrant/.profile ; then echo "export GOPATH=\\$HOME/go" >> /home/vagrant/.profile; fi
if ! grep -Fxq "cd /vagrant/; ./tests/boulder-start.sh &" /etc/rc.local ; then sed -i -e '$i \cd /vagrant/; ./tests/boulder-start.sh &\n' /etc/rc.local; fi
export DEBIAN_FRONTEND=noninteractive
sudo -E apt-get -q -y install git make libltdl-dev mariadb-server rabbitmq-server nginx-light
SETUP_SCRIPT

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|

  config.vm.define "ubuntu-trusty", primary: true do |ubuntu_trusty|
    ubuntu_trusty.vm.box = "ubuntu/trusty64"
    ubuntu_trusty.vm.provision "shell", inline: $ubuntu_setup_script
    ubuntu_trusty.vm.provider "virtualbox" do |v|
      # VM needs more memory to run test suite, got "OSError: [Errno 12]
      # Cannot allocate memory" when running
      # letsencrypt.client.tests.display.util_test.NcursesDisplayTest
      v.memory = 1024

      # Handle cases when the host is behind a private network by making the
      # NAT engine use the host's resolver mechanisms to handle DNS requests.
      v.customize ["modifyvm", :id, "--natdnshostresolver1", "on"]
    end
  end

end
acme/.pep8 (new file, 4 lines)

@@ -0,0 +1,4 @@
[pep8]
# E265 block comment should start with '# '
# E501 line too long (X > 79 characters)
ignore = E265,E501
acme/.pylintrc (new file, 377 lines)

@@ -0,0 +1,377 @@
[MASTER]

# Specify a configuration file.
#rcfile=

# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=

# Profiled execution.
profile=no

# Add files or directories to the blacklist. They should be base names, not
# paths.
ignore=CVS

# Pickle collected data for later comparisons.
persistent=yes

# List of plugins (as comma separated values of python modules names) to load,
# usually to register additional checkers.
load-plugins=linter_plugin

# Use multiple processes to speed up Pylint.
jobs=1

# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no

# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code
extension-pkg-whitelist=


[MESSAGES CONTROL]

# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, INFERENCE, INFERENCE_FAILURE, UNDEFINED
confidence=

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time. See also the "--disable" option for examples.
#enable=

# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then reenable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W"
disable=fixme,locally-disabled,abstract-class-not-used
# abstract-class-not-used cannot be disabled locally (at least in
# pylint 1.4.1/2)


[REPORTS]

# Set the output format. Available formats are text, parseable, colorized, msvs
# (visual studio) and html. You can also give a reporter class, eg
# mypackage.mymodule.MyReporterClass.
output-format=text

# Put messages in a separate file for each module / package specified on the
# command line instead of printing them on stdout. Reports (if any) will be
# written in a file name "pylint_global.[txt|html]".
files-output=no

# Tells whether to display a full report or only the messages
reports=yes

# Python expression which should return a note less than 10 (10 is the highest
# note). You have access to the variables errors warning, statement which
# respectively contain the number of errors / warnings messages and the total
# number of statements analyzed. This is used by the global evaluation report
# (RP0004).
evaluation=10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10)

# Add a comment according to your evaluation note. This is used by the global
# evaluation report (RP0004).
comment=no

# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details
#msg-template=


[FORMAT]

# Maximum number of characters on a single line.
max-line-length=80

# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$

# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no

# List of optional constructs for which whitespace checking is disabled
no-space-check=trailing-comma,dict-separator

# Maximum number of lines in a module
max-module-lines=1000

# String used as indentation unit. This is usually "    " (4 spaces) or "\t" (1
# tab).
indent-string='    '

# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4

# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=


[MISCELLANEOUS]

# List of note tags to take in consideration, separated by a comma.
notes=FIXME,XXX,TODO


[LOGGING]

# Logging modules to check that the string format arguments are in logging
# function parameter format
logging-modules=logging,logger


[SPELLING]

# Spelling dictionary name. Available dictionaries: none. To make it working
# install python-enchant package.
spelling-dict=

# List of comma separated words that should not be checked.
spelling-ignore-words=

# A path to a file that contains private dictionary; one word per line.
spelling-private-dict-file=

# Tells whether to store unknown words to indicated private dictionary in
# --spelling-private-dict-file option instead of raising a message.
spelling-store-unknown-words=no


[TYPECHECK]

# Tells whether missing members accessed in mixin class should be ignored. A
# mixin class is detected if its name ends with "mixin" (case insensitive).
ignore-mixin-members=yes

# List of module names for which member attributes should not be checked
# (useful for modules/projects where namespaces are manipulated during runtime
# and thus existing member attributes cannot be deduced by static analysis
ignored-modules=

# List of classes names for which member attributes should not be checked
# (useful for classes with attributes dynamically set).
ignored-classes=SQLObject

# When zope mode is activated, add a predefined set of Zope acquired attributes
# to generated-members.
zope=no

# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E0201 when accessed. Python regular
# expressions are accepted.
generated-members=REQUEST,acl_users,aq_parent


[SIMILARITIES]

# Minimum lines number of a similarity.
min-similarity-lines=4

# Ignore comments when computing similarities.
ignore-comments=yes

# Ignore docstrings when computing similarities.
ignore-docstrings=yes

# Ignore imports when computing similarities.
ignore-imports=no


[VARIABLES]

# Tells whether we should check for unused import in __init__ files.
init-import=no

# A regular expression matching the name of dummy variables (i.e. expectedly
# not used).
dummy-variables-rgx=_$|dummy|unused

# List of additional names supposed to be defined in builtins. Remember that
# you should avoid to define new builtins when possible.
additional-builtins=

# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,_cb


[BASIC]

# Required attributes for module, separated by a comma
required-attributes=

# List of builtins function names that should not be used, separated by a comma
bad-functions=map,filter,input

# Good variable names which should always be accepted, separated by a comma
good-names=i,j,k,ex,Run,_,logger

# Bad variable names which should always be refused, separated by a comma
bad-names=foo,bar,baz,toto,tutu,tata

# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=

# Include a hint for the correct naming format with invalid-name
include-naming-hint=no

# Regular expression matching correct function names
function-rgx=[a-z_][a-z0-9_]{2,40}$

# Naming hint for function names
function-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct variable names
variable-rgx=[a-z_][a-z0-9_]{2,30}$

# Naming hint for variable names
variable-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct constant names
const-rgx=(([A-Z_][A-Z0-9_]*)|(__.*__))$

# Naming hint for constant names
const-name-hint=(([A-Z_][A-Z0-9_]*)|(__.*__))$

# Regular expression matching correct attribute names
attr-rgx=[a-z_][a-z0-9_]{2,30}$

# Naming hint for attribute names
attr-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct argument names
argument-rgx=[a-z_][a-z0-9_]{2,30}$

# Naming hint for argument names
argument-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression matching correct class attribute names
class-attribute-rgx=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$

# Naming hint for class attribute names
class-attribute-name-hint=([A-Za-z_][A-Za-z0-9_]{2,30}|(__.*__))$

# Regular expression matching correct inline iteration names
inlinevar-rgx=[A-Za-z_][A-Za-z0-9_]*$

# Naming hint for inline iteration names
inlinevar-name-hint=[A-Za-z_][A-Za-z0-9_]*$

# Regular expression matching correct class names
class-rgx=[A-Z_][a-zA-Z0-9]+$

# Naming hint for class names
class-name-hint=[A-Z_][a-zA-Z0-9]+$

# Regular expression matching correct module names
module-rgx=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$

# Naming hint for module names
module-name-hint=(([a-z_][a-z0-9_]*)|([A-Z][a-zA-Z0-9]+))$

# Regular expression matching correct method names
method-rgx=[a-z_][a-z0-9_]{2,49}$

# Naming hint for method names
method-name-hint=[a-z_][a-z0-9_]{2,30}$

# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=__.*__|test_[A-Za-z0-9_]*|_.*|.*Test

# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1


[CLASSES]

# List of interface methods to ignore, separated by a comma. This is used for
# instance to not check methods defines in Zope's Interface base class.
ignore-iface-methods=isImplementedBy,deferred,extends,names,namesAndDescriptions,queryDescriptionFor,getBases,getDescriptionFor,getDoc,getName,getTaggedValue,getTaggedValueTags,isEqualOrExtendedBy,setTaggedValue,isImplementedByInstancesOf,adaptWith,is_implemented_by

# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,__new__,setUp

# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls

# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs

# List of member names, which should be excluded from the protected access
# warning.
exclude-protected=_asdict,_fields,_replace,_source,_make


[DESIGN]

# Maximum number of arguments for function / method
max-args=6

# Argument names that match this expression will be ignored. Default to name
# with leading underscore
ignored-argument-names=_.*

# Maximum number of locals for function / method body
max-locals=15

# Maximum number of return / yield for function / method body
max-returns=6

# Maximum number of branch for function / method body
max-branches=12

# Maximum number of statements in function / method body
max-statements=50

# Maximum number of parents for a class (see R0901).
max-parents=12

# Maximum number of attributes for a class (see R0902).
max-attributes=7

# Minimum number of public methods for a class (see R0903).
min-public-methods=2

# Maximum number of public methods for a class (see R0904).
max-public-methods=20


[IMPORTS]

# Deprecated modules which should not be used, separated by a comma
deprecated-modules=regsub,TERMIOS,Bastion,rexec

# Create a graph of every (i.e. internal and external) dependencies in the
# given file (report RP0402 must not be disabled)
import-graph=

# Create a graph of external dependencies in the given file (report RP0402 must
# not be disabled)
ext-import-graph=

# Create a graph of internal dependencies in the given file (report RP0402 must
# not be disabled)
int-import-graph=


[EXCEPTIONS]

# Exceptions that will emit a warning when being caught. Defaults to
# "Exception"
overgeneral-exceptions=Exception
@@ -1,8 +1,5 @@
include LICENSE.txt
include README.rst
include pytest.ini
recursive-include docs *
recursive-include examples *
recursive-include tests *
global-exclude __pycache__
global-exclude *.py[cod]
recursive-include acme/testdata *

@@ -1,22 +1,12 @@
"""ACME protocol implementation.

This module is an implementation of the `ACME protocol`_.
This module is an implementation of the `ACME protocol`_. Latest
supported version: `draft-ietf-acme-01`_.

.. _`ACME protocol`: https://ietf-wg-acme.github.io/acme

.. _`draft-ietf-acme-01`:
   https://github.com/ietf-wg-acme/acme/tree/draft-ietf-acme-acme-01

"""
import sys
import warnings

# This code exists to keep backwards compatibility with people using acme.jose
# before it became the standalone josepy package.
#
# It is based on
# https://github.com/requests/requests/blob/1278ecdf71a312dc2268f3bfc0aabfab3c006dcf/requests/packages.py
import josepy as jose

for mod in list(sys.modules):
    # This traversal is apparently necessary such that the identities are
    # preserved (acme.jose.* is josepy.*)
    if mod == 'josepy' or mod.startswith('josepy.'):
        sys.modules['acme.' + mod.replace('josepy', 'jose', 1)] = sys.modules[mod]

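The `sys.modules` aliasing loop above can be demonstrated in isolation. The following is a minimal sketch, with hypothetical `realpkg`/`alias.pkg` names standing in for `josepy`/`acme.jose`:

```python
import sys
import types

# Hypothetical stand-in for the real package (josepy in acme's case).
real = types.ModuleType("realpkg")
real.VALUE = 42
sys.modules["realpkg"] = real

# Hypothetical parent package (acme in the real code) that hosts the alias.
parent = types.ModuleType("alias")
parent.pkg = real
sys.modules["alias"] = parent

# Same traversal as above: every "realpkg" entry gains an "alias.pkg" name.
for mod in list(sys.modules):
    if mod == "realpkg" or mod.startswith("realpkg."):
        sys.modules["alias." + mod.replace("realpkg", "pkg", 1)] = sys.modules[mod]

import alias.pkg  # resolved straight from sys.modules, no file on disk needed

assert alias.pkg is real  # identities are preserved, as the comment says
assert alias.pkg.VALUE == 42
```

Because the import system consults `sys.modules` first, the alias resolves to the very same module object, which is why `acme.jose.*` stays identical to `josepy.*`.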
@@ -3,21 +3,28 @@ import abc
import functools
import hashlib
import logging
import socket

from cryptography.hazmat.primitives import hashes  # type: ignore
import josepy as jose
from cryptography.hazmat.primitives import hashes
import OpenSSL
import requests
import six

from acme import dns_resolver
from acme import errors
from acme import crypto_util
from acme import fields
from acme import jose

logger = logging.getLogger(__name__)


# pylint: disable=too-few-public-methods


class Challenge(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    # _fields_to_partial_json | pylint: disable=abstract-method
    """ACME challenge."""
    TYPES = {}  # type: dict
    TYPES = {}

    @classmethod
    def from_json(cls, jobj):
@@ -29,9 +36,9 @@ class Challenge(jose.TypedJSONObjectWithFields):


class ChallengeResponse(jose.TypedJSONObjectWithFields):
    # _fields_to_partial_json
    # _fields_to_partial_json | pylint: disable=abstract-method
    """ACME challenge response."""
    TYPES = {}  # type: dict
    TYPES = {}
    resource_type = 'challenge'
    resource = fields.Resource(resource_type)

@@ -54,7 +61,8 @@ class UnrecognizedChallenge(Challenge):
        object.__setattr__(self, "jobj", jobj)

    def to_partial_json(self):
        return self.jobj  # pylint: disable=no-member
        # pylint: disable=no-member
        return self.jobj

    @classmethod
    def from_json(cls, jobj):
@@ -87,7 +95,6 @@ class _TokenChallenge(Challenge):
        """
        # TODO: check that path combined with uri does not go above
        # URI_ROOT_PATH!
        # pylint: disable=unsupported-membership-test
        return b'..' not in self.token and b'/' not in self.token


@@ -112,7 +119,7 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):
        :rtype: bool

        """
        parts = self.key_authorization.split('.')
        parts = self.key_authorization.split('.')  # pylint: disable=no-member
        if len(parts) != 2:
            logger.debug("Key authorization (%r) is not well formed",
                         self.key_authorization)
@@ -132,21 +139,17 @@ class KeyAuthorizationChallengeResponse(ChallengeResponse):

        return True

    def to_partial_json(self):
        jobj = super(KeyAuthorizationChallengeResponse, self).to_partial_json()
        jobj.pop('keyAuthorization', None)
        return jobj


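The well-formedness check above relies on the key authorization format defined by the ACME spec: `<token>.<JWK thumbprint>`, joined by exactly one dot. A minimal standalone sketch of that check (the token and thumbprint strings below are made-up placeholders, not real ACME values):

```python
def is_well_formed_key_authorization(key_authorization):
    # A key authorization must look like "<token>.<jwk-thumbprint>":
    # splitting on '.' must yield exactly two non-empty parts.
    parts = key_authorization.split('.')
    return len(parts) == 2 and all(parts)

# Placeholder values for illustration only.
assert is_well_formed_key_authorization("sometoken.somethumbprint")
assert not is_well_formed_key_authorization("missing-separator")
assert not is_well_formed_key_authorization("too.many.dots")
```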
@six.add_metaclass(abc.ABCMeta)
class KeyAuthorizationChallenge(_TokenChallenge):
    # pylint: disable=abstract-class-little-used,too-many-ancestors
    """Challenge based on Key Authorization.

    :param response_cls: Subclass of `KeyAuthorizationChallengeResponse`
        that will be used to generate `response`.
    :param str typ: type of the challenge

    """
    typ = NotImplemented
    __metaclass__ = abc.ABCMeta

    response_cls = NotImplemented
    thumbprint_hash_function = (
        KeyAuthorizationChallengeResponse.thumbprint_hash_function)
@@ -171,7 +174,7 @@ class KeyAuthorizationChallenge(_TokenChallenge):
        :rtype: KeyAuthorizationChallengeResponse

        """
        return self.response_cls(  # pylint: disable=not-callable
        return self.response_cls(
            key_authorization=self.key_authorization(account_key))

    @abc.abstractmethod
@@ -180,7 +183,7 @@ class KeyAuthorizationChallenge(_TokenChallenge):

        Subclasses must implement this method, but they are likely to
        return completely different data structures, depending on what's
        necessary to complete the challenge. Interpretation of that
        necessary to complete the challenge. Interepretation of that
        return value must be known to the caller.

        :param JWK account_key:
@@ -208,29 +211,42 @@ class DNS01Response(KeyAuthorizationChallengeResponse):
    """ACME dns-01 challenge response."""
    typ = "dns-01"

    def simple_verify(self, chall, domain, account_public_key):  # pylint: disable=unused-argument
    def simple_verify(self, chall, domain, account_public_key):
        """Simple verify.

        This method no longer checks DNS records and is a simple wrapper
        around `KeyAuthorizationChallengeResponse.verify`.

        :param challenges.DNS01 chall: Corresponding challenge.
        :param unicode domain: Domain name being verified.
        :param JWK account_public_key: Public key for the key pair
            being authorized.

        :return: ``True`` iff verification of the key authorization was
            successful.
        :returns: ``True`` iff validation with the TXT records resolved from a
            DNS server is successful.
        :rtype: bool

        """
        verified = self.verify(chall, account_public_key)
        if not verified:
        if not self.verify(chall, account_public_key):
            logger.debug("Verification of key authorization in response failed")
        return verified
            return False

        validation_domain_name = chall.validation_domain_name(domain)
        validation = chall.validation(account_public_key)
        logger.debug("Verifying %s at %s...", chall.typ, validation_domain_name)

        try:
            txt_records = dns_resolver.txt_records_for_name(
                validation_domain_name)
        except errors.DependencyError:
            raise errors.DependencyError("Local validation for 'dns-01' "
                                         "challenges requires 'dnspython'")
        exists = validation in txt_records
        if not exists:
            logger.debug("Key authorization from response (%r) doesn't match "
                         "any DNS response in %r", self.key_authorization,
                         txt_records)
        return exists


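For reference, the TXT record value compared against `txt_records` above (`chall.validation(...)`) is, per the ACME spec, the unpadded base64url encoding of the SHA-256 digest of the key authorization. A minimal sketch with a placeholder key authorization:

```python
import base64
import hashlib

def dns01_validation(key_authorization):
    # ACME dns-01: TXT record value is base64url(SHA-256(key authorization)),
    # with trailing '=' padding stripped.
    digest = hashlib.sha256(key_authorization.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

# Placeholder key authorization, not a real token/thumbprint pair.
value = dns01_validation("sometoken.somethumbprint")
assert len(value) == 43   # 32 digest bytes -> 43 unpadded base64url chars
assert "=" not in value
```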
@Challenge.register
@Challenge.register  # pylint: disable=too-many-ancestors
class DNS01(KeyAuthorizationChallenge):
    """ACME dns-01 challenge."""
    response_cls = DNS01Response
@@ -320,7 +336,7 @@ class HTTP01Response(KeyAuthorizationChallengeResponse):
        return True


@Challenge.register
@Challenge.register  # pylint: disable=too-many-ancestors
class HTTP01(KeyAuthorizationChallenge):
    """ACME http-01 challenge."""
    response_cls = HTTP01Response

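The http-01 challenge referenced here is served from a fixed well-known path. A small sketch of how the validation URI is assembled; the constant mirrors `HTTP01.URI_ROOT_PATH` in `acme/challenges.py`, and the domain and token below are placeholders:

```python
# Mirrors HTTP01.URI_ROOT_PATH in acme/challenges.py.
URI_ROOT_PATH = ".well-known/acme-challenge"

def http01_uri(domain, encoded_token):
    # The validation server fetches http://<domain>/<root path>/<token>.
    return "http://{0}/{1}/{2}".format(domain, URI_ROOT_PATH, encoded_token)

# Placeholder domain and token, for illustration only.
uri = http01_uri("example.com", "sometoken")
assert uri == "http://example.com/.well-known/acme-challenge/sometoken"
```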
@@ -361,33 +377,149 @@ class HTTP01(KeyAuthorizationChallenge):


@ChallengeResponse.register
class TLSALPN01Response(KeyAuthorizationChallengeResponse):
    """ACME TLS-ALPN-01 challenge response.
class TLSSNI01Response(KeyAuthorizationChallengeResponse):
    """ACME tls-sni-01 challenge response."""
    typ = "tls-sni-01"

    This class only allows initiating a TLS-ALPN-01 challenge returned from the
    CA. Full support for responding to TLS-ALPN-01 challenges by generating and
    serving the expected response certificate is not currently provided.
    """
    typ = "tls-alpn-01"
    DOMAIN_SUFFIX = b".acme.invalid"
    """Domain name suffix."""

    PORT = 443
    """Verification port as defined by the protocol.

@Challenge.register
class TLSALPN01(KeyAuthorizationChallenge):
    """ACME tls-alpn-01 challenge.

    This class simply allows parsing the TLS-ALPN-01 challenge returned from
    the CA. Full TLS-ALPN-01 support is not currently provided.
    You can override it (e.g. for testing) by passing ``port`` to
    `simple_verify`.

    """
    typ = "tls-alpn-01"
    response_cls = TLSALPN01Response

    @property
    def z(self):  # pylint: disable=invalid-name
        """``z`` value used for verification.

        :rtype bytes:

        """
        return hashlib.sha256(
            self.key_authorization.encode("utf-8")).hexdigest().lower().encode()

    @property
    def z_domain(self):
        """Domain name used for verification, generated from `z`.

        :rtype bytes:

        """
        return self.z[:32] + b'.' + self.z[32:] + self.DOMAIN_SUFFIX

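The `z`/`z_domain` derivation above is easy to check standalone: `z` is the lowercase hex SHA-256 digest of the key authorization, and `z_domain` splits it in half with a dot and appends `.acme.invalid`. A sketch with a placeholder key authorization:

```python
import hashlib

DOMAIN_SUFFIX = b".acme.invalid"

def z_domain(key_authorization):
    # z: lowercase hex SHA-256 digest of the key authorization (64 bytes).
    z = hashlib.sha256(
        key_authorization.encode("utf-8")).hexdigest().lower().encode()
    # z_domain: first 32 hex chars, a dot, last 32 hex chars, plus the suffix.
    return z[:32] + b'.' + z[32:] + DOMAIN_SUFFIX

# Placeholder key authorization, for illustration only.
name = z_domain("sometoken.somethumbprint")
assert name.endswith(b".acme.invalid")
assert len(name) == 32 + 1 + 32 + len(b".acme.invalid")  # 78 bytes total
```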
def gen_cert(self, key=None, bits=2048):
|
||||
"""Generate tls-sni-01 certificate.
|
||||
|
||||
:param OpenSSL.crypto.PKey key: Optional private key used in
|
||||
certificate generation. If not provided (``None``), then
|
||||
fresh key will be generated.
|
||||
:param int bits: Number of bits for newly generated key.
|
||||
|
||||
:rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`
|
||||
|
||||
"""
|
||||
if key is None:
|
||||
key = OpenSSL.crypto.PKey()
|
||||
key.generate_key(OpenSSL.crypto.TYPE_RSA, bits)
|
        return crypto_util.gen_ss_cert(key, [
            # z_domain is too big to fit into CN, hence first dummy domain
            'dummy', self.z_domain.decode()], force_san=True), key

    def probe_cert(self, domain, **kwargs):
        """Probe tls-sni-01 challenge certificate.

        :param unicode domain:

        """
        # TODO: domain is not necessary if host is provided
        if "host" not in kwargs:
            host = socket.gethostbyname(domain)
            logging.debug('%s resolved to %s', domain, host)
            kwargs["host"] = host

        kwargs.setdefault("port", self.PORT)
        kwargs["name"] = self.z_domain
        # TODO: try different methods?
        # pylint: disable=protected-access
        return crypto_util.probe_sni(**kwargs)

    def verify_cert(self, cert):
        """Verify tls-sni-01 challenge certificate.

        :param OpenSSL.crypto.X509 cert: Challenge certificate.

        :returns: Whether the certificate was successfully verified.
        :rtype: bool

        """
        # pylint: disable=protected-access
        sans = crypto_util._pyopenssl_cert_or_req_san(cert)
        logging.debug('Certificate %s. SANs: %s', cert.digest('sha1'), sans)
        return self.z_domain.decode() in sans

    def simple_verify(self, chall, domain, account_public_key,
                      cert=None, **kwargs):
        """Simple verify.

        Verify ``validation`` using ``account_public_key``, optionally
        probe tls-sni-01 certificate and check using `verify_cert`.

        :param .challenges.TLSSNI01 chall: Corresponding challenge.
        :param str domain: Domain name being validated.
        :param JWK account_public_key:
        :param OpenSSL.crypto.X509 cert: Optional certificate. If not
            provided (``None``) certificate will be retrieved using
            `probe_cert`.
        :param int port: Port used to probe the certificate.

        :returns: ``True`` iff client's control of the domain has been
            verified.
        :rtype: bool

        """
        if not self.verify(chall, account_public_key):
            logger.debug("Verification of key authorization in response failed")
            return False

        if cert is None:
            try:
                cert = self.probe_cert(domain=domain, **kwargs)
            except errors.Error as error:
                logger.debug(error, exc_info=True)
                return False

        return self.verify_cert(cert)


@Challenge.register  # pylint: disable=too-many-ancestors
class TLSSNI01(KeyAuthorizationChallenge):
    """ACME tls-sni-01 challenge."""
    response_cls = TLSSNI01Response
    typ = response_cls.typ

    # boulder#962, ietf-wg-acme#22
    #n = jose.Field("n", encoder=int, decoder=int)

    def validation(self, account_key, **kwargs):
        """Generate validation for the challenge."""
        raise NotImplementedError()
        """Generate validation.

        :param JWK account_key:
        :param OpenSSL.crypto.PKey cert_key: Optional private key used
            in certificate generation. If not provided (``None``), then
            fresh key will be generated.

        :rtype: `tuple` of `OpenSSL.crypto.X509` and `OpenSSL.crypto.PKey`

        """
        return self.response(account_key).gen_cert(key=kwargs.get('cert_key'))


@Challenge.register
@Challenge.register  # pylint: disable=too-many-ancestors
class DNS(_TokenChallenge):
    """ACME "dns" challenge."""
    typ = "dns"
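The `z_domain` used above as the SNI name is derived from the challenge's key authorization. As a hedged sketch (the function name `tls_sni_01_z_domain` is ours, not the library's): the SHA-256 hex digest of the key authorization is split into two 32-character labels under the reserved `.acme.invalid` zone, matching the two labels seen in the tests below.

```python
import hashlib


def tls_sni_01_z_domain(key_authorization: str) -> str:
    """SNI name for a tls-sni-01 probe: SHA-256 hex digest of the key
    authorization, split into two 32-char labels under .acme.invalid."""
    z = hashlib.sha256(key_authorization.encode("utf-8")).hexdigest()
    return "{0}.{1}.acme.invalid".format(z[:32], z[32:])
```

Because the digest is always 64 hex characters, both labels fit DNS's 63-character label limit, which is why the full value cannot go into a certificate CN and a dummy domain is used there instead.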
@@ -1,12 +1,16 @@
"""Tests for acme.challenges."""
import unittest

import josepy as jose
import mock
import OpenSSL
import requests
from six.moves.urllib import parse as urllib_parse

import test_util
from six.moves.urllib import parse as urllib_parse  # pylint: disable=import-error

from acme import errors
from acme import jose
from acme import test_util
from acme.dns_resolver import DNS_REQUIREMENT

CERT = test_util.load_comparable_cert('cert.pem')
KEY = jose.JWKRSA(key=test_util.load_rsa_private_key('rsa512_key.pem'))
@@ -18,6 +22,7 @@ class ChallengeTest(unittest.TestCase):
        from acme.challenges import Challenge
        from acme.challenges import UnrecognizedChallenge
        chall = UnrecognizedChallenge({"type": "foo"})
        # pylint: disable=no-member
        self.assertEqual(chall, Challenge.from_json(chall.jobj))


@@ -73,6 +78,7 @@ class KeyAuthorizationChallengeResponseTest(unittest.TestCase):


class DNS01ResponseTest(unittest.TestCase):
    # pylint: disable=too-many-instance-attributes

    def setUp(self):
        from acme.challenges import DNS01Response
@@ -86,10 +92,10 @@ class DNS01ResponseTest(unittest.TestCase):
        from acme.challenges import DNS01
        self.chall = DNS01(token=(b'x' * 16))
        self.response = self.chall.response(KEY)
        self.records_for_name_path = "acme.dns_resolver.txt_records_for_name"

    def test_to_partial_json(self):
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import DNS01Response
@@ -99,16 +105,45 @@ class DNS01ResponseTest(unittest.TestCase):
        from acme.challenges import DNS01Response
        hash(DNS01Response.from_json(self.jmsg))

    def test_simple_verify_failure(self):
    def test_simple_verify_bad_key_authorization(self):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
        public_key = key2.public_key()
        verified = self.response.simple_verify(self.chall, "local", public_key)
        self.assertFalse(verified)
        self.response.simple_verify(self.chall, "local", key2.public_key())

    def test_simple_verify_success(self):
        public_key = KEY.public_key()
        verified = self.response.simple_verify(self.chall, "local", public_key)
        self.assertTrue(verified)
    @mock.patch('acme.dns_resolver.DNS_AVAILABLE', False)
    def test_simple_verify_without_dns(self):
        self.assertRaises(
            errors.DependencyError, self.response.simple_verify,
            self.chall, 'local', KEY.public_key())

    @test_util.skip_unless(test_util.requirement_available(DNS_REQUIREMENT),
                           "optional dependency dnspython is not available")
    def test_simple_verify_good_validation(self):  # pragma: no cover
        with mock.patch(self.records_for_name_path) as mock_resolver:
            mock_resolver.return_value = [
                self.chall.validation(KEY.public_key())]
            self.assertTrue(self.response.simple_verify(
                self.chall, "local", KEY.public_key()))
        mock_resolver.assert_called_once_with(
            self.chall.validation_domain_name("local"))

    @test_util.skip_unless(test_util.requirement_available(DNS_REQUIREMENT),
                           "optional dependency dnspython is not available")
    def test_simple_verify_good_validation_multitxts(self):  # pragma: no cover
        with mock.patch(self.records_for_name_path) as mock_resolver:
            mock_resolver.return_value = [
                "!", self.chall.validation(KEY.public_key())]
            self.assertTrue(self.response.simple_verify(
                self.chall, "local", KEY.public_key()))
        mock_resolver.assert_called_once_with(
            self.chall.validation_domain_name("local"))

    @test_util.skip_unless(test_util.requirement_available(DNS_REQUIREMENT),
                           "optional dependency dnspython is not available")
    def test_simple_verify_bad_validation(self):  # pragma: no cover
        with mock.patch(self.records_for_name_path) as mock_resolver:
            mock_resolver.return_value = ["!"]
            self.assertFalse(self.response.simple_verify(
                self.chall, "local", KEY.public_key()))


class DNS01Test(unittest.TestCase):
@@ -144,6 +179,7 @@ class DNS01Test(unittest.TestCase):


class HTTP01ResponseTest(unittest.TestCase):
    # pylint: disable=too-many-instance-attributes

    def setUp(self):
        from acme.challenges import HTTP01Response
@@ -159,8 +195,7 @@ class HTTP01ResponseTest(unittest.TestCase):
        self.response = self.chall.response(KEY)

    def test_to_partial_json(self):
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import HTTP01Response
@@ -253,42 +288,114 @@ class HTTP01Test(unittest.TestCase):
            self.msg.update(token=b'..').good_token)


class TLSALPN01ResponseTest(unittest.TestCase):
class TLSSNI01ResponseTest(unittest.TestCase):
    # pylint: disable=too-many-instance-attributes

    def setUp(self):
        from acme.challenges import TLSALPN01Response
        self.msg = TLSALPN01Response(key_authorization=u'foo')
        from acme.challenges import TLSSNI01
        self.chall = TLSSNI01(
            token=jose.b64decode(b'a82d5ff8ef740d12881f6d3c2277ab2e'))

        self.response = self.chall.response(KEY)
        self.jmsg = {
            'resource': 'challenge',
            'type': 'tls-alpn-01',
            'keyAuthorization': u'foo',
            'type': 'tls-sni-01',
            'keyAuthorization': self.response.key_authorization,
        }

        from acme.challenges import TLSALPN01
        self.chall = TLSALPN01(token=(b'x' * 16))
        self.response = self.chall.response(KEY)
        # pylint: disable=invalid-name
        label1 = b'dc38d9c3fa1a4fdcc3a5501f2d38583f'
        label2 = b'b7793728f084394f2a1afd459556bb5c'
        self.z = label1 + label2
        self.z_domain = label1 + b'.' + label2 + b'.acme.invalid'
        self.domain = 'foo.com'

    def test_z_and_domain(self):
        self.assertEqual(self.z, self.response.z)
        self.assertEqual(self.z_domain, self.response.z_domain)

    def test_to_partial_json(self):
        self.assertEqual({k: v for k, v in self.jmsg.items() if k != 'keyAuthorization'},
                         self.msg.to_partial_json())
        self.assertEqual(self.jmsg, self.response.to_partial_json())

    def test_from_json(self):
        from acme.challenges import TLSALPN01Response
        self.assertEqual(self.msg, TLSALPN01Response.from_json(self.jmsg))
        from acme.challenges import TLSSNI01Response
        self.assertEqual(self.response, TLSSNI01Response.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSALPN01Response
        hash(TLSALPN01Response.from_json(self.jmsg))
        from acme.challenges import TLSSNI01Response
        hash(TLSSNI01Response.from_json(self.jmsg))

    @mock.patch('acme.challenges.socket.gethostbyname')
    @mock.patch('acme.challenges.crypto_util.probe_sni')
    def test_probe_cert(self, mock_probe_sni, mock_gethostbyname):
        mock_gethostbyname.return_value = '127.0.0.1'
        self.response.probe_cert('foo.com')
        mock_gethostbyname.assert_called_once_with('foo.com')
        mock_probe_sni.assert_called_once_with(
            host='127.0.0.1', port=self.response.PORT,
            name=self.z_domain)

        self.response.probe_cert('foo.com', host='8.8.8.8')
        mock_probe_sni.assert_called_with(
            host='8.8.8.8', port=mock.ANY, name=mock.ANY)

        self.response.probe_cert('foo.com', port=1234)
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=1234, name=mock.ANY)

        self.response.probe_cert('foo.com', bar='baz')
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=mock.ANY, name=mock.ANY, bar='baz')

        self.response.probe_cert('foo.com', name=b'xxx')
        mock_probe_sni.assert_called_with(
            host=mock.ANY, port=mock.ANY,
            name=self.z_domain)

    def test_gen_verify_cert(self):
        key1 = test_util.load_pyopenssl_private_key('rsa512_key.pem')
        cert, key2 = self.response.gen_cert(key1)
        self.assertEqual(key1, key2)
        self.assertTrue(self.response.verify_cert(cert))

    def test_gen_verify_cert_gen_key(self):
        cert, key = self.response.gen_cert()
        self.assertTrue(isinstance(key, OpenSSL.crypto.PKey))
        self.assertTrue(self.response.verify_cert(cert))

    def test_verify_bad_cert(self):
        self.assertFalse(self.response.verify_cert(
            test_util.load_cert('cert.pem')))

    def test_simple_verify_bad_key_authorization(self):
        key2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))
        self.response.simple_verify(self.chall, "local", key2.public_key())

    @mock.patch('acme.challenges.TLSSNI01Response.verify_cert', autospec=True)
    def test_simple_verify(self, mock_verify_cert):
        mock_verify_cert.return_value = mock.sentinel.verification
        self.assertEqual(
            mock.sentinel.verification, self.response.simple_verify(
                self.chall, self.domain, KEY.public_key(),
                cert=mock.sentinel.cert))
        mock_verify_cert.assert_called_once_with(
            self.response, mock.sentinel.cert)

    @mock.patch('acme.challenges.TLSSNI01Response.probe_cert')
    def test_simple_verify_false_on_probe_error(self, mock_probe_cert):
        mock_probe_cert.side_effect = errors.Error
        self.assertFalse(self.response.simple_verify(
            self.chall, self.domain, KEY.public_key()))


class TLSALPN01Test(unittest.TestCase):
class TLSSNI01Test(unittest.TestCase):

    def setUp(self):
        from acme.challenges import TLSALPN01
        self.msg = TLSALPN01(
        from acme.challenges import TLSSNI01
        self.msg = TLSSNI01(
            token=jose.b64decode('a82d5ff8ef740d12881f6d3c2277ab2e'))
        self.jmsg = {
            'type': 'tls-alpn-01',
            'type': 'tls-sni-01',
            'token': 'a82d5ff8ef740d12881f6d3c2277ab2e',
        }

@@ -296,21 +403,25 @@ class TLSALPN01Test(unittest.TestCase):
        self.assertEqual(self.jmsg, self.msg.to_partial_json())

    def test_from_json(self):
        from acme.challenges import TLSALPN01
        self.assertEqual(self.msg, TLSALPN01.from_json(self.jmsg))
        from acme.challenges import TLSSNI01
        self.assertEqual(self.msg, TLSSNI01.from_json(self.jmsg))

    def test_from_json_hashable(self):
        from acme.challenges import TLSALPN01
        hash(TLSALPN01.from_json(self.jmsg))
        from acme.challenges import TLSSNI01
        hash(TLSSNI01.from_json(self.jmsg))

    def test_from_json_invalid_token_length(self):
        from acme.challenges import TLSALPN01
        from acme.challenges import TLSSNI01
        self.jmsg['token'] = jose.encode_b64jose(b'abcd')
        self.assertRaises(
            jose.DeserializationError, TLSALPN01.from_json, self.jmsg)
            jose.DeserializationError, TLSSNI01.from_json, self.jmsg)

    def test_validation(self):
        self.assertRaises(NotImplementedError, self.msg.validation, KEY)
    @mock.patch('acme.challenges.TLSSNI01Response.gen_cert')
    def test_validation(self, mock_gen_cert):
        mock_gen_cert.return_value = ('cert', 'key')
        self.assertEqual(('cert', 'key'), self.msg.validation(
            KEY, cert_key=mock.sentinel.cert_key))
        mock_gen_cert.assert_called_once_with(key=mock.sentinel.cert_key)


class DNSTest(unittest.TestCase):
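The dns-01 tests above mock `acme.dns_resolver.txt_records_for_name` and compare the returned TXT values against `chall.validation(...)`. As a hedged sketch of what that validation value is under the ACME specification (helper names `dns01_validation` and `validation_domain_name` are illustrative, not the library's API): the TXT record holds the unpadded URL-safe base64 encoding of the SHA-256 digest of the key authorization, provisioned under the `_acme-challenge` label.

```python
import base64
import hashlib


def dns01_validation(key_authorization: str) -> str:
    """TXT record value for dns-01: URL-safe base64 (no '=' padding)
    of the SHA-256 digest of the key authorization."""
    digest = hashlib.sha256(key_authorization.encode("utf-8")).digest()
    return base64.urlsafe_b64encode(digest).decode("ascii").rstrip("=")


def validation_domain_name(domain: str) -> str:
    """Name at which the TXT record is provisioned."""
    return "_acme-challenge." + domain
```

A 32-byte digest always encodes to 43 characters once padding is stripped, so a resolver can cheaply reject malformed records by length before comparing.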
File diff suppressed because it is too large. Load Diff

acme/acme/client_test.py (new file, 666 lines):
@@ -0,0 +1,666 @@
"""Tests for acme.client."""
import datetime
import json
import unittest

from six.moves import http_client  # pylint: disable=import-error

import mock
import requests

from acme import challenges
from acme import errors
from acme import jose
from acme import jws as acme_jws
from acme import messages
from acme import messages_test
from acme import test_util


CERT_DER = test_util.load_vector('cert.der')
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
KEY2 = jose.JWKRSA.load(test_util.load_vector('rsa256_key.pem'))


class ClientTest(unittest.TestCase):
    """Tests for acme.client.Client."""
    # pylint: disable=too-many-instance-attributes,too-many-public-methods

    def setUp(self):
        self.response = mock.MagicMock(
            ok=True, status_code=http_client.OK, headers={}, links={})
        self.net = mock.MagicMock()
        self.net.post.return_value = self.response
        self.net.get.return_value = self.response

        self.directory = messages.Directory({
            messages.NewRegistration:
                'https://www.letsencrypt-demo.org/acme/new-reg',
            messages.Revocation:
                'https://www.letsencrypt-demo.org/acme/revoke-cert',
            messages.NewAuthorization:
                'https://www.letsencrypt-demo.org/acme/new-authz',
        })

        from acme.client import Client
        self.client = Client(
            directory=self.directory, key=KEY, alg=jose.RS256, net=self.net)

        self.identifier = messages.Identifier(
            typ=messages.IDENTIFIER_FQDN, value='example.com')

        # Registration
        self.contact = ('mailto:cert-admin@example.com', 'tel:+12025551212')
        reg = messages.Registration(
            contact=self.contact, key=KEY.public_key())
        self.new_reg = messages.NewRegistration(**dict(reg))
        self.regr = messages.RegistrationResource(
            body=reg, uri='https://www.letsencrypt-demo.org/acme/reg/1',
            new_authzr_uri='https://www.letsencrypt-demo.org/acme/new-reg',
            terms_of_service='https://www.letsencrypt-demo.org/tos')

        # Authorization
        authzr_uri = 'https://www.letsencrypt-demo.org/acme/authz/1'
        challb = messages.ChallengeBody(
            uri=(authzr_uri + '/1'), status=messages.STATUS_VALID,
            chall=challenges.DNS(token=jose.b64decode(
                'evaGxfADs6pSRb2LAv9IZf17Dt3juxGJ-PCt92wr-oA')))
        self.challr = messages.ChallengeResource(
            body=challb, authzr_uri=authzr_uri)
        self.authz = messages.Authorization(
            identifier=messages.Identifier(
                typ=messages.IDENTIFIER_FQDN, value='example.com'),
            challenges=(challb,), combinations=None)
        self.authzr = messages.AuthorizationResource(
            body=self.authz, uri=authzr_uri,
            new_cert_uri='https://www.letsencrypt-demo.org/acme/new-cert')

        # Request issuance
        self.certr = messages.CertificateResource(
            body=messages_test.CERT, authzrs=(self.authzr,),
            uri='https://www.letsencrypt-demo.org/acme/cert/1',
            cert_chain_uri='https://www.letsencrypt-demo.org/ca')

    def test_init_downloads_directory(self):
        uri = 'http://www.letsencrypt-demo.org/directory'
        from acme.client import Client
        self.client = Client(
            directory=uri, key=KEY, alg=jose.RS256, net=self.net)
        self.net.get.assert_called_once_with(uri)

    def test_register(self):
        # "Instance of 'Field' has no to_json/update member" bug:
        # pylint: disable=no-member
        self.response.status_code = http_client.CREATED
        self.response.json.return_value = self.regr.body.to_json()
        self.response.headers['Location'] = self.regr.uri
        self.response.links.update({
            'next': {'url': self.regr.new_authzr_uri},
            'terms-of-service': {'url': self.regr.terms_of_service},
        })

        self.assertEqual(self.regr, self.client.register(self.new_reg))
        # TODO: test POST call arguments

    def test_register_missing_next(self):
        self.response.status_code = http_client.CREATED
        self.assertRaises(
            errors.ClientError, self.client.register, self.new_reg)

    def test_update_registration(self):
        # "Instance of 'Field' has no to_json/update member" bug:
        # pylint: disable=no-member
        self.response.headers['Location'] = self.regr.uri
        self.response.json.return_value = self.regr.body.to_json()
        self.assertEqual(self.regr, self.client.update_registration(self.regr))
        # TODO: test POST call arguments

        # TODO: split here and separate test
        self.response.json.return_value = self.regr.body.update(
            contact=()).to_json()
        self.assertRaises(
            errors.UnexpectedUpdate, self.client.update_registration, self.regr)

    def test_query_registration(self):
        self.response.json.return_value = self.regr.body.to_json()
        self.assertEqual(self.regr, self.client.query_registration(self.regr))

    def test_query_registration_updates_new_authzr_uri(self):
        self.response.json.return_value = self.regr.body.to_json()
        self.response.links = {'next': {'url': 'UPDATED'}}
        self.assertEqual(
            'UPDATED',
            self.client.query_registration(self.regr).new_authzr_uri)

    def test_agree_to_tos(self):
        self.client.update_registration = mock.Mock()
        self.client.agree_to_tos(self.regr)
        regr = self.client.update_registration.call_args[0][0]
        self.assertEqual(self.regr.terms_of_service, regr.body.agreement)

    def _prepare_response_for_request_challenges(self):
        self.response.status_code = http_client.CREATED
        self.response.headers['Location'] = self.authzr.uri
        self.response.json.return_value = self.authz.to_json()
        self.response.links = {
            'next': {'url': self.authzr.new_cert_uri},
        }

    def test_request_challenges(self):
        self._prepare_response_for_request_challenges()
        self.client.request_challenges(self.identifier)
        self.net.post.assert_called_once_with(
            self.directory.new_authz,
            messages.NewAuthorization(identifier=self.identifier))

    def test_requets_challenges_custom_uri(self):
        self._prepare_response_for_request_challenges()
        self.client.request_challenges(self.identifier, 'URI')
        self.net.post.assert_called_once_with('URI', mock.ANY)

    def test_request_challenges_unexpected_update(self):
        self._prepare_response_for_request_challenges()
        self.response.json.return_value = self.authz.update(
            identifier=self.identifier.update(value='foo')).to_json()
        self.assertRaises(
            errors.UnexpectedUpdate, self.client.request_challenges,
            self.identifier, self.authzr.uri)

    def test_request_challenges_missing_next(self):
        self.response.status_code = http_client.CREATED
        self.assertRaises(errors.ClientError, self.client.request_challenges,
                          self.identifier)

    def test_request_domain_challenges(self):
        self.client.request_challenges = mock.MagicMock()
        self.assertEqual(
            self.client.request_challenges(self.identifier),
            self.client.request_domain_challenges('example.com'))

    def test_request_domain_challenges_custom_uri(self):
        self.client.request_challenges = mock.MagicMock()
        self.assertEqual(
            self.client.request_challenges(self.identifier, 'URI'),
            self.client.request_domain_challenges('example.com', 'URI'))

    def test_answer_challenge(self):
        self.response.links['up'] = {'url': self.challr.authzr_uri}
        self.response.json.return_value = self.challr.body.to_json()

        chall_response = challenges.DNSResponse(validation=None)

        self.client.answer_challenge(self.challr.body, chall_response)

        # TODO: split here and separate test
        self.assertRaises(errors.UnexpectedUpdate, self.client.answer_challenge,
                          self.challr.body.update(uri='foo'), chall_response)

    def test_answer_challenge_missing_next(self):
        self.assertRaises(
            errors.ClientError, self.client.answer_challenge,
            self.challr.body, challenges.DNSResponse(validation=None))

    def test_retry_after_date(self):
        self.response.headers['Retry-After'] = 'Fri, 31 Dec 1999 23:59:59 GMT'
        self.assertEqual(
            datetime.datetime(1999, 12, 31, 23, 59, 59),
            self.client.retry_after(response=self.response, default=10))

    @mock.patch('acme.client.datetime')
    def test_retry_after_invalid(self, dt_mock):
        dt_mock.datetime.now.return_value = datetime.datetime(2015, 3, 27)
        dt_mock.timedelta = datetime.timedelta

        self.response.headers['Retry-After'] = 'foooo'
        self.assertEqual(
            datetime.datetime(2015, 3, 27, 0, 0, 10),
            self.client.retry_after(response=self.response, default=10))

    @mock.patch('acme.client.datetime')
    def test_retry_after_overflow(self, dt_mock):
        dt_mock.datetime.now.return_value = datetime.datetime(2015, 3, 27)
        dt_mock.timedelta = datetime.timedelta
        dt_mock.datetime.side_effect = datetime.datetime

        self.response.headers['Retry-After'] = "Tue, 116 Feb 2016 11:50:00 MST"
        self.assertEqual(
            datetime.datetime(2015, 3, 27, 0, 0, 10),
            self.client.retry_after(response=self.response, default=10))

    @mock.patch('acme.client.datetime')
    def test_retry_after_seconds(self, dt_mock):
        dt_mock.datetime.now.return_value = datetime.datetime(2015, 3, 27)
        dt_mock.timedelta = datetime.timedelta

        self.response.headers['Retry-After'] = '50'
        self.assertEqual(
            datetime.datetime(2015, 3, 27, 0, 0, 50),
            self.client.retry_after(response=self.response, default=10))

    @mock.patch('acme.client.datetime')
    def test_retry_after_missing(self, dt_mock):
        dt_mock.datetime.now.return_value = datetime.datetime(2015, 3, 27)
        dt_mock.timedelta = datetime.timedelta

        self.assertEqual(
            datetime.datetime(2015, 3, 27, 0, 0, 10),
            self.client.retry_after(response=self.response, default=10))
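The `retry_after` tests above cover four cases: an HTTP date, an integer number of seconds, an unparseable value, and a missing header. A hedged standalone sketch of that behavior (this is our reconstruction from the test cases, not the library's actual implementation):

```python
import datetime
import email.utils


def retry_after(headers, default):
    """Next poll time from a Retry-After header, which may be an integer
    number of seconds or an HTTP date; fall back to `default` seconds on
    a missing, unparseable, or out-of-range value."""
    now = datetime.datetime.now()
    value = headers.get("Retry-After")
    if value is not None:
        try:
            # integer form, e.g. "50"
            return now + datetime.timedelta(seconds=int(value))
        except ValueError:
            pass
        try:
            # HTTP-date form, e.g. "Fri, 31 Dec 1999 23:59:59 GMT";
            # a bogus field like day 116 makes datetime() raise ValueError
            parsed = email.utils.parsedate_tz(value)
            if parsed is not None:
                return datetime.datetime(*parsed[:6])
        except (ValueError, OverflowError):
            pass
    return now + datetime.timedelta(seconds=default)
```

Treating every failure mode as "use the default" keeps polling robust against servers that emit malformed headers.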
    def test_poll(self):
        self.response.json.return_value = self.authzr.body.to_json()
        self.assertEqual((self.authzr, self.response),
                         self.client.poll(self.authzr))

        # TODO: split here and separate test
        self.response.json.return_value = self.authz.update(
            identifier=self.identifier.update(value='foo')).to_json()
        self.assertRaises(
            errors.UnexpectedUpdate, self.client.poll, self.authzr)

    def test_request_issuance(self):
        self.response.content = CERT_DER
        self.response.headers['Location'] = self.certr.uri
        self.response.links['up'] = {'url': self.certr.cert_chain_uri}
        self.assertEqual(self.certr, self.client.request_issuance(
            messages_test.CSR, (self.authzr,)))
        # TODO: check POST args

    def test_request_issuance_missing_up(self):
        self.response.content = CERT_DER
        self.response.headers['Location'] = self.certr.uri
        self.assertEqual(
            self.certr.update(cert_chain_uri=None),
            self.client.request_issuance(messages_test.CSR, (self.authzr,)))

    def test_request_issuance_missing_location(self):
        self.assertRaises(
            errors.ClientError, self.client.request_issuance,
            messages_test.CSR, (self.authzr,))

    @mock.patch('acme.client.datetime')
    @mock.patch('acme.client.time')
    def test_poll_and_request_issuance(self, time_mock, dt_mock):
        # clock.dt | pylint: disable=no-member
        clock = mock.MagicMock(dt=datetime.datetime(2015, 3, 27))

        def sleep(seconds):
            """increment clock"""
            clock.dt += datetime.timedelta(seconds=seconds)
        time_mock.sleep.side_effect = sleep

        def now():
            """return current clock value"""
            return clock.dt
        dt_mock.datetime.now.side_effect = now
        dt_mock.timedelta = datetime.timedelta

        def poll(authzr):  # pylint: disable=missing-docstring
            # record poll start time based on the current clock value
            authzr.times.append(clock.dt)

            # suppose it takes 2 seconds for server to produce the
            # result, increment clock
            clock.dt += datetime.timedelta(seconds=2)

            if len(authzr.retries) == 1:  # no more retries
                done = mock.MagicMock(uri=authzr.uri, times=authzr.times)
                done.body.status = authzr.retries[0]
                return done, []

            # response (2nd result tuple element) is reduced to only
            # Retry-After header contents represented as integer
            # seconds; authzr.retries is a list of Retry-After
            # headers, head(retries) is peeled off as a current
            # Retry-After header, and tail(retries) is persisted for
            # later poll() calls
            return (mock.MagicMock(retries=authzr.retries[1:],
                                   uri=authzr.uri + '.', times=authzr.times),
                    authzr.retries[0])
        self.client.poll = mock.MagicMock(side_effect=poll)

        mintime = 7

        def retry_after(response, default):
            # pylint: disable=missing-docstring
            # check that poll_and_request_issuance correctly passes mintime
            self.assertEqual(default, mintime)
            return clock.dt + datetime.timedelta(seconds=response)
        self.client.retry_after = mock.MagicMock(side_effect=retry_after)

        def request_issuance(csr, authzrs):  # pylint: disable=missing-docstring
            return csr, authzrs
        self.client.request_issuance = mock.MagicMock(
            side_effect=request_issuance)

        csr = mock.MagicMock()
        authzrs = (
            mock.MagicMock(uri='a', times=[], retries=(
                8, 20, 30, messages.STATUS_VALID)),
            mock.MagicMock(uri='b', times=[], retries=(
                5, messages.STATUS_VALID)),
        )

        cert, updated_authzrs = self.client.poll_and_request_issuance(
            csr, authzrs, mintime=mintime,
            # make sure that max_attempts is per-authorization, rather
            # than global
            max_attempts=max(len(authzrs[0].retries), len(authzrs[1].retries)))
        self.assertTrue(cert[0] is csr)
        self.assertTrue(cert[1] is updated_authzrs)
        self.assertEqual(updated_authzrs[0].uri, 'a...')
        self.assertEqual(updated_authzrs[1].uri, 'b.')
        self.assertEqual(updated_authzrs[0].times, [
            datetime.datetime(2015, 3, 27),
            # a is scheduled for 10, but b is polling [9..11), so it
            # will be picked up as soon as b is finished, without
            # additional sleeping
            datetime.datetime(2015, 3, 27, 0, 0, 11),
            datetime.datetime(2015, 3, 27, 0, 0, 33),
            datetime.datetime(2015, 3, 27, 0, 1, 5),
        ])
        self.assertEqual(updated_authzrs[1].times, [
            datetime.datetime(2015, 3, 27, 0, 0, 2),
            datetime.datetime(2015, 3, 27, 0, 0, 9),
        ])
        self.assertEqual(clock.dt, datetime.datetime(2015, 3, 27, 0, 1, 7))

        # CA sets invalid | TODO: move to a separate test
        invalid_authzr = mock.MagicMock(
            times=[], retries=[messages.STATUS_INVALID])
        self.assertRaises(
            errors.PollError, self.client.poll_and_request_issuance,
            csr, authzrs=(invalid_authzr,), mintime=mintime)

        # exceeded max_attempts | TODO: move to a separate test
        self.assertRaises(
            errors.PollError, self.client.poll_and_request_issuance,
            csr, authzrs, mintime=mintime, max_attempts=2)

    def test_check_cert(self):
        self.response.headers['Location'] = self.certr.uri
        self.response.content = CERT_DER
        self.assertEqual(self.certr.update(body=messages_test.CERT),
                         self.client.check_cert(self.certr))

        # TODO: split here and separate test
        self.response.headers['Location'] = 'foo'
        self.assertRaises(
            errors.UnexpectedUpdate, self.client.check_cert, self.certr)

    def test_check_cert_missing_location(self):
        self.response.content = CERT_DER
        self.assertRaises(
            errors.ClientError, self.client.check_cert, self.certr)

    def test_refresh(self):
        self.client.check_cert = mock.MagicMock()
        self.assertEqual(
            self.client.check_cert(self.certr), self.client.refresh(self.certr))

    def test_fetch_chain_no_up_link(self):
        self.assertEqual([], self.client.fetch_chain(self.certr.update(
            cert_chain_uri=None)))

    def test_fetch_chain_single(self):
        # pylint: disable=protected-access
        self.client._get_cert = mock.MagicMock()
        self.client._get_cert.return_value = (
            mock.MagicMock(links={}), "certificate")
        self.assertEqual([self.client._get_cert(self.certr.cert_chain_uri)[1]],
                         self.client.fetch_chain(self.certr))

    def test_fetch_chain_max(self):
        # pylint: disable=protected-access
        up_response = mock.MagicMock(links={'up': {'url': 'http://cert'}})
        noup_response = mock.MagicMock(links={})
        self.client._get_cert = mock.MagicMock()
        self.client._get_cert.side_effect = [
            (up_response, "cert")] * 9 + [(noup_response, "last_cert")]
        chain = self.client.fetch_chain(self.certr, max_length=10)
        self.assertEqual(chain, ["cert"] * 9 + ["last_cert"])

    def test_fetch_chain_too_many(self):  # recursive
        # pylint: disable=protected-access
        response = mock.MagicMock(links={'up': {'url': 'http://cert'}})
        self.client._get_cert = mock.MagicMock()
        self.client._get_cert.return_value = (response, "certificate")
        self.assertRaises(errors.Error, self.client.fetch_chain, self.certr)

    def test_revoke(self):
        self.client.revoke(self.certr.body)
        self.net.post.assert_called_once_with(
            self.directory[messages.Revocation], mock.ANY, content_type=None)

    def test_revoke_bad_status_raises_error(self):
        self.response.status_code = http_client.METHOD_NOT_ALLOWED
        self.assertRaises(errors.ClientError, self.client.revoke, self.certr)


class ClientNetworkTest(unittest.TestCase):
    """Tests for acme.client.ClientNetwork."""

    def setUp(self):
        self.verify_ssl = mock.MagicMock()
        self.wrap_in_jws = mock.MagicMock(return_value=mock.sentinel.wrapped)

        from acme.client import ClientNetwork
        self.net = ClientNetwork(
            key=KEY, alg=jose.RS256, verify_ssl=self.verify_ssl,
            user_agent='acme-python-test')

        self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
        self.response.headers = {}
        self.response.links = {}

    def test_init(self):
        self.assertTrue(self.net.verify_ssl is self.verify_ssl)

    def test_wrap_in_jws(self):
        class MockJSONDeSerializable(jose.JSONDeSerializable):
            # pylint: disable=missing-docstring
            def __init__(self, value):
                self.value = value

            def to_partial_json(self):
                return {'foo': self.value}

            @classmethod
            def from_json(cls, value):
                pass  # pragma: no cover

        # pylint: disable=protected-access
        jws_dump = self.net._wrap_in_jws(
            MockJSONDeSerializable('foo'), nonce=b'Tg')
        jws = acme_jws.JWS.json_loads(jws_dump)
        self.assertEqual(json.loads(jws.payload.decode()), {'foo': 'foo'})
        self.assertEqual(jws.signature.combined.nonce, b'Tg')

    def test_check_response_not_ok_jobj_no_error(self):
        self.response.ok = False
        self.response.json.return_value = {}
        with mock.patch('acme.client.messages.Error.from_json') as from_json:
            from_json.side_effect = jose.DeserializationError
            # pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
errors.ClientError, self.net._check_response, self.response)
|
||||
|
||||
def test_check_response_not_ok_jobj_error(self):
|
||||
self.response.ok = False
|
||||
self.response.json.return_value = messages.Error(
|
||||
detail='foo', typ='serverInternal', title='some title').to_json()
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
messages.Error, self.net._check_response, self.response)
|
||||
|
||||
def test_check_response_not_ok_no_jobj(self):
|
||||
self.response.ok = False
|
||||
self.response.json.side_effect = ValueError
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
errors.ClientError, self.net._check_response, self.response)
|
||||
|
||||
def test_check_response_ok_no_jobj_ct_required(self):
|
||||
self.response.json.side_effect = ValueError
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(
|
||||
errors.ClientError, self.net._check_response, self.response,
|
||||
content_type=self.net.JSON_CONTENT_TYPE)
|
||||
|
||||
def test_check_response_ok_no_jobj_no_ct(self):
|
||||
self.response.json.side_effect = ValueError
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
def test_check_response_jobj(self):
|
||||
self.response.json.return_value = {}
|
||||
for response_ct in [self.net.JSON_CONTENT_TYPE, 'foo']:
|
||||
self.response.headers['Content-Type'] = response_ct
|
||||
# pylint: disable=protected-access,no-value-for-parameter
|
||||
self.assertEqual(
|
||||
self.response, self.net._check_response(self.response))
|
||||
|
||||
def test_send_request(self):
|
||||
self.net.session = mock.MagicMock()
|
||||
self.net.session.request.return_value = self.response
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.response, self.net._send_request(
|
||||
'HEAD', 'http://example.com/', 'foo', bar='baz'))
|
||||
self.net.session.request.assert_called_once_with(
|
||||
'HEAD', 'http://example.com/', 'foo',
|
||||
headers=mock.ANY, verify=mock.ANY, bar='baz')
|
||||
|
||||
def test_send_request_verify_ssl(self):
|
||||
# pylint: disable=protected-access
|
||||
for verify in True, False:
|
||||
self.net.session = mock.MagicMock()
|
||||
self.net.session.request.return_value = self.response
|
||||
self.net.verify_ssl = verify
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(
|
||||
self.response,
|
||||
self.net._send_request('GET', 'http://example.com/'))
|
||||
self.net.session.request.assert_called_once_with(
|
||||
'GET', 'http://example.com/', verify=verify, headers=mock.ANY)
|
||||
|
||||
def test_send_request_user_agent(self):
|
||||
self.net.session = mock.MagicMock()
|
||||
# pylint: disable=protected-access
|
||||
self.net._send_request('GET', 'http://example.com/',
|
||||
headers={'bar': 'baz'})
|
||||
self.net.session.request.assert_called_once_with(
|
||||
'GET', 'http://example.com/', verify=mock.ANY,
|
||||
headers={'User-Agent': 'acme-python-test', 'bar': 'baz'})
|
||||
|
||||
self.net._send_request('GET', 'http://example.com/',
|
||||
headers={'User-Agent': 'foo2'})
|
||||
self.net.session.request.assert_called_with(
|
||||
'GET', 'http://example.com/',
|
||||
verify=mock.ANY, headers={'User-Agent': 'foo2'})
|
||||
|
||||
def test_del(self):
|
||||
sess = mock.MagicMock()
|
||||
self.net.session = sess
|
||||
del self.net
|
||||
sess.close.assert_called_once_with()
|
||||
|
||||
@mock.patch('acme.client.requests')
|
||||
def test_requests_error_passthrough(self, mock_requests):
|
||||
mock_requests.exceptions = requests.exceptions
|
||||
mock_requests.request.side_effect = requests.exceptions.RequestException
|
||||
# pylint: disable=protected-access
|
||||
self.assertRaises(requests.exceptions.RequestException,
|
||||
self.net._send_request, 'GET', 'uri')
|
||||
|
||||
|
||||
class ClientNetworkWithMockedResponseTest(unittest.TestCase):
|
||||
"""Tests for acme.client.ClientNetwork which mock out response."""
|
||||
# pylint: disable=too-many-instance-attributes
|
||||
|
||||
def setUp(self):
|
||||
from acme.client import ClientNetwork
|
||||
self.net = ClientNetwork(key=None, alg=None)
|
||||
|
||||
self.response = mock.MagicMock(ok=True, status_code=http_client.OK)
|
||||
self.response.headers = {}
|
||||
self.response.links = {}
|
||||
self.checked_response = mock.MagicMock()
|
||||
self.obj = mock.MagicMock()
|
||||
self.wrapped_obj = mock.MagicMock()
|
||||
self.content_type = mock.sentinel.content_type
|
||||
|
||||
self.all_nonces = [jose.b64encode(b'Nonce'), jose.b64encode(b'Nonce2')]
|
||||
self.available_nonces = self.all_nonces[:]
|
||||
|
||||
def send_request(*args, **kwargs):
|
||||
# pylint: disable=unused-argument,missing-docstring
|
||||
if self.available_nonces:
|
||||
self.response.headers = {
|
||||
self.net.REPLAY_NONCE_HEADER:
|
||||
self.available_nonces.pop().decode()}
|
||||
else:
|
||||
self.response.headers = {}
|
||||
return self.response
|
||||
|
||||
# pylint: disable=protected-access
|
||||
self.net._send_request = self.send_request = mock.MagicMock(
|
||||
side_effect=send_request)
|
||||
self.net._check_response = self.check_response
|
||||
self.net._wrap_in_jws = mock.MagicMock(return_value=self.wrapped_obj)
|
||||
|
||||
def check_response(self, response, content_type):
|
||||
# pylint: disable=missing-docstring
|
||||
self.assertEqual(self.response, response)
|
||||
self.assertEqual(self.content_type, content_type)
|
||||
return self.checked_response
|
||||
|
||||
def test_head(self):
|
||||
self.assertEqual(self.response, self.net.head(
|
||||
'http://example.com/', 'foo', bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'HEAD', 'http://example.com/', 'foo', bar='baz')
|
||||
|
||||
def test_get(self):
|
||||
self.assertEqual(self.checked_response, self.net.get(
|
||||
'http://example.com/', content_type=self.content_type, bar='baz'))
|
||||
self.send_request.assert_called_once_with(
|
||||
'GET', 'http://example.com/', bar='baz')
|
||||
|
||||
def test_post(self):
|
||||
# pylint: disable=protected-access
|
||||
self.assertEqual(self.checked_response, self.net.post(
|
||||
'uri', self.obj, content_type=self.content_type))
|
||||
self.net._wrap_in_jws.assert_called_once_with(
|
||||
self.obj, jose.b64decode(self.all_nonces.pop()))
|
||||
|
||||
assert not self.available_nonces
|
||||
self.assertRaises(errors.MissingNonce, self.net.post,
|
||||
'uri', self.obj, content_type=self.content_type)
|
||||
self.net._wrap_in_jws.assert_called_with(
|
||||
self.obj, jose.b64decode(self.all_nonces.pop()))
|
||||
|
||||
def test_post_wrong_initial_nonce(self): # HEAD
|
||||
self.available_nonces = [b'f', jose.b64encode(b'good')]
|
||||
self.assertRaises(errors.BadNonce, self.net.post, 'uri',
|
||||
self.obj, content_type=self.content_type)
|
||||
|
||||
def test_post_wrong_post_response_nonce(self):
|
||||
self.available_nonces = [jose.b64encode(b'good'), b'f']
|
||||
self.assertRaises(errors.BadNonce, self.net.post, 'uri',
|
||||
self.obj, content_type=self.content_type)
|
||||
|
||||
def test_head_get_post_error_passthrough(self):
|
||||
self.send_request.side_effect = requests.exceptions.RequestException
|
||||
for method in self.net.head, self.net.get:
|
||||
self.assertRaises(
|
||||
requests.exceptions.RequestException, method, 'GET', 'uri')
|
||||
self.assertRaises(requests.exceptions.RequestException,
|
||||
self.net.post, 'uri', obj=self.obj)
|
||||
|
||||
|
||||
if __name__ == '__main__':
|
||||
unittest.main() # pragma: no cover
|
||||
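The ClientNetworkWithMockedResponseTest cases above revolve around replay-nonce bookkeeping: each response may carry a Replay-Nonce header, the client stores the decoded nonce, and every signed POST consumes one (raising MissingNonce when the pool is empty). A stdlib-only sketch of that bookkeeping; `NonceStore`, `b64url_decode`, and the `LookupError` stand-in are illustrative names, not the acme API:

```python
import base64

REPLAY_NONCE_HEADER = 'Replay-Nonce'


def b64url_decode(data):
    """Decode unpadded base64url, as used for ACME nonces."""
    pad = '=' * (-len(data) % 4)
    return base64.urlsafe_b64decode(data + pad)


class NonceStore(object):
    """Minimal stand-in for the nonce pool the mocked send_request feeds."""

    def __init__(self):
        self._nonces = []

    def observe(self, headers):
        """Record the nonce from a response's headers, if present."""
        nonce = headers.get(REPLAY_NONCE_HEADER)
        if nonce is not None:
            self._nonces.append(b64url_decode(nonce))

    def pop(self):
        if not self._nonces:
            # acme raises errors.MissingNonce here
            raise LookupError('missing nonce')
        return self._nonces.pop()


store = NonceStore()
store.observe({REPLAY_NONCE_HEADER: 'Tm9uY2U'})  # base64url of b'Nonce'
assert store.pop() == b'Nonce'
```

The tests drive exactly this cycle: the mocked `send_request` plants one nonce per response, and `test_post` asserts that a second POST with an exhausted pool fails.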
@@ -2,33 +2,31 @@
 import binascii
 import contextlib
 import logging
-import os
 import re
 import socket
+import sys
 
-import josepy as jose
-from OpenSSL import crypto
-from OpenSSL import SSL  # type: ignore # https://github.com/python/typeshed/issues/2052
+import OpenSSL
 
 from acme import errors
-from acme.magic_typing import Callable  # pylint: disable=unused-import, no-name-in-module
-from acme.magic_typing import Optional  # pylint: disable=unused-import, no-name-in-module
-from acme.magic_typing import Tuple  # pylint: disable=unused-import, no-name-in-module
-from acme.magic_typing import Union  # pylint: disable=unused-import, no-name-in-module
 
 
 logger = logging.getLogger(__name__)
 
-# Default SSL method selected here is the most compatible, while secure
-# SSL method: TLSv1_METHOD is only compatible with
+# TLSSNI01 certificate serving and probing is not affected by SSL
+# vulnerabilities: prober needs to check certificate for expected
+# contents anyway. Working SNI is the only thing that's necessary for
+# the challenge and thus scoping down SSL/TLS method (version) would
+# cause interoperability issues: TLSv1_METHOD is only compatible with
 # TLSv1_METHOD, while SSLv23_METHOD is compatible with all other
 # methods, including TLSv2_METHOD (read more at
 # https://www.openssl.org/docs/ssl/SSLv23_method.html). _serve_sni
 # should be changed to use "set_options" to disable SSLv2 and SSLv3,
 # in case it's used for things other than probing/serving!
-_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD  # type: ignore
+_DEFAULT_TLSSNI01_SSL_METHOD = OpenSSL.SSL.SSLv23_METHOD
 
 
-class SSLSocket(object):
+class SSLSocket(object):  # pylint: disable=too-few-public-methods
     """SSL wrapper for sockets.
 
     :ivar socket sock: Original wrapped socket.
@@ -37,7 +35,7 @@ class SSLSocket(object):
     :ivar method: See `OpenSSL.SSL.Context` for allowed values.
 
     """
-    def __init__(self, sock, certs, method=_DEFAULT_SSL_METHOD):
+    def __init__(self, sock, certs, method=_DEFAULT_TLSSNI01_SSL_METHOD):
         self.sock = sock
         self.certs = certs
         self.method = method
@@ -64,9 +62,7 @@ class SSLSocket(object):
             logger.debug("Server name (%s) not recognized, dropping SSL",
                          server_name)
             return
-        new_context = SSL.Context(self.method)
-        new_context.set_options(SSL.OP_NO_SSLv2)
-        new_context.set_options(SSL.OP_NO_SSLv3)
+        new_context = OpenSSL.SSL.Context(self.method)
         new_context.use_privatekey(key)
         new_context.use_certificate(cert)
         connection.set_context(new_context)
@@ -74,7 +70,7 @@ class SSLSocket(object):
     class FakeConnection(object):
         """Fake OpenSSL.SSL.Connection."""
 
-        # pylint: disable=missing-docstring
+        # pylint: disable=too-few-public-methods,missing-docstring
 
         def __init__(self, connection):
             self._wrapped = connection
@@ -89,18 +85,16 @@ class SSLSocket(object):
     def accept(self):  # pylint: disable=missing-docstring
         sock, addr = self.sock.accept()
 
-        context = SSL.Context(self.method)
-        context.set_options(SSL.OP_NO_SSLv2)
-        context.set_options(SSL.OP_NO_SSLv3)
+        context = OpenSSL.SSL.Context(self.method)
         context.set_tlsext_servername_callback(self._pick_certificate_cb)
 
-        ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
+        ssl_sock = self.FakeConnection(OpenSSL.SSL.Connection(context, sock))
         ssl_sock.set_accept_state()
 
         logger.debug("Performing handshake with %s", addr)
        try:
            ssl_sock.do_handshake()
-        except SSL.Error as error:
+        except OpenSSL.SSL.Error as error:
            # _pick_certificate_cb might have returned without
            # creating SSL context (wrong server name)
            raise socket.error(error)
@@ -109,7 +103,7 @@ class SSLSocket(object):
 
 
 def probe_sni(name, host, port=443, timeout=300,
-              method=_DEFAULT_SSL_METHOD, source_address=('', 0)):
+              method=_DEFAULT_TLSSNI01_SSL_METHOD, source_address=('0', 0)):
     """Probe SNI server for SSL certificate.
 
     :param bytes name: Byte string to send as the server name in the
@@ -128,73 +122,29 @@ def probe_sni(name, host, port=443, timeout=300,
     :rtype: OpenSSL.crypto.X509
 
     """
-    context = SSL.Context(method)
+    context = OpenSSL.SSL.Context(method)
     context.set_timeout(timeout)
 
-    socket_kwargs = {'source_address': source_address}
+    socket_kwargs = {} if sys.version_info < (2, 7) else {
+        'source_address': source_address}
 
     try:
-        logger.debug(
-            "Attempting to connect to %s:%d%s.", host, port,
-            " from {0}:{1}".format(
-                source_address[0],
-                source_address[1]
-            ) if socket_kwargs else ""
-        )
-        socket_tuple = (host, port)  # type: Tuple[str, int]
-        sock = socket.create_connection(socket_tuple, **socket_kwargs)  # type: ignore
+        # pylint: disable=star-args
+        sock = socket.create_connection((host, port), **socket_kwargs)
     except socket.error as error:
         raise errors.Error(error)
 
     with contextlib.closing(sock) as client:
-        client_ssl = SSL.Connection(context, client)
+        client_ssl = OpenSSL.SSL.Connection(context, client)
         client_ssl.set_connect_state()
         client_ssl.set_tlsext_host_name(name)  # pyOpenSSL>=0.13
         try:
             client_ssl.do_handshake()
             client_ssl.shutdown()
-        except SSL.Error as error:
+        except OpenSSL.SSL.Error as error:
             raise errors.Error(error)
     return client_ssl.get_peer_certificate()
-
-
-def make_csr(private_key_pem, domains, must_staple=False):
-    """Generate a CSR containing a list of domains as subjectAltNames.
-
-    :param buffer private_key_pem: Private key, in PEM PKCS#8 format.
-    :param list domains: List of DNS names to include in subjectAltNames of CSR.
-    :param bool must_staple: Whether to include the TLS Feature extension (aka
-        OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
-    :returns: buffer PEM-encoded Certificate Signing Request.
-    """
-    private_key = crypto.load_privatekey(
-        crypto.FILETYPE_PEM, private_key_pem)
-    csr = crypto.X509Req()
-    extensions = [
-        crypto.X509Extension(
-            b'subjectAltName',
-            critical=False,
-            value=', '.join('DNS:' + d for d in domains).encode('ascii')
-        ),
-    ]
-    if must_staple:
-        extensions.append(crypto.X509Extension(
-            b"1.3.6.1.5.5.7.1.24",
-            critical=False,
-            value=b"DER:30:03:02:01:05"))
-    csr.add_extensions(extensions)
-    csr.set_pubkey(private_key)
-    csr.set_version(2)
-    csr.sign(private_key, 'sha256')
-    return crypto.dump_certificate_request(
-        crypto.FILETYPE_PEM, csr)
-
-
-def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req):
-    common_name = loaded_cert_or_req.get_subject().CN
-    sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
-
-    if common_name is None:
-        return sans
-    return [common_name] + [d for d in sans if d != common_name]
-
 
 def _pyopenssl_cert_or_req_san(cert_or_req):
     """Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
@@ -222,15 +172,14 @@ def _pyopenssl_cert_or_req_san(cert_or_req):
     parts_separator = ", "
     prefix = "DNS" + part_separator
 
-    if isinstance(cert_or_req, crypto.X509):
-        # pylint: disable=line-too-long
-        func = crypto.dump_certificate  # type: Union[Callable[[int, crypto.X509Req], bytes], Callable[[int, crypto.X509], bytes]]
+    if isinstance(cert_or_req, OpenSSL.crypto.X509):
+        func = OpenSSL.crypto.dump_certificate
     else:
-        func = crypto.dump_certificate_request
-    text = func(crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
+        func = OpenSSL.crypto.dump_certificate_request
+    text = func(OpenSSL.crypto.FILETYPE_TEXT, cert_or_req).decode("utf-8")
-    # WARNING: this function does not support multiple SANs extensions.
-    # Multiple X509v3 extensions of the same type is disallowed by RFC 5280.
-    match = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
+    match = re.search(r"X509v3 Subject Alternative Name:\s*(.*)", text)
     # WARNING: this function assumes that no SAN can include
     # parts_separator, hence the split!
     sans_parts = [] if match is None else match.group(1).split(parts_separator)
@@ -254,12 +203,12 @@ def gen_ss_cert(key, domains, not_before=None,
 
     """
     assert domains, "Must provide one or more hostnames for the cert."
-    cert = crypto.X509()
-    cert.set_serial_number(int(binascii.hexlify(os.urandom(16)), 16))
+    cert = OpenSSL.crypto.X509()
+    cert.set_serial_number(int(binascii.hexlify(OpenSSL.rand.bytes(16)), 16))
     cert.set_version(2)
 
     extensions = [
-        crypto.X509Extension(
+        OpenSSL.crypto.X509Extension(
             b"basicConstraints", True, b"CA:TRUE, pathlen:0"),
     ]
 
@@ -268,7 +217,7 @@ def gen_ss_cert(key, domains, not_before=None,
     cert.set_issuer(cert.get_subject())
 
     if force_san or len(domains) > 1:
-        extensions.append(crypto.X509Extension(
+        extensions.append(OpenSSL.crypto.X509Extension(
             b"subjectAltName",
             critical=False,
             value=b", ".join(b"DNS:" + d.encode() for d in domains)
@@ -282,26 +231,3 @@ def gen_ss_cert(key, domains, not_before=None,
     cert.set_pubkey(key)
     cert.sign(key, "sha256")
     return cert
-
-
-def dump_pyopenssl_chain(chain, filetype=crypto.FILETYPE_PEM):
-    """Dump certificate chain into a bundle.
-
-    :param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
-        :class:`josepy.util.ComparableX509`).
-
-    :returns: certificate chain bundle
-    :rtype: bytes
-
-    """
-    # XXX: returns empty string when no chain is available, which
-    # shuts up RenewableCert, but might not be the best solution...
-
-    def _dump_cert(cert):
-        if isinstance(cert, jose.ComparableX509):
-            # pylint: disable=protected-access
-            cert = cert.wrapped
-        return crypto.dump_certificate(filetype, cert)
-
-    # assumes that OpenSSL.crypto.dump_certificate includes ending
-    # newline character
-    return b"".join(_dump_cert(cert) for cert in chain)
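The `_pyopenssl_cert_or_req_san` helper in the diff above extracts SANs by dumping the certificate (or CSR) to openssl-style text and scanning it with a regex. The snippet below runs the v0.9.3 side of that logic on a fabricated text dump; the sample text and the `part_separator = ":"` value are assumptions for illustration (only `parts_separator` appears in the excerpt):

```python
import re

# Fabricated fragment of `openssl x509 -text`-style output.
text = """
        X509v3 extensions:
            X509v3 Subject Alternative Name:
                DNS:example.com, DNS:www.example.com
"""

part_separator = ":"      # assumed; splits "DNS:example.com" into type and name
parts_separator = ", "    # separates individual SAN entries
prefix = "DNS" + part_separator

# The v0.9.3 regex from the diff: grab everything after the extension header.
match = re.search(r"X509v3 Subject Alternative Name:\s*(.*)", text)
sans_parts = [] if match is None else match.group(1).split(parts_separator)

# Keep only DNS entries and strip the "DNS:" prefix.
names = [part.split(part_separator)[1] for part in sans_parts
         if part.startswith(prefix)]
assert names == ['example.com', 'www.example.com']
```

As the in-code WARNING notes, this text-scraping approach assumes no SAN contains `", "` and handles only a single SAN extension; the newer side of the diff merely hardens the regex against a `critical` marker.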
acme/acme/crypto_util_test.py (new file, 150 lines)
@@ -0,0 +1,150 @@
"""Tests for acme.crypto_util."""
import itertools
import socket
import threading
import time
import unittest

import six
from six.moves import socketserver  # pylint: disable=import-error

import OpenSSL

from acme import errors
from acme import jose
from acme import test_util


class SSLSocketAndProbeSNITest(unittest.TestCase):
    """Tests for acme.crypto_util.SSLSocket/probe_sni."""

    def setUp(self):
        self.cert = test_util.load_comparable_cert('cert.pem')
        key = test_util.load_pyopenssl_private_key('rsa512_key.pem')
        # pylint: disable=protected-access
        certs = {b'foo': (key, self.cert.wrapped)}

        from acme.crypto_util import SSLSocket

        class _TestServer(socketserver.TCPServer):

            # pylint: disable=too-few-public-methods
            # six.moves.* | pylint: disable=attribute-defined-outside-init,no-init

            def server_bind(self):  # pylint: disable=missing-docstring
                self.socket = SSLSocket(socket.socket(), certs=certs)
                socketserver.TCPServer.server_bind(self)

        self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
        self.port = self.server.socket.getsockname()[1]
        self.server_thread = threading.Thread(
            # pylint: disable=no-member
            target=self.server.handle_request)
        self.server_thread.start()
        time.sleep(1)  # TODO: avoid race conditions in other way

    def tearDown(self):
        self.server_thread.join()

    def _probe(self, name):
        from acme.crypto_util import probe_sni
        return jose.ComparableX509(probe_sni(
            name, host='127.0.0.1', port=self.port))

    def test_probe_ok(self):
        self.assertEqual(self.cert, self._probe(b'foo'))

    def test_probe_not_recognized_name(self):
        self.assertRaises(errors.Error, self._probe, b'bar')

    # TODO: py33/py34 tox hangs forever on do_handshake in second probe
    #def probe_connection_error(self):
    #    self._probe(b'foo')
    #    #time.sleep(1)  # TODO: avoid race conditions in other way
    #    self.assertRaises(errors.Error, self._probe, b'bar')


class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
    """Test for acme.crypto_util._pyopenssl_cert_or_req_san."""

    @classmethod
    def _call(cls, loader, name):
        # pylint: disable=protected-access
        from acme.crypto_util import _pyopenssl_cert_or_req_san
        return _pyopenssl_cert_or_req_san(loader(name))

    @classmethod
    def _get_idn_names(cls):
        """Returns expected names from '{cert,csr}-idnsans.pem'."""
        chars = [six.unichr(i) for i in itertools.chain(range(0x3c3, 0x400),
                                                        range(0x641, 0x6fc),
                                                        range(0x1820, 0x1877))]
        return [''.join(chars[i: i + 45]) + '.invalid'
                for i in range(0, len(chars), 45)]

    def _call_cert(self, name):
        return self._call(test_util.load_cert, name)

    def _call_csr(self, name):
        return self._call(test_util.load_csr, name)

    def test_cert_no_sans(self):
        self.assertEqual(self._call_cert('cert.pem'), [])

    def test_cert_two_sans(self):
        self.assertEqual(self._call_cert('cert-san.pem'),
                         ['example.com', 'www.example.com'])

    def test_cert_hundred_sans(self):
        self.assertEqual(self._call_cert('cert-100sans.pem'),
                         ['example{0}.com'.format(i) for i in range(1, 101)])

    def test_cert_idn_sans(self):
        self.assertEqual(self._call_cert('cert-idnsans.pem'),
                         self._get_idn_names())

    def test_csr_no_sans(self):
        self.assertEqual(self._call_csr('csr-nosans.pem'), [])

    def test_csr_one_san(self):
        self.assertEqual(self._call_csr('csr.pem'), ['example.com'])

    def test_csr_two_sans(self):
        self.assertEqual(self._call_csr('csr-san.pem'),
                         ['example.com', 'www.example.com'])

    def test_csr_six_sans(self):
        self.assertEqual(self._call_csr('csr-6sans.pem'),
                         ['example.com', 'example.org', 'example.net',
                          'example.info', 'subdomain.example.com',
                          'other.subdomain.example.com'])

    def test_csr_hundred_sans(self):
        self.assertEqual(self._call_csr('csr-100sans.pem'),
                         ['example{0}.com'.format(i) for i in range(1, 101)])

    def test_csr_idn_sans(self):
        self.assertEqual(self._call_csr('csr-idnsans.pem'),
                         self._get_idn_names())


class RandomSnTest(unittest.TestCase):
    """Test for random certificate serial numbers."""

    def setUp(self):
        self.cert_count = 5
        self.serial_num = []
        self.key = OpenSSL.crypto.PKey()
        self.key.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)

    def test_sn_collisions(self):
        from acme.crypto_util import gen_ss_cert

        for _ in range(self.cert_count):
            cert = gen_ss_cert(self.key, ['dummy'], force_san=True)
            self.serial_num.append(cert.get_serial_number())
        self.assertTrue(len(set(self.serial_num)) > 1)


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
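RandomSnTest above only asserts that repeated gen_ss_cert calls yield more than one distinct serial number. The mechanism on the newer side of the diff is simply a fresh random 128-bit integer per certificate; a minimal stdlib sketch of that idea (`random_serial` is an illustrative name, not the acme API):

```python
import binascii
import os


def random_serial():
    """128-bit random serial, mirroring cert.set_serial_number(...) above."""
    return int(binascii.hexlify(os.urandom(16)), 16)


serials = {random_serial() for _ in range(5)}
assert len(serials) == 5              # collisions are astronomically unlikely
assert all(0 <= s < 2 ** 128 for s in serials)
```

Random serials matter for CA-signed certificates because predictable serials enabled historical chosen-prefix collision attacks against certificate signatures; for self-signed test certs they mainly keep repeated runs distinguishable.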
acme/acme/dns_resolver.py (new file, 45 lines)
@@ -0,0 +1,45 @@
"""DNS Resolver for ACME client.

Required only for local validation of 'dns-01' challenges.
"""
import logging

from acme import errors
from acme import util

DNS_REQUIREMENT = 'dnspython>=1.12'

try:
    util.activate(DNS_REQUIREMENT)
    # pragma: no cover
    import dns.exception
    import dns.resolver
    DNS_AVAILABLE = True
except errors.DependencyError:  # pragma: no cover
    DNS_AVAILABLE = False


logger = logging.getLogger(__name__)


def txt_records_for_name(name):
    """Resolve the name and return the TXT records.

    :param unicode name: Domain name being verified.

    :returns: A list of txt records, if empty the name could not be resolved
    :rtype: list of unicode

    """
    if not DNS_AVAILABLE:
        raise errors.DependencyError(
            '{0} is required to use this function'.format(DNS_REQUIREMENT))
    try:
        dns_response = dns.resolver.query(name, 'TXT')
    except dns.resolver.NXDOMAIN:
        return []
    except dns.exception.DNSException as error:
        logger.error("Error resolving %s: %s", name, str(error))
        return []

    return [txt_rec.decode("utf-8") for rdata in dns_response
            for txt_rec in rdata.strings]
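The final comprehension in txt_records_for_name flattens a two-level structure: each answer rdata can hold several character-strings, and all of them are merged into one flat list. A small stand-in shows the flattening; `FakeRdata` is a hypothetical substitute for dnspython's rdata objects, exposing only the `strings` attribute the comprehension touches:

```python
class FakeRdata(object):
    """Stand-in for a dnspython TXT rdata: a list of raw byte strings."""

    def __init__(self, strings):
        self.strings = strings


# Two answer records, the second carrying two character-strings.
dns_response = [FakeRdata([b'token-1']),
                FakeRdata([b'token-2', b'token-3'])]

records = [txt_rec.decode('utf-8') for rdata in dns_response
           for txt_rec in rdata.strings]
assert records == ['token-1', 'token-2', 'token-3']
```

Flattening matters here because a single TXT record longer than 255 bytes is transmitted as multiple character-strings, and dns-01 validation needs to compare against each candidate value.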
acme/acme/dns_resolver_test.py (new file, 77 lines)
@@ -0,0 +1,77 @@
"""Tests for acme.dns_resolver."""
import unittest

import mock
from six.moves import reload_module  # pylint: disable=import-error

from acme import errors
from acme import test_util
from acme.dns_resolver import DNS_REQUIREMENT


if test_util.requirement_available(DNS_REQUIREMENT):
    import dns


def create_txt_response(name, txt_records):
    """
    Returns an RRset containing the 'txt_records' as the result of a DNS
    query for 'name'.

    This takes advantage of the fact that an Answer object mostly behaves
    like an RRset.
    """
    return dns.rrset.from_text_list(name, 60, "IN", "TXT", txt_records)


class TxtRecordsForNameTest(unittest.TestCase):
    """Tests for acme.dns_resolver.txt_records_for_name."""

    @classmethod
    def _call(cls, *args, **kwargs):
        from acme.dns_resolver import txt_records_for_name
        return txt_records_for_name(*args, **kwargs)


@test_util.skip_unless(test_util.requirement_available(DNS_REQUIREMENT),
                       "optional dependency dnspython is not available")
class TxtRecordsForNameWithDnsTest(TxtRecordsForNameTest):
    """Tests for acme.dns_resolver.txt_records_for_name with dns."""

    @mock.patch("acme.dns_resolver.dns.resolver.query")
    def test_txt_records_for_name_with_single_response(self, mock_dns):
        mock_dns.return_value = create_txt_response('name', ['response'])
        self.assertEqual(['response'], self._call('name'))

    @mock.patch("acme.dns_resolver.dns.resolver.query")
    def test_txt_records_for_name_with_multiple_responses(self, mock_dns):
        mock_dns.return_value = create_txt_response(
            'name', ['response1', 'response2'])
        self.assertEqual(['response1', 'response2'], self._call('name'))

    @mock.patch("acme.dns_resolver.dns.resolver.query")
    def test_txt_records_for_name_domain_not_found(self, mock_dns):
        mock_dns.side_effect = dns.resolver.NXDOMAIN
        self.assertEqual([], self._call('name'))

    @mock.patch("acme.dns_resolver.dns.resolver.query")
    def test_txt_records_for_name_domain_other_error(self, mock_dns):
        mock_dns.side_effect = dns.exception.DNSException
        self.assertEqual([], self._call('name'))


class TxtRecordsForNameWithoutDnsTest(TxtRecordsForNameTest):
    """Tests for acme.dns_resolver.txt_records_for_name without dns."""

    def setUp(self):
        from acme import dns_resolver
        dns_resolver.DNS_AVAILABLE = False

    def tearDown(self):
        from acme import dns_resolver
        reload_module(dns_resolver)

    def test_exception_raised(self):
        self.assertRaises(
            errors.DependencyError, self._call, "example.org")


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
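The WithDns/WithoutDns split above hinges on the module-level DNS_AVAILABLE flag that dns_resolver.py sets at import time. That optional-dependency pattern can be sketched in isolation; `_no_such_optional_dep` is a deliberately missing stand-in module and `txt_lookup` a hypothetical consumer, not acme names:

```python
# Optional-dependency guard: try the import once at module load, record
# a flag, and fail loudly at call time instead of at import time.
try:
    import _no_such_optional_dep  # stand-in; this import always fails
    DEP_AVAILABLE = True
except ImportError:
    DEP_AVAILABLE = False


def txt_lookup(name):
    """Hypothetical consumer that needs the optional dependency."""
    if not DEP_AVAILABLE:
        raise RuntimeError(
            'optional dependency is required to use txt_lookup()')
    return _no_such_optional_dep.query(name)  # unreachable in this sketch


try:
    txt_lookup('example.org')
except RuntimeError:
    pass  # a clear error, rather than a NameError deep in the call stack
```

The tests exercise both branches of this guard: the WithDns class mocks the resolver behind the flag, while the WithoutDns class forces the flag to False and uses reload_module in tearDown to restore the real import-time state.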
@@ -1,5 +1,5 @@
 """ACME errors."""
-from josepy import errors as jose_errors
+from acme.jose import errors as jose_errors
 
 
 class Error(Exception):
@@ -29,12 +29,7 @@ class NonceError(ClientError):
 class BadNonce(NonceError):
     """Bad nonce error."""
     def __init__(self, nonce, error, *args, **kwargs):
-        # MyPy complains here that there is too many arguments for BaseException constructor.
-        # This is an error fixed in typeshed, see https://github.com/python/mypy/issues/4183
-        # The fix is included in MyPy>=0.740, but upgrading it would bring dozen of errors due to
-        # new types definitions. So we ignore the error until the code base is fixed to match
-        # with MyPy>=0.740 referential.
-        super(BadNonce, self).__init__(*args, **kwargs)  # type: ignore
+        super(BadNonce, self).__init__(*args, **kwargs)
         self.nonce = nonce
         self.error = error
 
@@ -53,8 +48,7 @@ class MissingNonce(NonceError):
 
     """
     def __init__(self, response, *args, **kwargs):
-        # See comment in BadNonce constructor above for an explanation of type: ignore here.
-        super(MissingNonce, self).__init__(*args, **kwargs)  # type: ignore
+        super(MissingNonce, self).__init__(*args, **kwargs)
         self.response = response
 
     def __str__(self):
@@ -88,45 +82,3 @@ class PollError(ClientError):
     def __repr__(self):
         return '{0}(exhausted={1!r}, updated={2!r})'.format(
             self.__class__.__name__, self.exhausted, self.updated)
-
-
-class ValidationError(Error):
-    """Error for authorization failures. Contains a list of authorization
-    resources, each of which is invalid and should have an error field.
-    """
-    def __init__(self, failed_authzrs):
-        self.failed_authzrs = failed_authzrs
-        super(ValidationError, self).__init__()
-
-
-class TimeoutError(Error):  # pylint: disable=redefined-builtin
-    """Error for when polling an authorization or an order times out."""
-
-
-class IssuanceError(Error):
-    """Error sent by the server after requesting issuance of a certificate."""
-
-    def __init__(self, error):
-        """Initialize.
-
-        :param messages.Error error: The error provided by the server.
-        """
-        self.error = error
-        super(IssuanceError, self).__init__()
-
-
-class ConflictError(ClientError):
-    """Error for when the server returns a 409 (Conflict) HTTP status.
-
-    In the version of ACME implemented by Boulder, this is used to find an
-    account if you only have the private key, but don't know the account URL.
-
-    Also used in V2 of the ACME client for the same purpose.
-    """
-    def __init__(self, location):
-        self.location = location
-        super(ConflictError, self).__init__()
-
-
-class WildcardUnsupportedError(Error):
-    """Error for when a wildcard is requested but is unsupported by ACME CA."""
@@ -1,9 +1,11 @@
 """ACME JSON fields."""
 import logging

-import josepy as jose
 import pyrfc3339

+from acme import jose
+

 logger = logging.getLogger(__name__)
@@ -2,9 +2,10 @@
 import datetime
 import unittest

-import josepy as jose
 import pytz

+from acme import jose
+

 class FixedTest(unittest.TestCase):
     """Tests for acme.fields.Fixed."""
82 acme/acme/jose/__init__.py Normal file
@@ -0,0 +1,82 @@
"""Javascript Object Signing and Encryption (jose).

This package is a Python implementation of the standards developed by
IETF `Javascript Object Signing and Encryption (Active WG)`_, in
particular the following RFCs:

- `JSON Web Algorithms (JWA)`_
- `JSON Web Key (JWK)`_
- `JSON Web Signature (JWS)`_


.. _`Javascript Object Signing and Encryption (Active WG)`:
  https://tools.ietf.org/wg/jose/

.. _`JSON Web Algorithms (JWA)`:
  https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-algorithms/

.. _`JSON Web Key (JWK)`:
  https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-key/

.. _`JSON Web Signature (JWS)`:
  https://datatracker.ietf.org/doc/draft-ietf-jose-json-web-signature/

"""
from acme.jose.b64 import (
    b64decode,
    b64encode,
)

from acme.jose.errors import (
    DeserializationError,
    SerializationError,
    Error,
    UnrecognizedTypeError,
)

from acme.jose.interfaces import JSONDeSerializable

from acme.jose.json_util import (
    Field,
    JSONObjectWithFields,
    TypedJSONObjectWithFields,
    decode_b64jose,
    decode_cert,
    decode_csr,
    decode_hex16,
    encode_b64jose,
    encode_cert,
    encode_csr,
    encode_hex16,
)

from acme.jose.jwa import (
    HS256,
    HS384,
    HS512,
    JWASignature,
    PS256,
    PS384,
    PS512,
    RS256,
    RS384,
    RS512,
)

from acme.jose.jwk import (
    JWK,
    JWKRSA,
)

from acme.jose.jws import (
    Header,
    JWS,
    Signature,
)

from acme.jose.util import (
    ComparableX509,
    ComparableKey,
    ComparableRSAKey,
    ImmutableMap,
)
61 acme/acme/jose/b64.py Normal file
@@ -0,0 +1,61 @@
"""JOSE Base64.

`JOSE Base64`_ is defined as:

- URL-safe Base64
- padding stripped


.. _`JOSE Base64`:
   https://tools.ietf.org/html/draft-ietf-jose-json-web-signature-37#appendix-C

.. Do NOT try to call this module "base64", as it will "shadow" the
   standard library.

"""
import base64

import six


def b64encode(data):
    """JOSE Base64 encode.

    :param data: Data to be encoded.
    :type data: `bytes`

    :returns: JOSE Base64 string.
    :rtype: bytes

    :raises TypeError: if `data` is of incorrect type

    """
    if not isinstance(data, six.binary_type):
        raise TypeError('argument should be {0}'.format(six.binary_type))
    return base64.urlsafe_b64encode(data).rstrip(b'=')


def b64decode(data):
    """JOSE Base64 decode.

    :param data: Base64 string to be decoded. If it's unicode, then
        only ASCII characters are allowed.
    :type data: `bytes` or `unicode`

    :returns: Decoded data.
    :rtype: bytes

    :raises TypeError: if input is of incorrect type
    :raises ValueError: if input is unicode with non-ASCII characters

    """
    if isinstance(data, six.string_types):
        try:
            data = data.encode('ascii')
        except UnicodeEncodeError:
            raise ValueError(
                'unicode argument should contain only ASCII characters')
    elif not isinstance(data, six.binary_type):
        raise TypeError('argument should be a str or unicode')

    return base64.urlsafe_b64decode(data + b'=' * (4 - (len(data) % 4)))
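The padding arithmetic above can be sketched standalone with only the standard library; the helper names below are hypothetical, not part of `acme.jose`, and the sample string comes from the padding table in `b64_test.py`:

```python
import base64


def jose_b64encode(data: bytes) -> bytes:
    # URL-safe Base64 with the trailing '=' padding stripped,
    # mirroring b64encode above.
    return base64.urlsafe_b64encode(data).rstrip(b"=")


def jose_b64decode(data: bytes) -> bytes:
    # Re-append the stripped padding: the encoded length mod 4 says
    # how many '=' characters are missing (0 when already aligned).
    return base64.urlsafe_b64decode(data + b"=" * (-len(data) % 4))


print(jose_b64encode(b"any carnal pleasure"))   # YW55IGNhcm5hbCBwbGVhc3VyZQ
print(jose_b64decode(b"YW55IGNhcm5hbCBwbGVhc3VyZQ"))
```

Note the `-len(data) % 4` variant adds zero padding when the input is already aligned; the module's `4 - (len(data) % 4)` form instead appends four extra `=`, which `base64.urlsafe_b64decode` tolerates.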
77 acme/acme/jose/b64_test.py Normal file
@@ -0,0 +1,77 @@
"""Tests for acme.jose.b64."""
import unittest

import six


# https://en.wikipedia.org/wiki/Base64#Examples
B64_PADDING_EXAMPLES = {
    b'any carnal pleasure.': (b'YW55IGNhcm5hbCBwbGVhc3VyZS4', b'='),
    b'any carnal pleasure': (b'YW55IGNhcm5hbCBwbGVhc3VyZQ', b'=='),
    b'any carnal pleasur': (b'YW55IGNhcm5hbCBwbGVhc3Vy', b''),
    b'any carnal pleasu': (b'YW55IGNhcm5hbCBwbGVhc3U', b'='),
    b'any carnal pleas': (b'YW55IGNhcm5hbCBwbGVhcw', b'=='),
}


B64_URL_UNSAFE_EXAMPLES = {
    six.int2byte(251) + six.int2byte(239): b'--8',
    six.int2byte(255) * 2: b'__8',
}


class B64EncodeTest(unittest.TestCase):
    """Tests for acme.jose.b64.b64encode."""

    @classmethod
    def _call(cls, data):
        from acme.jose.b64 import b64encode
        return b64encode(data)

    def test_empty(self):
        self.assertEqual(self._call(b''), b'')

    def test_unsafe_url(self):
        for text, b64 in six.iteritems(B64_URL_UNSAFE_EXAMPLES):
            self.assertEqual(self._call(text), b64)

    def test_different_paddings(self):
        for text, (b64, _) in six.iteritems(B64_PADDING_EXAMPLES):
            self.assertEqual(self._call(text), b64)

    def test_unicode_fails_with_type_error(self):
        self.assertRaises(TypeError, self._call, u'some unicode')


class B64DecodeTest(unittest.TestCase):
    """Tests for acme.jose.b64.b64decode."""

    @classmethod
    def _call(cls, data):
        from acme.jose.b64 import b64decode
        return b64decode(data)

    def test_unsafe_url(self):
        for text, b64 in six.iteritems(B64_URL_UNSAFE_EXAMPLES):
            self.assertEqual(self._call(b64), text)

    def test_input_without_padding(self):
        for text, (b64, _) in six.iteritems(B64_PADDING_EXAMPLES):
            self.assertEqual(self._call(b64), text)

    def test_input_with_padding(self):
        for text, (b64, pad) in six.iteritems(B64_PADDING_EXAMPLES):
            self.assertEqual(self._call(b64 + pad), text)

    def test_unicode_with_ascii(self):
        self.assertEqual(self._call(u'YQ'), b'a')

    def test_non_ascii_unicode_fails(self):
        self.assertRaises(ValueError, self._call, u'\u0105')

    def test_type_error_no_unicode_or_bytes(self):
        self.assertRaises(TypeError, self._call, object())


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
35 acme/acme/jose/errors.py Normal file
@@ -0,0 +1,35 @@
"""JOSE errors."""


class Error(Exception):
    """Generic JOSE Error."""


class DeserializationError(Error):
    """JSON deserialization error."""

    def __str__(self):
        return "Deserialization error: {0}".format(
            super(DeserializationError, self).__str__())


class SerializationError(Error):
    """JSON serialization error."""


class UnrecognizedTypeError(DeserializationError):
    """Unrecognized type error.

    :ivar str typ: The unrecognized type of the JSON object.
    :ivar jobj: Full JSON object.

    """

    def __init__(self, typ, jobj):
        self.typ = typ
        self.jobj = jobj
        super(UnrecognizedTypeError, self).__init__(str(self))

    def __str__(self):
        return '{0} was not recognized, full message: {1}'.format(
            self.typ, self.jobj)
17 acme/acme/jose/errors_test.py Normal file
@@ -0,0 +1,17 @@
"""Tests for acme.jose.errors."""
import unittest


class UnrecognizedTypeErrorTest(unittest.TestCase):
    def setUp(self):
        from acme.jose.errors import UnrecognizedTypeError
        self.error = UnrecognizedTypeError('foo', {'type': 'foo'})

    def test_str(self):
        self.assertEqual(
            "foo was not recognized, full message: {'type': 'foo'}",
            str(self.error))


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
216 acme/acme/jose/interfaces.py Normal file
@@ -0,0 +1,216 @@
"""JOSE interfaces."""
import abc
import collections
import json

import six

from acme.jose import errors
from acme.jose import util

# pylint: disable=no-self-argument,no-method-argument,no-init,inherit-non-class
# pylint: disable=too-few-public-methods


@six.add_metaclass(abc.ABCMeta)
class JSONDeSerializable(object):
    # pylint: disable=too-few-public-methods
    """Interface for (de)serializable JSON objects.

    Please recall that the standard Python library implements
    :class:`json.JSONEncoder` and :class:`json.JSONDecoder`, which perform
    translations based on respective :ref:`conversion tables
    <conversion-table>` that look pretty much like the one below (for
    complete tables see the relevant Python documentation):

    .. _conversion-table:

    ====== ======
    JSON   Python
    ====== ======
    object dict
    ...    ...
    ====== ======

    While the above **conversion table** is about translation of JSON
    documents to/from the basic Python types only,
    :class:`JSONDeSerializable` introduces the following two concepts:

    serialization
        Turning an arbitrary Python object into a Python object that can
        be encoded into a JSON document. **Full serialization** produces
        a Python object composed of only basic types as required by the
        :ref:`conversion table <conversion-table>`. **Partial
        serialization** (accomplished by :meth:`to_partial_json`)
        produces a Python object that might also be built from other
        :class:`JSONDeSerializable` objects.

    deserialization
        Turning a decoded Python object (necessarily one of the basic
        types as required by the :ref:`conversion table
        <conversion-table>`) into an arbitrary Python object.

    Serialization produces a **serialized object** ("partially serialized
    object" or "fully serialized object" for partial and full
    serialization respectively) and deserialization produces a
    **deserialized object**, both usually denoted in the source code as
    ``jobj``.

    Wording in the official Python documentation might be confusing
    after reading the above, but in the light of those definitions, one
    can view :meth:`json.JSONDecoder.decode` as a decoder and
    deserializer of basic types, :meth:`json.JSONEncoder.default` as a
    serializer of basic types, and :meth:`json.JSONEncoder.encode` as a
    serializer and encoder of basic types.

    One could extend :mod:`json` to support arbitrary object
    (de)serialization either by:

    - overriding :meth:`json.JSONDecoder.decode` and
      :meth:`json.JSONEncoder.default` in subclasses

    - or passing the ``object_hook`` argument (or ``object_pairs_hook``)
      to :func:`json.load`/:func:`json.loads` or the ``default`` argument
      to :func:`json.dump`/:func:`json.dumps`.

    Interestingly, ``default`` is required to perform only partial
    serialization, as :func:`json.dumps` applies ``default``
    recursively. This is the idea behind making :meth:`to_partial_json`
    produce only a partial serialization, while providing a custom
    :meth:`json_dumps` that dumps with ``default`` set to
    :meth:`json_dump_default`.

    To make further documentation a bit more concrete, please consider
    the following imaginary implementation example::

        class Foo(JSONDeSerializable):
            def to_partial_json(self):
                return 'foo'

            @classmethod
            def from_json(cls, jobj):
                return Foo()

        class Bar(JSONDeSerializable):
            def to_partial_json(self):
                return [Foo(), Foo()]

            @classmethod
            def from_json(cls, jobj):
                return Bar()

    """

    @abc.abstractmethod
    def to_partial_json(self):  # pragma: no cover
        """Partially serialize.

        Following the example, **partial serialization** means the following::

            assert isinstance(Bar().to_partial_json()[0], Foo)
            assert isinstance(Bar().to_partial_json()[1], Foo)

            # in particular...
            assert Bar().to_partial_json() != ['foo', 'foo']

        :raises acme.jose.errors.SerializationError:
            in case of any serialization error.
        :returns: Partially serializable object.

        """
        raise NotImplementedError()

    def to_json(self):
        """Fully serialize.

        Again, following the example from before, **full serialization**
        means the following::

            assert Bar().to_json() == ['foo', 'foo']

        :raises acme.jose.errors.SerializationError:
            in case of any serialization error.
        :returns: Fully serialized object.

        """
        def _serialize(obj):
            if isinstance(obj, JSONDeSerializable):
                return _serialize(obj.to_partial_json())
            if isinstance(obj, six.string_types):  # strings are Sequence
                return obj
            elif isinstance(obj, list):
                return [_serialize(subobj) for subobj in obj]
            elif isinstance(obj, collections.Sequence):
                # default to tuple, otherwise Mapping could get
                # unhashable list
                return tuple(_serialize(subobj) for subobj in obj)
            elif isinstance(obj, collections.Mapping):
                return dict((_serialize(key), _serialize(value))
                            for key, value in six.iteritems(obj))
            else:
                return obj

        return _serialize(self)

    @util.abstractclassmethod
    def from_json(cls, jobj):  # pylint: disable=unused-argument
        """Deserialize a decoded JSON document.

        :param jobj: Python object, composed of only other basic data
            types, as decoded from a JSON document. Not necessarily a
            :class:`dict` (as decoded from a "JSON object" document).

        :raises acme.jose.errors.DeserializationError:
            if decoding was unsuccessful, e.g. in case of an unparseable
            X509 certificate, or wrong padding in a JOSE base64 encoded
            string, etc.

        """
        # TypeError: Can't instantiate abstract class <cls> with
        # abstract methods from_json, to_partial_json
        return cls()  # pylint: disable=abstract-class-instantiated

    @classmethod
    def json_loads(cls, json_string):
        """Deserialize from a JSON document string."""
        try:
            loads = json.loads(json_string)
        except ValueError as error:
            raise errors.DeserializationError(error)
        return cls.from_json(loads)

    def json_dumps(self, **kwargs):
        """Dump to a JSON string using the proper serializer.

        :returns: JSON document string.
        :rtype: str

        """
        return json.dumps(self, default=self.json_dump_default, **kwargs)

    def json_dumps_pretty(self):
        """Dump the object to a pretty JSON document string.

        :rtype: str

        """
        return self.json_dumps(sort_keys=True, indent=4, separators=(',', ': '))

    @classmethod
    def json_dump_default(cls, python_object):
        """Serialize a Python object.

        This function is meant to be passed as ``default`` to
        :func:`json.dump` or :func:`json.dumps`. They call
        ``default(python_object)`` only for non-basic Python types, so
        this function necessarily raises :class:`TypeError` if
        ``python_object`` is not an instance of
        :class:`JSONDeSerializable`.

        Please read the class docstring for more information.

        """
        if isinstance(python_object, JSONDeSerializable):
            return python_object.to_partial_json()
        else:  # this branch is necessary, cannot just "return"
            raise TypeError(repr(python_object) + ' is not JSON serializable')
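The point that ``default`` only needs to do *partial* serialization (because `json.dumps` applies it recursively) can be checked with a minimal standalone sketch; `Foo` and `Bar` mirror the docstring's example and none of these names come from a real library:

```python
import json


# Sketch: classes that only know how to *partially* serialize themselves.
class Foo:
    def to_partial_json(self):
        return 'foo'


class Bar:
    def to_partial_json(self):
        # Returns other non-basic objects, not basic JSON types.
        return [Foo(), Foo()]


def json_dump_default(obj):
    # json.dumps calls this only for non-basic types, and applies it
    # recursively, so returning a partially serialized value suffices.
    if hasattr(obj, 'to_partial_json'):
        return obj.to_partial_json()
    raise TypeError(repr(obj) + ' is not JSON serializable')


print(json.dumps(Bar(), default=json_dump_default))  # ["foo", "foo"]
```

`Bar.to_partial_json()` yields a list of `Foo` instances; `json.dumps` then calls `default` again on each `Foo`, producing the fully serialized `["foo", "foo"]`.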
114 acme/acme/jose/interfaces_test.py Normal file
@@ -0,0 +1,114 @@
"""Tests for acme.jose.interfaces."""
import unittest


class JSONDeSerializableTest(unittest.TestCase):
    # pylint: disable=too-many-instance-attributes

    def setUp(self):
        from acme.jose.interfaces import JSONDeSerializable

        # pylint: disable=missing-docstring,invalid-name

        class Basic(JSONDeSerializable):
            def __init__(self, v):
                self.v = v

            def to_partial_json(self):
                return self.v

            @classmethod
            def from_json(cls, jobj):
                return cls(jobj)

        class Sequence(JSONDeSerializable):
            def __init__(self, x, y):
                self.x = x
                self.y = y

            def to_partial_json(self):
                return [self.x, self.y]

            @classmethod
            def from_json(cls, jobj):
                return cls(
                    Basic.from_json(jobj[0]), Basic.from_json(jobj[1]))

        class Mapping(JSONDeSerializable):
            def __init__(self, x, y):
                self.x = x
                self.y = y

            def to_partial_json(self):
                return {self.x: self.y}

            @classmethod
            def from_json(cls, jobj):
                pass  # pragma: no cover

        self.basic1 = Basic('foo1')
        self.basic2 = Basic('foo2')
        self.seq = Sequence(self.basic1, self.basic2)
        self.mapping = Mapping(self.basic1, self.basic2)
        self.nested = Basic([[self.basic1]])
        self.tuple = Basic(('foo',))

        # pylint: disable=invalid-name
        self.Basic = Basic
        self.Sequence = Sequence
        self.Mapping = Mapping

    def test_to_json_sequence(self):
        self.assertEqual(self.seq.to_json(), ['foo1', 'foo2'])

    def test_to_json_mapping(self):
        self.assertEqual(self.mapping.to_json(), {'foo1': 'foo2'})

    def test_to_json_other(self):
        mock_value = object()
        self.assertTrue(self.Basic(mock_value).to_json() is mock_value)

    def test_to_json_nested(self):
        self.assertEqual(self.nested.to_json(), [['foo1']])

    def test_to_json(self):
        self.assertEqual(self.tuple.to_json(), (('foo', )))

    def test_from_json_not_implemented(self):
        from acme.jose.interfaces import JSONDeSerializable
        self.assertRaises(TypeError, JSONDeSerializable.from_json, 'xxx')

    def test_json_loads(self):
        seq = self.Sequence.json_loads('["foo1", "foo2"]')
        self.assertTrue(isinstance(seq, self.Sequence))
        self.assertTrue(isinstance(seq.x, self.Basic))
        self.assertTrue(isinstance(seq.y, self.Basic))
        self.assertEqual(seq.x.v, 'foo1')
        self.assertEqual(seq.y.v, 'foo2')

    def test_json_dumps(self):
        self.assertEqual('["foo1", "foo2"]', self.seq.json_dumps())

    def test_json_dumps_pretty(self):
        self.assertEqual(self.seq.json_dumps_pretty(),
                         '[\n    "foo1",\n    "foo2"\n]')

    def test_json_dump_default(self):
        from acme.jose.interfaces import JSONDeSerializable

        self.assertEqual(
            'foo1', JSONDeSerializable.json_dump_default(self.basic1))

        jobj = JSONDeSerializable.json_dump_default(self.seq)
        self.assertEqual(len(jobj), 2)
        self.assertTrue(jobj[0] is self.basic1)
        self.assertTrue(jobj[1] is self.basic2)

    def test_json_dump_default_type_error(self):
        from acme.jose.interfaces import JSONDeSerializable
        self.assertRaises(
            TypeError, JSONDeSerializable.json_dump_default, object())


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
489 acme/acme/jose/json_util.py Normal file
@@ -0,0 +1,489 @@
"""JSON (de)serialization framework.

The framework presented here is somewhat based on `Go's "json" package`_
(especially the ``omitempty`` functionality).

.. _`Go's "json" package`: http://golang.org/pkg/encoding/json/

"""
import abc
import binascii
import logging

import OpenSSL
import six

from acme.jose import b64
from acme.jose import errors
from acme.jose import interfaces
from acme.jose import util


logger = logging.getLogger(__name__)


class Field(object):
    """JSON object field.

    :class:`Field` is meant to be used together with
    :class:`JSONObjectWithFields`.

    ``encoder`` (``decoder``) is a callable that accepts a single
    parameter, i.e. a value to be encoded (decoded), and returns the
    serialized (deserialized) value. In case of errors it should raise
    :class:`~acme.jose.errors.SerializationError`
    (:class:`~acme.jose.errors.DeserializationError`).

    Note that ``decoder`` should perform partial serialization only.

    :ivar str json_name: Name of the field when encoded to JSON.
    :ivar default: Default value (used when not present in JSON object).
    :ivar bool omitempty: If ``True`` and the field value is empty, then
        it will not be included in the serialized JSON object, and
        ``default`` will be used for deserialization. Otherwise, if ``False``,
        the field is considered required, its value will always be included
        in the serialized JSON object, and it must also be present when
        deserializing.

    """
    __slots__ = ('json_name', 'default', 'omitempty', 'fdec', 'fenc')

    def __init__(self, json_name, default=None, omitempty=False,
                 decoder=None, encoder=None):
        # pylint: disable=too-many-arguments
        self.json_name = json_name
        self.default = default
        self.omitempty = omitempty

        self.fdec = self.default_decoder if decoder is None else decoder
        self.fenc = self.default_encoder if encoder is None else encoder

    @classmethod
    def _empty(cls, value):
        """Is the provided value considered "empty" for this field?

        This is useful for subclasses that might want to override the
        definition of being empty, e.g. for some more exotic data types.

        """
        return not isinstance(value, bool) and not value

    def omit(self, value):
        """Omit the value in output?"""
        return self._empty(value) and self.omitempty

    def _update_params(self, **kwargs):
        current = dict(json_name=self.json_name, default=self.default,
                       omitempty=self.omitempty,
                       decoder=self.fdec, encoder=self.fenc)
        current.update(kwargs)
        return type(self)(**current)  # pylint: disable=star-args

    def decoder(self, fdec):
        """Descriptor to change the decoder on a JSON object field."""
        return self._update_params(decoder=fdec)

    def encoder(self, fenc):
        """Descriptor to change the encoder on a JSON object field."""
        return self._update_params(encoder=fenc)

    def decode(self, value):
        """Decode a value, optionally with context JSON object."""
        return self.fdec(value)

    def encode(self, value):
        """Encode a value, optionally with context JSON object."""
        return self.fenc(value)

    @classmethod
    def default_decoder(cls, value):
        """Default decoder.

        Recursively deserialize into immutable types (
        :class:`acme.jose.util.frozendict` instead of
        :func:`dict`, :func:`tuple` instead of :func:`list`).

        """
        # base cases for different types returned by json.loads
        if isinstance(value, list):
            return tuple(cls.default_decoder(subvalue) for subvalue in value)
        elif isinstance(value, dict):
            return util.frozendict(
                dict((cls.default_decoder(key), cls.default_decoder(value))
                     for key, value in six.iteritems(value)))
        else:  # integer or string
            return value

    @classmethod
    def default_encoder(cls, value):
        """Default (passthrough) encoder."""
        # field.to_partial_json() is no good as the encoder has to do
        # partial serialization only
        return value

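The ``omitempty`` behavior borrowed from Go's "json" package can be sketched standalone; the helper below is hypothetical (it is not the `Field` API) and just illustrates the emptiness test and the drop-on-serialize logic described above:

```python
import json


def fields_to_json(fields, omitempty=()):
    """Serialize a name->value mapping, dropping empty omitempty fields."""
    jobj = {}
    for name, value in fields.items():
        # Same emptiness test as Field._empty above: falsy, but
        # booleans are never considered empty.
        empty = not isinstance(value, bool) and not value
        if empty and name in omitempty:
            continue  # omitted from the output, like Field.omit()
        jobj[name] = value
    return jobj


out = fields_to_json({'Bar': 'bazbar', 'Empty': ''}, omitempty=('Empty',))
print(json.dumps(out, sort_keys=True))  # {"Bar": "bazbar"}
```

A `False` value survives serialization even when its field is marked `omitempty`, which is exactly why `_empty` special-cases `bool`.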
class JSONObjectWithFieldsMeta(abc.ABCMeta):
    """Metaclass for :class:`JSONObjectWithFields` and its subclasses.

    It makes sure that, for any class ``cls`` with ``__metaclass__``
    set to ``JSONObjectWithFieldsMeta``:

    1. All fields (attributes of type :class:`Field`) in the class
       definition are moved to the ``cls._fields`` dictionary, where
       keys are field attribute names and values are fields themselves.

    2. ``cls.__slots__`` is extended by all field attribute names
       (i.e. not :attr:`Field.json_name`). Original ``cls.__slots__``
       are stored in ``cls._orig_slots``.

    As a consequence, for a field attribute name ``some_field``,
    ``cls.some_field`` will be a slot descriptor and not an instance
    of :class:`Field`. For example::

        some_field = Field('someField', default=())

        class Foo(object):
            __metaclass__ = JSONObjectWithFieldsMeta
            __slots__ = ('baz',)
            some_field = some_field

        assert Foo.__slots__ == ('some_field', 'baz')
        assert Foo._orig_slots == ()
        assert Foo.some_field is not Field

        assert Foo._fields.keys() == ['some_field']
        assert Foo._fields['some_field'] is some_field

    As an implementation note, this metaclass inherits from
    :class:`abc.ABCMeta` (and not the usual :class:`type`) to mitigate
    the metaclass conflict (:class:`ImmutableMap` and
    :class:`JSONDeSerializable`, parents of :class:`JSONObjectWithFields`,
    use :class:`abc.ABCMeta` as their metaclass).

    """

    def __new__(mcs, name, bases, dikt):
        fields = {}

        for base in bases:
            fields.update(getattr(base, '_fields', {}))
        # Do not reorder, this class might override fields from base classes!
        for key, value in tuple(six.iteritems(dikt)):
            # not six.iterkeys() (in-place edit!)
            if isinstance(value, Field):
                fields[key] = dikt.pop(key)

        dikt['_orig_slots'] = dikt.get('__slots__', ())
        dikt['__slots__'] = tuple(
            list(dikt['_orig_slots']) + list(six.iterkeys(fields)))
        dikt['_fields'] = fields

        return abc.ABCMeta.__new__(mcs, name, bases, dikt)


@six.add_metaclass(JSONObjectWithFieldsMeta)
class JSONObjectWithFields(util.ImmutableMap, interfaces.JSONDeSerializable):
    # pylint: disable=too-few-public-methods
    """JSON object with fields.

    Example::

        class Foo(JSONObjectWithFields):
            bar = Field('Bar')
            empty = Field('Empty', omitempty=True)

            @bar.encoder
            def bar(value):
                return value + 'bar'

            @bar.decoder
            def bar(value):
                if not value.endswith('bar'):
                    raise errors.DeserializationError('No bar suffix!')
                return value[:-3]

        assert Foo(bar='baz').to_partial_json() == {'Bar': 'bazbar'}
        assert Foo.from_json({'Bar': 'bazbar'}) == Foo(bar='baz')
        assert (Foo.from_json({'Bar': 'bazbar', 'Empty': '!'})
                == Foo(bar='baz', empty='!'))
        assert Foo(bar='baz').bar == 'baz'

    """

    @classmethod
    def _defaults(cls):
        """Get default field values."""
        return dict([(slot, field.default) for slot, field
                     in six.iteritems(cls._fields)])

    def __init__(self, **kwargs):
        # pylint: disable=star-args
        super(JSONObjectWithFields, self).__init__(
            **(dict(self._defaults(), **kwargs)))

    def encode(self, name):
        """Encode a single field.

        :param str name: Name of the field to be encoded.

        :raises errors.SerializationError: if the field cannot be serialized
        :raises errors.Error: if the field could not be found

        """
        try:
            field = self._fields[name]
        except KeyError:
            raise errors.Error("Field not found: {0}".format(name))

        return field.encode(getattr(self, name))

    def fields_to_partial_json(self):
        """Serialize fields to JSON."""
        jobj = {}
        omitted = set()
        for slot, field in six.iteritems(self._fields):
            value = getattr(self, slot)

            if field.omit(value):
                omitted.add((slot, value))
            else:
                try:
                    jobj[field.json_name] = field.encode(value)
                except errors.SerializationError as error:
                    raise errors.SerializationError(
                        'Could not encode {0} ({1}): {2}'.format(
                            slot, value, error))
        if omitted:
            # pylint: disable=star-args
            logger.debug('Omitted empty fields: %s', ', '.join(
                '{0!s}={1!r}'.format(*field) for field in omitted))
        return jobj

    def to_partial_json(self):
        return self.fields_to_partial_json()

    @classmethod
    def _check_required(cls, jobj):
        missing = set()
        for _, field in six.iteritems(cls._fields):
            if not field.omitempty and field.json_name not in jobj:
                missing.add(field.json_name)

        if missing:
            raise errors.DeserializationError(
                'The following fields are required: {0}'.format(
                    ','.join(missing)))

    @classmethod
    def fields_from_json(cls, jobj):
        """Deserialize fields from JSON."""
        cls._check_required(jobj)
        fields = {}
        for slot, field in six.iteritems(cls._fields):
            if field.json_name not in jobj and field.omitempty:
                fields[slot] = field.default
            else:
                value = jobj[field.json_name]
                try:
                    fields[slot] = field.decode(value)
                except errors.DeserializationError as error:
                    raise errors.DeserializationError(
                        'Could not decode {0!r} ({1!r}): {2}'.format(
                            slot, value, error))
        return fields

    @classmethod
    def from_json(cls, jobj):
        return cls(**cls.fields_from_json(jobj))


def encode_b64jose(data):
    """Encode a JOSE Base-64 field.

    :param bytes data:
    :rtype: `unicode`

    """
    # b64encode produces ASCII characters only
    return b64.b64encode(data).decode('ascii')


def decode_b64jose(data, size=None, minimum=False):
    """Decode a JOSE Base-64 field.

    :param unicode data:
    :param int size: Required length (after decoding).
    :param bool minimum: If ``True``, then `size` will be treated as the
        minimum required length, as opposed to exact equality.

    :rtype: bytes

    """
    error_cls = TypeError if six.PY2 else binascii.Error
    try:
        decoded = b64.b64decode(data.encode())
    except error_cls as error:
        raise errors.DeserializationError(error)

    if size is not None and ((not minimum and len(decoded) != size) or
|
||||
(minimum and len(decoded) < size)):
|
||||
raise errors.DeserializationError(
|
||||
"Expected at least or exactly {0} bytes".format(size))
|
||||
|
||||
return decoded
|
||||
|
||||
|
||||
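`encode_b64jose`/`decode_b64jose` above delegate to the package's `b64` module, which applies the unpadded URL-safe Base64 variant that JOSE requires. A minimal stdlib-only sketch of that behavior (the `jose_b64encode`/`jose_b64decode` names are illustrative, not part of this module):

```python
import base64


def jose_b64encode(data):
    """URL-safe Base64 with trailing '=' padding stripped, as JOSE requires."""
    return base64.urlsafe_b64encode(data).rstrip(b'=').decode('ascii')


def jose_b64decode(data):
    """Re-add the stripped padding before decoding."""
    padded = data + '=' * (-len(data) % 4)
    return base64.urlsafe_b64decode(padded.encode('ascii'))


assert jose_b64encode(b'x') == 'eA'  # same value the test suite checks
assert jose_b64decode('eA') == b'x'
assert jose_b64decode(jose_b64encode(b'hello world')) == b'hello world'
```

The same `'eA'`/`'Zm9v'` values appear in the tests for `encode_b64jose` and `decode_b64jose` further down.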
def encode_hex16(value):
    """Hexlify.

    :param bytes value:
    :rtype: unicode

    """
    return binascii.hexlify(value).decode()


def decode_hex16(value, size=None, minimum=False):
    """Decode hexlified field.

    :param unicode value:
    :param int size: Required length (after decoding).
    :param bool minimum: If ``True``, then `size` will be treated as
        minimum required length, as opposed to exact equality.

    :rtype: bytes

    """
    value = value.encode()
    if size is not None and ((not minimum and len(value) != size * 2) or
                             (minimum and len(value) < size * 2)):
        raise errors.DeserializationError()
    error_cls = TypeError if six.PY2 else binascii.Error
    try:
        return binascii.unhexlify(value)
    except error_cls as error:
        raise errors.DeserializationError(error)

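`decode_hex16` compares lengths against `size * 2` because two hex characters encode one byte. The underlying stdlib behavior it wraps, including the odd-length error it converts into a `DeserializationError`, can be sketched as:

```python
import binascii

# One byte becomes two hex characters, hence the size * 2 check above.
assert binascii.hexlify(b'foo') == b'666f6f'
assert binascii.unhexlify(b'666f6f') == b'foo'

# Odd-length input raises binascii.Error (TypeError on Python 2),
# which decode_hex16 re-raises as errors.DeserializationError.
try:
    binascii.unhexlify(b'x')
except (binascii.Error, TypeError):
    pass
else:
    raise AssertionError('expected an error for odd-length input')
```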
def encode_cert(cert):
    """Encode certificate as JOSE Base-64 DER.

    :type cert: `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
    :rtype: unicode

    """
    return encode_b64jose(OpenSSL.crypto.dump_certificate(
        OpenSSL.crypto.FILETYPE_ASN1, cert.wrapped))


def decode_cert(b64der):
    """Decode JOSE Base-64 DER-encoded certificate.

    :param unicode b64der:
    :rtype: `OpenSSL.crypto.X509` wrapped in `.ComparableX509`

    """
    try:
        return util.ComparableX509(OpenSSL.crypto.load_certificate(
            OpenSSL.crypto.FILETYPE_ASN1, decode_b64jose(b64der)))
    except OpenSSL.crypto.Error as error:
        raise errors.DeserializationError(error)


def encode_csr(csr):
    """Encode CSR as JOSE Base-64 DER.

    :type csr: `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`
    :rtype: unicode

    """
    return encode_b64jose(OpenSSL.crypto.dump_certificate_request(
        OpenSSL.crypto.FILETYPE_ASN1, csr.wrapped))


def decode_csr(b64der):
    """Decode JOSE Base-64 DER-encoded CSR.

    :param unicode b64der:
    :rtype: `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`

    """
    try:
        return util.ComparableX509(OpenSSL.crypto.load_certificate_request(
            OpenSSL.crypto.FILETYPE_ASN1, decode_b64jose(b64der)))
    except OpenSSL.crypto.Error as error:
        raise errors.DeserializationError(error)


class TypedJSONObjectWithFields(JSONObjectWithFields):
    """JSON object with type."""

    typ = NotImplemented
    """Type of the object. Subclasses must override."""

    type_field_name = "type"
    """Field name used to distinguish different object types.

    Subclasses will probably have to override this.

    """

    TYPES = NotImplemented
    """Types registered for JSON deserialization"""

    @classmethod
    def register(cls, type_cls, typ=None):
        """Register class for JSON deserialization."""
        typ = type_cls.typ if typ is None else typ
        cls.TYPES[typ] = type_cls
        return type_cls

    @classmethod
    def get_type_cls(cls, jobj):
        """Get the registered class for ``jobj``."""
        if cls in six.itervalues(cls.TYPES):
            if cls.type_field_name not in jobj:
                raise errors.DeserializationError(
                    "Missing type field ({0})".format(cls.type_field_name))
            # cls is already a registered type_cls, force to use it
            # so that, e.g., Revocation.from_json(jobj) fails if
            # jobj["type"] != "revocation".
            return cls

        if not isinstance(jobj, dict):
            raise errors.DeserializationError(
                "{0} is not a dictionary object".format(jobj))
        try:
            typ = jobj[cls.type_field_name]
        except KeyError:
            raise errors.DeserializationError("missing type field")

        try:
            return cls.TYPES[typ]
        except KeyError:
            raise errors.UnrecognizedTypeError(typ, jobj)

    def to_partial_json(self):
        """Get JSON serializable object.

        :returns: Serializable JSON object representing ACME typed object.
            :meth:`validate` will almost certainly not work, due to reasons
            explained in :class:`acme.interfaces.IJSONSerializable`.
        :rtype: dict

        """
        jobj = self.fields_to_partial_json()
        jobj[self.type_field_name] = self.typ
        return jobj

    @classmethod
    def from_json(cls, jobj):
        """Deserialize ACME object from valid JSON object.

        :raises acme.errors.UnrecognizedTypeError: if type
            of the ACME object has not been registered.

        """
        # make sure subclasses don't cause infinite recursive from_json calls
        type_cls = cls.get_type_cls(jobj)
        return type_cls(**type_cls.fields_from_json(jobj))
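`TypedJSONObjectWithFields` dispatches deserialization on a type field through a registry of subclasses populated by the `register` decorator. A stripped-down sketch of that pattern, independent of the acme classes (the `TypedMessage`/`Revocation` names here are illustrative only):

```python
class TypedMessage(object):
    """Base class holding a registry keyed by the 'type' field."""
    TYPES = {}
    typ = NotImplemented

    @classmethod
    def register(cls, type_cls):
        # Used as a class decorator, mirroring TypedJSONObjectWithFields.
        cls.TYPES[type_cls.typ] = type_cls
        return type_cls

    @classmethod
    def from_json(cls, jobj):
        # Dispatch on jobj['type'] to the registered subclass.
        type_cls = cls.TYPES[jobj['type']]
        return type_cls(**{k: v for k, v in jobj.items() if k != 'type'})


@TypedMessage.register
class Revocation(TypedMessage):
    typ = 'revocation'

    def __init__(self, reason):
        self.reason = reason


msg = TypedMessage.from_json({'type': 'revocation', 'reason': 'key compromise'})
assert isinstance(msg, Revocation)
assert msg.reason == 'key compromise'
```

The real implementation adds the safeguards seen above: rejecting non-dict input, reporting a missing type field, and raising `UnrecognizedTypeError` for unregistered types.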
acme/acme/jose/json_util_test.py (new file, 381 lines)
@@ -0,0 +1,381 @@
"""Tests for acme.jose.json_util."""
import itertools
import unittest

import mock
import six

from acme import test_util

from acme.jose import errors
from acme.jose import interfaces
from acme.jose import util


CERT = test_util.load_comparable_cert('cert.pem')
CSR = test_util.load_comparable_csr('csr.pem')


class FieldTest(unittest.TestCase):
    """Tests for acme.jose.json_util.Field."""

    def test_no_omit_boolean(self):
        from acme.jose.json_util import Field
        for default, omitempty, value in itertools.product(
                [True, False], [True, False], [True, False]):
            self.assertFalse(
                Field("foo", default=default, omitempty=omitempty).omit(value))

    def test_descriptors(self):
        mock_value = mock.MagicMock()

        # pylint: disable=missing-docstring

        def decoder(unused_value):
            return 'd'

        def encoder(unused_value):
            return 'e'

        from acme.jose.json_util import Field
        field = Field('foo')

        field = field.encoder(encoder)
        self.assertEqual('e', field.encode(mock_value))

        field = field.decoder(decoder)
        self.assertEqual('e', field.encode(mock_value))
        self.assertEqual('d', field.decode(mock_value))

    def test_default_encoder_is_partial(self):
        class MockField(interfaces.JSONDeSerializable):
            # pylint: disable=missing-docstring
            def to_partial_json(self):
                return 'foo'  # pragma: no cover

            @classmethod
            def from_json(cls, jobj):
                pass  # pragma: no cover
        mock_field = MockField()

        from acme.jose.json_util import Field
        self.assertTrue(Field.default_encoder(mock_field) is mock_field)
        # in particular...
        self.assertNotEqual('foo', Field.default_encoder(mock_field))

    def test_default_encoder_passthrough(self):
        mock_value = mock.MagicMock()
        from acme.jose.json_util import Field
        self.assertTrue(Field.default_encoder(mock_value) is mock_value)

    def test_default_decoder_list_to_tuple(self):
        from acme.jose.json_util import Field
        self.assertEqual((1, 2, 3), Field.default_decoder([1, 2, 3]))

    def test_default_decoder_dict_to_frozendict(self):
        from acme.jose.json_util import Field
        obj = Field.default_decoder({'x': 2})
        self.assertTrue(isinstance(obj, util.frozendict))
        self.assertEqual(obj, util.frozendict(x=2))

    def test_default_decoder_passthrough(self):
        mock_value = mock.MagicMock()
        from acme.jose.json_util import Field
        self.assertTrue(Field.default_decoder(mock_value) is mock_value)


class JSONObjectWithFieldsMetaTest(unittest.TestCase):
    """Tests for acme.jose.json_util.JSONObjectWithFieldsMeta."""

    def setUp(self):
        from acme.jose.json_util import Field
        from acme.jose.json_util import JSONObjectWithFieldsMeta
        self.field = Field('Baz')
        self.field2 = Field('Baz2')
        # pylint: disable=invalid-name,missing-docstring,too-few-public-methods
        # pylint: disable=blacklisted-name

        @six.add_metaclass(JSONObjectWithFieldsMeta)
        class A(object):
            __slots__ = ('bar',)
            baz = self.field

        class B(A):
            pass

        class C(A):
            baz = self.field2

        self.a_cls = A
        self.b_cls = B
        self.c_cls = C

    def test_fields(self):
        # pylint: disable=protected-access,no-member
        self.assertEqual({'baz': self.field}, self.a_cls._fields)
        self.assertEqual({'baz': self.field}, self.b_cls._fields)

    def test_fields_inheritance(self):
        # pylint: disable=protected-access,no-member
        self.assertEqual({'baz': self.field2}, self.c_cls._fields)

    def test_slots(self):
        self.assertEqual(('bar', 'baz'), self.a_cls.__slots__)
        self.assertEqual(('baz',), self.b_cls.__slots__)

    def test_orig_slots(self):
        # pylint: disable=protected-access,no-member
        self.assertEqual(('bar',), self.a_cls._orig_slots)
        self.assertEqual((), self.b_cls._orig_slots)


class JSONObjectWithFieldsTest(unittest.TestCase):
    """Tests for acme.jose.json_util.JSONObjectWithFields."""
    # pylint: disable=protected-access

    def setUp(self):
        from acme.jose.json_util import JSONObjectWithFields
        from acme.jose.json_util import Field

        class MockJSONObjectWithFields(JSONObjectWithFields):
            # pylint: disable=invalid-name,missing-docstring,no-self-argument
            # pylint: disable=too-few-public-methods
            x = Field('x', omitempty=True,
                      encoder=(lambda x: x * 2),
                      decoder=(lambda x: x / 2))
            y = Field('y')
            z = Field('Z')  # on purpose uppercase

            @y.encoder
            def y(value):
                if value == 500:
                    raise errors.SerializationError()
                return value

            @y.decoder
            def y(value):
                if value == 500:
                    raise errors.DeserializationError()
                return value

        # pylint: disable=invalid-name
        self.MockJSONObjectWithFields = MockJSONObjectWithFields
        self.mock = MockJSONObjectWithFields(x=None, y=2, z=3)

    def test_init_defaults(self):
        self.assertEqual(self.mock, self.MockJSONObjectWithFields(y=2, z=3))

    def test_encode(self):
        self.assertEqual(10, self.MockJSONObjectWithFields(
            x=5, y=0, z=0).encode("x"))

    def test_encode_wrong_field(self):
        self.assertRaises(errors.Error, self.mock.encode, 'foo')

    def test_encode_serialization_error_passthrough(self):
        self.assertRaises(
            errors.SerializationError,
            self.MockJSONObjectWithFields(y=500, z=None).encode, "y")

    def test_fields_to_partial_json_omits_empty(self):
        self.assertEqual(self.mock.fields_to_partial_json(), {'y': 2, 'Z': 3})

    def test_fields_from_json_fills_default_for_empty(self):
        self.assertEqual(
            {'x': None, 'y': 2, 'z': 3},
            self.MockJSONObjectWithFields.fields_from_json({'y': 2, 'Z': 3}))

    def test_fields_from_json_fails_on_missing(self):
        self.assertRaises(
            errors.DeserializationError,
            self.MockJSONObjectWithFields.fields_from_json, {'y': 0})
        self.assertRaises(
            errors.DeserializationError,
            self.MockJSONObjectWithFields.fields_from_json, {'Z': 0})
        self.assertRaises(
            errors.DeserializationError,
            self.MockJSONObjectWithFields.fields_from_json, {'x': 0, 'y': 0})
        self.assertRaises(
            errors.DeserializationError,
            self.MockJSONObjectWithFields.fields_from_json, {'x': 0, 'Z': 0})

    def test_fields_to_partial_json_encoder(self):
        self.assertEqual(
            self.MockJSONObjectWithFields(x=1, y=2, z=3).to_partial_json(),
            {'x': 2, 'y': 2, 'Z': 3})

    def test_fields_from_json_decoder(self):
        self.assertEqual(
            {'x': 2, 'y': 2, 'z': 3},
            self.MockJSONObjectWithFields.fields_from_json(
                {'x': 4, 'y': 2, 'Z': 3}))

    def test_fields_to_partial_json_error_passthrough(self):
        self.assertRaises(
            errors.SerializationError, self.MockJSONObjectWithFields(
                x=1, y=500, z=3).to_partial_json)

    def test_fields_from_json_error_passthrough(self):
        self.assertRaises(
            errors.DeserializationError,
            self.MockJSONObjectWithFields.from_json,
            {'x': 4, 'y': 500, 'Z': 3})


class DeEncodersTest(unittest.TestCase):
    def setUp(self):
        self.b64_cert = (
            u'MIIB3jCCAYigAwIBAgICBTkwDQYJKoZIhvcNAQELBQAwdzELMAkGA1UEBhM'
            u'CVVMxETAPBgNVBAgMCE1pY2hpZ2FuMRIwEAYDVQQHDAlBbm4gQXJib3IxKz'
            u'ApBgNVBAoMIlVuaXZlcnNpdHkgb2YgTWljaGlnYW4gYW5kIHRoZSBFRkYxF'
            u'DASBgNVBAMMC2V4YW1wbGUuY29tMB4XDTE0MTIxMTIyMzQ0NVoXDTE0MTIx'
            u'ODIyMzQ0NVowdzELMAkGA1UEBhMCVVMxETAPBgNVBAgMCE1pY2hpZ2FuMRI'
            u'wEAYDVQQHDAlBbm4gQXJib3IxKzApBgNVBAoMIlVuaXZlcnNpdHkgb2YgTW'
            u'ljaGlnYW4gYW5kIHRoZSBFRkYxFDASBgNVBAMMC2V4YW1wbGUuY29tMFwwD'
            u'QYJKoZIhvcNAQEBBQADSwAwSAJBAKx1c7RR7R_drnBSQ_zfx1vQLHUbFLh1'
            u'AQQQ5R8DZUXd36efNK79vukFhN9HFoHZiUvOjm0c-pVE6K-EdE_twuUCAwE'
            u'AATANBgkqhkiG9w0BAQsFAANBAC24z0IdwIVKSlntksllvr6zJepBH5fMnd'
            u'fk3XJp10jT6VE-14KNtjh02a56GoraAvJAT5_H67E8GvJ_ocNnB_o'
        )
        self.b64_csr = (
            u'MIIBXTCCAQcCAQAweTELMAkGA1UEBhMCVVMxETAPBgNVBAgMCE1pY2hpZ2F'
            u'uMRIwEAYDVQQHDAlBbm4gQXJib3IxDDAKBgNVBAoMA0VGRjEfMB0GA1UECw'
            u'wWVW5pdmVyc2l0eSBvZiBNaWNoaWdhbjEUMBIGA1UEAwwLZXhhbXBsZS5jb'
            u'20wXDANBgkqhkiG9w0BAQEFAANLADBIAkEArHVztFHtH92ucFJD_N_HW9As'
            u'dRsUuHUBBBDlHwNlRd3fp580rv2-6QWE30cWgdmJS86ObRz6lUTor4R0T-3'
            u'C5QIDAQABoCkwJwYJKoZIhvcNAQkOMRowGDAWBgNVHREEDzANggtleGFtcG'
            u'xlLmNvbTANBgkqhkiG9w0BAQsFAANBAHJH_O6BtC9aGzEVCMGOZ7z9iIRHW'
            u'Szr9x_bOzn7hLwsbXPAgO1QxEwL-X-4g20Gn9XBE1N9W6HCIEut2d8wACg'
        )

    def test_encode_b64jose(self):
        from acme.jose.json_util import encode_b64jose
        encoded = encode_b64jose(b'x')
        self.assertTrue(isinstance(encoded, six.string_types))
        self.assertEqual(u'eA', encoded)

    def test_decode_b64jose(self):
        from acme.jose.json_util import decode_b64jose
        decoded = decode_b64jose(u'eA')
        self.assertTrue(isinstance(decoded, six.binary_type))
        self.assertEqual(b'x', decoded)

    def test_decode_b64jose_padding_error(self):
        from acme.jose.json_util import decode_b64jose
        self.assertRaises(errors.DeserializationError, decode_b64jose, u'x')

    def test_decode_b64jose_size(self):
        from acme.jose.json_util import decode_b64jose
        self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=3))
        self.assertRaises(
            errors.DeserializationError, decode_b64jose, u'Zm9v', size=2)
        self.assertRaises(
            errors.DeserializationError, decode_b64jose, u'Zm9v', size=4)

    def test_decode_b64jose_minimum_size(self):
        from acme.jose.json_util import decode_b64jose
        self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=3, minimum=True))
        self.assertEqual(b'foo', decode_b64jose(u'Zm9v', size=2, minimum=True))
        self.assertRaises(errors.DeserializationError, decode_b64jose,
                          u'Zm9v', size=4, minimum=True)

    def test_encode_hex16(self):
        from acme.jose.json_util import encode_hex16
        encoded = encode_hex16(b'foo')
        self.assertEqual(u'666f6f', encoded)
        self.assertTrue(isinstance(encoded, six.string_types))

    def test_decode_hex16(self):
        from acme.jose.json_util import decode_hex16
        decoded = decode_hex16(u'666f6f')
        self.assertEqual(b'foo', decoded)
        self.assertTrue(isinstance(decoded, six.binary_type))

    def test_decode_hex16_minimum_size(self):
        from acme.jose.json_util import decode_hex16
        self.assertEqual(b'foo', decode_hex16(u'666f6f', size=3, minimum=True))
        self.assertEqual(b'foo', decode_hex16(u'666f6f', size=2, minimum=True))
        self.assertRaises(errors.DeserializationError, decode_hex16,
                          u'666f6f', size=4, minimum=True)

    def test_decode_hex16_odd_length(self):
        from acme.jose.json_util import decode_hex16
        self.assertRaises(errors.DeserializationError, decode_hex16, u'x')

    def test_encode_cert(self):
        from acme.jose.json_util import encode_cert
        self.assertEqual(self.b64_cert, encode_cert(CERT))

    def test_decode_cert(self):
        from acme.jose.json_util import decode_cert
        cert = decode_cert(self.b64_cert)
        self.assertTrue(isinstance(cert, util.ComparableX509))
        self.assertEqual(cert, CERT)
        self.assertRaises(errors.DeserializationError, decode_cert, u'')

    def test_encode_csr(self):
        from acme.jose.json_util import encode_csr
        self.assertEqual(self.b64_csr, encode_csr(CSR))

    def test_decode_csr(self):
        from acme.jose.json_util import decode_csr
        csr = decode_csr(self.b64_csr)
        self.assertTrue(isinstance(csr, util.ComparableX509))
        self.assertEqual(csr, CSR)
        self.assertRaises(errors.DeserializationError, decode_csr, u'')


class TypedJSONObjectWithFieldsTest(unittest.TestCase):

    def setUp(self):
        from acme.jose.json_util import TypedJSONObjectWithFields

        # pylint: disable=missing-docstring,abstract-method
        # pylint: disable=too-few-public-methods

        class MockParentTypedJSONObjectWithFields(TypedJSONObjectWithFields):
            TYPES = {}
            type_field_name = 'type'

        @MockParentTypedJSONObjectWithFields.register
        class MockTypedJSONObjectWithFields(
                MockParentTypedJSONObjectWithFields):
            typ = 'test'
            __slots__ = ('foo',)

            @classmethod
            def fields_from_json(cls, jobj):
                return {'foo': jobj['foo']}

            def fields_to_partial_json(self):
                return {'foo': self.foo}

        self.parent_cls = MockParentTypedJSONObjectWithFields
        self.msg = MockTypedJSONObjectWithFields(foo='bar')

    def test_to_partial_json(self):
        self.assertEqual(self.msg.to_partial_json(), {
            'type': 'test',
            'foo': 'bar',
        })

    def test_from_json_non_dict_fails(self):
        for value in [[], (), 5, "asd"]:  # all possible input types
            self.assertRaises(
                errors.DeserializationError, self.parent_cls.from_json, value)

    def test_from_json_dict_no_type_fails(self):
        self.assertRaises(
            errors.DeserializationError, self.parent_cls.from_json, {})

    def test_from_json_unknown_type_fails(self):
        self.assertRaises(errors.UnrecognizedTypeError,
                          self.parent_cls.from_json, {'type': 'bar'})

    def test_from_json_returns_obj(self):
        self.assertEqual({'foo': 'bar'}, self.parent_cls.from_json(
            {'type': 'test', 'foo': 'bar'}))


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
acme/acme/jose/jwa.py (new file, 180 lines)
@@ -0,0 +1,180 @@
"""JSON Web Algorithm.

https://tools.ietf.org/html/draft-ietf-jose-json-web-algorithms-40

"""
import abc
import collections
import logging

import cryptography.exceptions
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives import hmac
from cryptography.hazmat.primitives.asymmetric import padding

from acme.jose import errors
from acme.jose import interfaces
from acme.jose import jwk


logger = logging.getLogger(__name__)


class JWA(interfaces.JSONDeSerializable):  # pylint: disable=abstract-method
    # pylint: disable=too-few-public-methods
    # for some reason disable=abstract-method has to be on the line
    # above...
    """JSON Web Algorithm."""


class JWASignature(JWA, collections.Hashable):
    """JSON Web Signature Algorithm."""
    SIGNATURES = {}

    def __init__(self, name):
        self.name = name

    def __eq__(self, other):
        if not isinstance(other, JWASignature):
            return NotImplemented
        return self.name == other.name

    def __hash__(self):
        return hash((self.__class__, self.name))

    def __ne__(self, other):
        return not self == other

    @classmethod
    def register(cls, signature_cls):
        """Register class for JSON deserialization."""
        cls.SIGNATURES[signature_cls.name] = signature_cls
        return signature_cls

    def to_partial_json(self):
        return self.name

    @classmethod
    def from_json(cls, jobj):
        return cls.SIGNATURES[jobj]

    @abc.abstractmethod
    def sign(self, key, msg):  # pragma: no cover
        """Sign the ``msg`` using ``key``."""
        raise NotImplementedError()

    @abc.abstractmethod
    def verify(self, key, msg, sig):  # pragma: no cover
        """Verify the ``msg`` and ``sig`` using ``key``."""
        raise NotImplementedError()

    def __repr__(self):
        return self.name


class _JWAHS(JWASignature):

    kty = jwk.JWKOct

    def __init__(self, name, hash_):
        super(_JWAHS, self).__init__(name)
        self.hash = hash_()

    def sign(self, key, msg):
        signer = hmac.HMAC(key, self.hash, backend=default_backend())
        signer.update(msg)
        return signer.finalize()

    def verify(self, key, msg, sig):
        verifier = hmac.HMAC(key, self.hash, backend=default_backend())
        verifier.update(msg)
        try:
            verifier.verify(sig)
        except cryptography.exceptions.InvalidSignature as error:
            logger.debug(error, exc_info=True)
            return False
        else:
            return True


class _JWARSA(object):

    kty = jwk.JWKRSA
    padding = NotImplemented
    hash = NotImplemented

    def sign(self, key, msg):
        """Sign the ``msg`` using ``key``."""
        try:
            signer = key.signer(self.padding, self.hash)
        except AttributeError as error:
            logger.debug(error, exc_info=True)
            raise errors.Error("Public key cannot be used for signing")
        except ValueError as error:  # digest too large
            logger.debug(error, exc_info=True)
            raise errors.Error(str(error))
        signer.update(msg)
        try:
            return signer.finalize()
        except ValueError as error:
            logger.debug(error, exc_info=True)
            raise errors.Error(str(error))

    def verify(self, key, msg, sig):
        """Verify the ``msg`` and ``sig`` using ``key``."""
        verifier = key.verifier(sig, self.padding, self.hash)
        verifier.update(msg)
        try:
            verifier.verify()
        except cryptography.exceptions.InvalidSignature as error:
            logger.debug(error, exc_info=True)
            return False
        else:
            return True


class _JWARS(_JWARSA, JWASignature):

    def __init__(self, name, hash_):
        super(_JWARS, self).__init__(name)
        self.padding = padding.PKCS1v15()
        self.hash = hash_()


class _JWAPS(_JWARSA, JWASignature):

    def __init__(self, name, hash_):
        super(_JWAPS, self).__init__(name)
        self.padding = padding.PSS(
            mgf=padding.MGF1(hash_()),
            salt_length=padding.PSS.MAX_LENGTH)
        self.hash = hash_()


class _JWAES(JWASignature):  # pylint: disable=abstract-class-not-used

    # TODO: implement ES signatures

    def sign(self, key, msg):  # pragma: no cover
        raise NotImplementedError()

    def verify(self, key, msg, sig):  # pragma: no cover
        raise NotImplementedError()


HS256 = JWASignature.register(_JWAHS('HS256', hashes.SHA256))
HS384 = JWASignature.register(_JWAHS('HS384', hashes.SHA384))
HS512 = JWASignature.register(_JWAHS('HS512', hashes.SHA512))

RS256 = JWASignature.register(_JWARS('RS256', hashes.SHA256))
RS384 = JWASignature.register(_JWARS('RS384', hashes.SHA384))
RS512 = JWASignature.register(_JWARS('RS512', hashes.SHA512))

PS256 = JWASignature.register(_JWAPS('PS256', hashes.SHA256))
PS384 = JWASignature.register(_JWAPS('PS384', hashes.SHA384))
PS512 = JWASignature.register(_JWAPS('PS512', hashes.SHA512))

ES256 = JWASignature.register(_JWAES('ES256'))
ES384 = JWASignature.register(_JWAES('ES384'))
ES512 = JWASignature.register(_JWAES('ES512'))
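`_JWAHS` implements the HS* family on top of the `cryptography` HMAC primitive. The same sign/verify contract, including returning `False` on a bad signature rather than raising, can be sketched with only the standard library (the `hs256_sign`/`hs256_verify` helper names are illustrative, not part of this module):

```python
import hashlib
import hmac


def hs256_sign(key, msg):
    """HMAC-SHA256, mirroring the JWASignature.sign(key, msg) contract."""
    return hmac.new(key, msg, hashlib.sha256).digest()


def hs256_verify(key, msg, sig):
    """Constant-time comparison; returns a bool instead of raising."""
    return hmac.compare_digest(hs256_sign(key, msg), sig)


sig = hs256_sign(b'some key', b'foo')
assert len(sig) == 32                                  # SHA-256 digest size
assert hs256_verify(b'some key', b'foo', sig) is True
assert hs256_verify(b'some key', b'foo', sig + b'!') is False
```

The test file that follows exercises the real `HS256` object with the same `b'some key'`/`b'foo'` inputs.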
acme/acme/jose/jwa_test.py (new file, 104 lines)
@@ -0,0 +1,104 @@
"""Tests for acme.jose.jwa."""
import unittest

from acme import test_util

from acme.jose import errors


RSA256_KEY = test_util.load_rsa_private_key('rsa256_key.pem')
RSA512_KEY = test_util.load_rsa_private_key('rsa512_key.pem')
RSA1024_KEY = test_util.load_rsa_private_key('rsa1024_key.pem')


class JWASignatureTest(unittest.TestCase):
    """Tests for acme.jose.jwa.JWASignature."""

    def setUp(self):
        from acme.jose.jwa import JWASignature

        class MockSig(JWASignature):
            # pylint: disable=missing-docstring,too-few-public-methods
            # pylint: disable=abstract-class-not-used
            def sign(self, key, msg):
                raise NotImplementedError()  # pragma: no cover

            def verify(self, key, msg, sig):
                raise NotImplementedError()  # pragma: no cover

        # pylint: disable=invalid-name
        self.Sig1 = MockSig('Sig1')
        self.Sig2 = MockSig('Sig2')

    def test_eq(self):
        self.assertEqual(self.Sig1, self.Sig1)

    def test_ne(self):
        self.assertNotEqual(self.Sig1, self.Sig2)

    def test_ne_other_type(self):
        self.assertNotEqual(self.Sig1, 5)

    def test_repr(self):
        self.assertEqual('Sig1', repr(self.Sig1))
        self.assertEqual('Sig2', repr(self.Sig2))

    def test_to_partial_json(self):
        self.assertEqual(self.Sig1.to_partial_json(), 'Sig1')
        self.assertEqual(self.Sig2.to_partial_json(), 'Sig2')

    def test_from_json(self):
        from acme.jose.jwa import JWASignature
        from acme.jose.jwa import RS256
        self.assertTrue(JWASignature.from_json('RS256') is RS256)


class JWAHSTest(unittest.TestCase):  # pylint: disable=too-few-public-methods

    def test_it(self):
        from acme.jose.jwa import HS256
        sig = (
            b"\xceR\xea\xcd\x94\xab\xcf\xfb\xe0\xacA.:\x1a'\x08i\xe2\xc4"
            b"\r\x85+\x0e\x85\xaeUZ\xd4\xb3\x97zO"
        )
        self.assertEqual(HS256.sign(b'some key', b'foo'), sig)
        self.assertTrue(HS256.verify(b'some key', b'foo', sig) is True)
        self.assertTrue(HS256.verify(b'some key', b'foo', sig + b'!') is False)


class JWARSTest(unittest.TestCase):

    def test_sign_no_private_part(self):
        from acme.jose.jwa import RS256
        self.assertRaises(
            errors.Error, RS256.sign, RSA512_KEY.public_key(), b'foo')

    def test_sign_key_too_small(self):
        from acme.jose.jwa import RS256
        from acme.jose.jwa import PS256
        self.assertRaises(errors.Error, RS256.sign, RSA256_KEY, b'foo')
        self.assertRaises(errors.Error, PS256.sign, RSA256_KEY, b'foo')

    def test_rs(self):
        from acme.jose.jwa import RS256
        sig = (
            b'|\xc6\xb2\xa4\xab(\x87\x99\xfa*:\xea\xf8\xa0N&}\x9f\x0f\xc0O'
            b'\xc6t\xa3\xe6\xfa\xbb"\x15Y\x80Y\xe0\x81\xb8\x88)\xba\x0c\x9c'
            b'\xa4\x99\x1e\x19&\xd8\xc7\x99S\x97\xfc\x85\x0cOV\xe6\x07\x99'
            b'\xd2\xb9.>}\xfd'
        )
        self.assertEqual(RS256.sign(RSA512_KEY, b'foo'), sig)
        self.assertTrue(RS256.verify(RSA512_KEY.public_key(), b'foo', sig))
        self.assertFalse(RS256.verify(
            RSA512_KEY.public_key(), b'foo', sig + b'!'))

    def test_ps(self):
        from acme.jose.jwa import PS256
        sig = PS256.sign(RSA1024_KEY, b'foo')
        self.assertTrue(PS256.verify(RSA1024_KEY.public_key(), b'foo', sig))
        self.assertFalse(PS256.verify(
            RSA1024_KEY.public_key(), b'foo', sig + b'!'))


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
acme/acme/jose/jwk.py (new file, 281 lines; listing truncated below)
@@ -0,0 +1,281 @@
"""JSON Web Key."""
import abc
import binascii
import json
import logging

import cryptography.exceptions
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric import rsa

import six

from acme.jose import errors
from acme.jose import json_util
from acme.jose import util


logger = logging.getLogger(__name__)


class JWK(json_util.TypedJSONObjectWithFields):
    # pylint: disable=too-few-public-methods
    """JSON Web Key."""
    type_field_name = 'kty'
    TYPES = {}
    cryptography_key_types = ()
    """Subclasses should override."""

    required = NotImplemented
    """Required members of public key's representation as defined by JWK/JWA."""

    _thumbprint_json_dumps_params = {
        # "no whitespace or line breaks before or after any syntactic
        # elements"
        'indent': None,
        'separators': (',', ':'),
        # "members ordered lexicographically by the Unicode [UNICODE]
        # code points of the member names"
        'sort_keys': True,
    }

    def thumbprint(self, hash_function=hashes.SHA256):
        """Compute JWK Thumbprint.

        https://tools.ietf.org/html/rfc7638

        :returns bytes:

        """
        digest = hashes.Hash(hash_function(), backend=default_backend())
        digest.update(json.dumps(
            dict((k, v) for k, v in six.iteritems(self.to_json())
                 if k in self.required),
            **self._thumbprint_json_dumps_params).encode())
        return digest.finalize()

    @abc.abstractmethod
    def public_key(self):  # pragma: no cover
        """Generate JWK with public key.

        For symmetric cryptosystems, this would return ``self``.

        """
        raise NotImplementedError()

    @classmethod
    def _load_cryptography_key(cls, data, password=None, backend=None):
        backend = default_backend() if backend is None else backend
        exceptions = {}

        # private key?
        for loader in (serialization.load_pem_private_key,
                       serialization.load_der_private_key):
            try:
                return loader(data, password, backend)
            except (ValueError, TypeError,
                    cryptography.exceptions.UnsupportedAlgorithm) as error:
                exceptions[loader] = error

        # public key?
        for loader in (serialization.load_pem_public_key,
                       serialization.load_der_public_key):
            try:
                return loader(data, backend)
            except (ValueError,
                    cryptography.exceptions.UnsupportedAlgorithm) as error:
                exceptions[loader] = error

        # no luck
        raise errors.Error('Unable to deserialize key: {0}'.format(exceptions))

    @classmethod
    def load(cls, data, password=None, backend=None):
        """Load serialized key as JWK.

        :param str data: Public or private key serialized as PEM or DER.
        :param str password: Optional password.
        :param backend: A `.PEMSerializationBackend` and
            `.DERSerializationBackend` provider.

        :raises errors.Error: if unable to deserialize, or unsupported
            JWK algorithm

        :returns: JWK of an appropriate type.
        :rtype: `JWK`

        """
        try:
            key = cls._load_cryptography_key(data, password, backend)
        except errors.Error as error:
            logger.debug('Loading symmetric key, asymmetric failed: %s', error)
            return JWKOct(key=data)

        if cls.typ is not NotImplemented and not isinstance(
                key, cls.cryptography_key_types):
            raise errors.Error('Unable to deserialize {0} into {1}'.format(
                key.__class__, cls.__class__))
        for jwk_cls in six.itervalues(cls.TYPES):
            if isinstance(key, jwk_cls.cryptography_key_types):
                return jwk_cls(key=key)
        raise errors.Error('Unsupported algorithm: {0}'.format(key.__class__))


@JWK.register
class JWKES(JWK):  # pragma: no cover
    # pylint: disable=abstract-class-not-used
    """ES JWK.

    .. warning:: This is not yet implemented!

    """
    typ = 'ES'
    cryptography_key_types = (
        ec.EllipticCurvePublicKey, ec.EllipticCurvePrivateKey)
    required = ('crv', JWK.type_field_name, 'x', 'y')

    def fields_to_partial_json(self):
        raise NotImplementedError()

    @classmethod
    def fields_from_json(cls, jobj):
        raise NotImplementedError()

    def public_key(self):
        raise NotImplementedError()


@JWK.register
class JWKOct(JWK):
    """Symmetric JWK."""
    typ = 'oct'
    __slots__ = ('key',)
    required = ('k', JWK.type_field_name)

    def fields_to_partial_json(self):
        # TODO: An "alg" member SHOULD also be present to identify the
        # algorithm intended to be used with the key, unless the
        # application uses another means or convention to determine
        # the algorithm used.
        return {'k': json_util.encode_b64jose(self.key)}

    @classmethod
    def fields_from_json(cls, jobj):
        return cls(key=json_util.decode_b64jose(jobj['k']))

    def public_key(self):
        return self


@JWK.register
class JWKRSA(JWK):
    """RSA JWK.

    :ivar key: `cryptography.hazmat.primitives.rsa.RSAPrivateKey`
        or `cryptography.hazmat.primitives.rsa.RSAPublicKey` wrapped
        in `.ComparableRSAKey`

    """
    typ = 'RSA'
    cryptography_key_types = (rsa.RSAPublicKey, rsa.RSAPrivateKey)
    __slots__ = ('key',)
    required = ('e', JWK.type_field_name, 'n')

    def __init__(self, *args, **kwargs):
        if 'key' in kwargs and not isinstance(
                kwargs['key'], util.ComparableRSAKey):
            kwargs['key'] = util.ComparableRSAKey(kwargs['key'])
        super(JWKRSA, self).__init__(*args, **kwargs)

    @classmethod
    def _encode_param(cls, data):
        """Encode Base64urlUInt.

        :type data: long
        :rtype: unicode

        """
        def _leading_zeros(arg):
            if len(arg) % 2:
                return '0' + arg
            return arg

        return json_util.encode_b64jose(binascii.unhexlify(
            _leading_zeros(hex(data)[2:].rstrip('L'))))

    @classmethod
    def _decode_param(cls, data):
        """Decode Base64urlUInt."""
        try:
            return int(binascii.hexlify(json_util.decode_b64jose(data)), 16)
        except ValueError:  # invalid literal for long() with base 16
            raise errors.DeserializationError()

    def public_key(self):
        return type(self)(key=self.key.public_key())

    @classmethod
    def fields_from_json(cls, jobj):
        # pylint: disable=invalid-name
        n, e = (cls._decode_param(jobj[x]) for x in ('n', 'e'))
        public_numbers = rsa.RSAPublicNumbers(e=e, n=n)
        if 'd' not in jobj:  # public key
            key = public_numbers.public_key(default_backend())
        else:  # private key
            d = cls._decode_param(jobj['d'])
            if ('p' in jobj or 'q' in jobj or 'dp' in jobj or
                    'dq' in jobj or 'qi' in jobj or 'oth' in jobj):
                # "If the producer includes any of the other private
                # key parameters, then all of the others MUST be
                # present, with the exception of "oth", which MUST
                # only be present when more than two prime factors
                # were used."
                p, q, dp, dq, qi, = all_params = tuple(
                    jobj.get(x) for x in ('p', 'q', 'dp', 'dq', 'qi'))
                if tuple(param for param in all_params if param is None):
                    raise errors.Error(
                        'Some private parameters are missing: {0}'.format(
                            all_params))
                p, q, dp, dq, qi = tuple(
                    cls._decode_param(x) for x in all_params)

                # TODO: check for oth
            else:
                # cryptography>=0.8
                p, q = rsa.rsa_recover_prime_factors(n, e, d)
                dp = rsa.rsa_crt_dmp1(d, p)
                dq = rsa.rsa_crt_dmq1(d, q)
                qi = rsa.rsa_crt_iqmp(p, q)

            key = rsa.RSAPrivateNumbers(
                p, q, d, dp, dq, qi, public_numbers).private_key(
                    default_backend())

        return cls(key=key)

    def fields_to_partial_json(self):
        # pylint: disable=protected-access
        if isinstance(self.key._wrapped, rsa.RSAPublicKey):
            numbers = self.key.public_numbers()
            params = {
                'n': numbers.n,
                'e': numbers.e,
            }
        else:  # rsa.RSAPrivateKey
            private = self.key.private_numbers()
            public = self.key.public_key().public_numbers()
            params = {
                'n': public.n,
                'e': public.e,
                'd': private.d,
                'p': private.p,
                'q': private.q,
                'dp': private.dmp1,
                'dq': private.dmq1,
                'qi': private.iqmp,
            }
        return dict((key, self._encode_param(value))
                    for key, value in six.iteritems(params))
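The canonicalization that `JWK.thumbprint` applies (compact separators, lexicographically sorted members, restricted to the `required` fields) can be reproduced with the standard library alone. A minimal sketch, using the go-jose interop vector that appears in `jwk_test.py` below (the helper names here are illustrative, not part of the module):

```python
import hashlib
import json

# Public RSA JWK from the go-jose interop test vector.
jobj = {
    "kty": "RSA",
    "kid": "bilbo.baggins@hobbiton.example",
    "use": "sig",
    "n": "n4EPtAOCc9AlkeQHPzHStgAbgs7bTZLwUBZdR8_KuKPEHLd4rHVTeT-O-XV2jRojdNhxJWTDvNd7nqQ0VEiZQHz_AJmSCpMaJMRBSFKrKb2wqVwGU_NsYOYL-QtiWN2lbzcEe6XC0dApr5ydQLrHqkHHig3RBordaZ6Aj-oBHqFEHYpPe7Tpe-OfVfHd1E6cS6M1FZcD1NNLYD5lFHpPI9bTwJlsde3uhGqC0ZCuEHg8lhzwOHrtIQbS0FVbb9k3-tVTU4fg_3L_vniUFAKwuCLqKnS2BYwdq_mzSnbLY7h_qixoR7jig3__kRhuaxwUkRz5iaiQkqgc5gHdrNP5zw",
    "e": "AQAB",
}

# JWKRSA.required is ('e', 'kty', 'n'); all other members are dropped.
required = ('e', 'kty', 'n')

# RFC 7638: no whitespace, members sorted lexicographically by name.
canonical = json.dumps({k: v for k, v in jobj.items() if k in required},
                       indent=None, separators=(',', ':'), sort_keys=True)

thumbprint = hashlib.sha256(canonical.encode()).hexdigest()
print(thumbprint)
# f63838e96077ad1fc01c3f8405774dedc0641f558ebb4b40dccf5f9b6d66a932
```

This is exactly the digest `test_thumbprint_go_jose` asserts against, so the stdlib recomputation and `JWKRSA.thumbprint()` agree on this vector.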
acme/acme/jose/jwk_test.py | 191 (new file)
@@ -0,0 +1,191 @@
"""Tests for acme.jose.jwk."""
import binascii
import unittest

from acme import test_util

from acme.jose import errors
from acme.jose import json_util
from acme.jose import util


DSA_PEM = test_util.load_vector('dsa512_key.pem')
RSA256_KEY = test_util.load_rsa_private_key('rsa256_key.pem')
RSA512_KEY = test_util.load_rsa_private_key('rsa512_key.pem')


class JWKTest(unittest.TestCase):
    """Tests for acme.jose.jwk.JWK."""

    def test_load(self):
        from acme.jose.jwk import JWK
        self.assertRaises(errors.Error, JWK.load, DSA_PEM)

    def test_load_subclass_wrong_type(self):
        from acme.jose.jwk import JWKRSA
        self.assertRaises(errors.Error, JWKRSA.load, DSA_PEM)


class JWKTestBaseMixin(object):
    """Mixin test for JWK subclass tests."""

    thumbprint = NotImplemented

    def test_thumbprint_private(self):
        self.assertEqual(self.thumbprint, self.jwk.thumbprint())

    def test_thumbprint_public(self):
        self.assertEqual(self.thumbprint, self.jwk.public_key().thumbprint())


class JWKOctTest(unittest.TestCase, JWKTestBaseMixin):
    """Tests for acme.jose.jwk.JWKOct."""

    thumbprint = (b"\xf3\xe7\xbe\xa8`\xd2\xdap\xe9}\x9c\xce>"
                  b"\xd0\xfcI\xbe\xcd\x92'\xd4o\x0e\xf41\xea"
                  b"\x8e(\x8a\xb2i\x1c")

    def setUp(self):
        from acme.jose.jwk import JWKOct
        self.jwk = JWKOct(key=b'foo')
        self.jobj = {'kty': 'oct', 'k': json_util.encode_b64jose(b'foo')}

    def test_to_partial_json(self):
        self.assertEqual(self.jwk.to_partial_json(), self.jobj)

    def test_from_json(self):
        from acme.jose.jwk import JWKOct
        self.assertEqual(self.jwk, JWKOct.from_json(self.jobj))

    def test_from_json_hashable(self):
        from acme.jose.jwk import JWKOct
        hash(JWKOct.from_json(self.jobj))

    def test_load(self):
        from acme.jose.jwk import JWKOct
        self.assertEqual(self.jwk, JWKOct.load(b'foo'))

    def test_public_key(self):
        self.assertTrue(self.jwk.public_key() is self.jwk)


class JWKRSATest(unittest.TestCase, JWKTestBaseMixin):
    """Tests for acme.jose.jwk.JWKRSA."""
    # pylint: disable=too-many-instance-attributes

    thumbprint = (b'\x83K\xdc#3\x98\xca\x98\xed\xcb\x80\x80<\x0c'
                  b'\xf0\x95\xb9H\xb2*l\xbd$\xe5&|O\x91\xd4 \xb0Y')

    def setUp(self):
        from acme.jose.jwk import JWKRSA
        self.jwk256 = JWKRSA(key=RSA256_KEY.public_key())
        self.jwk256json = {
            'kty': 'RSA',
            'e': 'AQAB',
            'n': 'm2Fylv-Uz7trgTW8EBHP3FQSMeZs2GNQ6VRo1sIVJEk',
        }
        # pylint: disable=protected-access
        self.jwk256_not_comparable = JWKRSA(
            key=RSA256_KEY.public_key()._wrapped)
        self.jwk512 = JWKRSA(key=RSA512_KEY.public_key())
        self.jwk512json = {
            'kty': 'RSA',
            'e': 'AQAB',
            'n': 'rHVztFHtH92ucFJD_N_HW9AsdRsUuHUBBBDlHwNlRd3fp5'
                 '80rv2-6QWE30cWgdmJS86ObRz6lUTor4R0T-3C5Q',
        }
        self.private = JWKRSA(key=RSA256_KEY)
        self.private_json_small = self.jwk256json.copy()
        self.private_json_small['d'] = (
            'lPQED_EPTV0UIBfNI3KP2d9Jlrc2mrMllmf946bu-CE')
        self.private_json = self.jwk256json.copy()
        self.private_json.update({
            'd': 'lPQED_EPTV0UIBfNI3KP2d9Jlrc2mrMllmf946bu-CE',
            'p': 'zUVNZn4lLLBD1R6NE8TKNQ',
            'q': 'wcfKfc7kl5jfqXArCRSURQ',
            'dp': 'CWJFq43QvT5Bm5iN8n1okQ',
            'dq': 'bHh2u7etM8LKKCF2pY2UdQ',
            'qi': 'oi45cEkbVoJjAbnQpFY87Q',
        })
        self.jwk = self.private

    def test_init_auto_comparable(self):
        self.assertTrue(isinstance(
            self.jwk256_not_comparable.key, util.ComparableRSAKey))
        self.assertEqual(self.jwk256, self.jwk256_not_comparable)

    def test_encode_param_zero(self):
        from acme.jose.jwk import JWKRSA
        # pylint: disable=protected-access
        # TODO: move encode/decode _param to separate class
        self.assertEqual('AA', JWKRSA._encode_param(0))

    def test_equals(self):
        self.assertEqual(self.jwk256, self.jwk256)
        self.assertEqual(self.jwk512, self.jwk512)

    def test_not_equals(self):
        self.assertNotEqual(self.jwk256, self.jwk512)
        self.assertNotEqual(self.jwk512, self.jwk256)

    def test_load(self):
        from acme.jose.jwk import JWKRSA
        self.assertEqual(self.private, JWKRSA.load(
            test_util.load_vector('rsa256_key.pem')))

    def test_public_key(self):
        self.assertEqual(self.jwk256, self.private.public_key())

    def test_to_partial_json(self):
        self.assertEqual(self.jwk256.to_partial_json(), self.jwk256json)
        self.assertEqual(self.jwk512.to_partial_json(), self.jwk512json)
        self.assertEqual(self.private.to_partial_json(), self.private_json)

    def test_from_json(self):
        from acme.jose.jwk import JWK
        self.assertEqual(
            self.jwk256, JWK.from_json(self.jwk256json))
        self.assertEqual(
            self.jwk512, JWK.from_json(self.jwk512json))
        self.assertEqual(self.private, JWK.from_json(self.private_json))

    def test_from_json_private_small(self):
        from acme.jose.jwk import JWK
        self.assertEqual(self.private, JWK.from_json(self.private_json_small))

    def test_from_json_missing_one_additional(self):
        from acme.jose.jwk import JWK
        del self.private_json['q']
        self.assertRaises(errors.Error, JWK.from_json, self.private_json)

    def test_from_json_hashable(self):
        from acme.jose.jwk import JWK
        hash(JWK.from_json(self.jwk256json))

    def test_from_json_non_schema_errors(self):
        # valid against schema, but still failing
        from acme.jose.jwk import JWK
        self.assertRaises(errors.DeserializationError, JWK.from_json,
                          {'kty': 'RSA', 'e': 'AQAB', 'n': ''})
        self.assertRaises(errors.DeserializationError, JWK.from_json,
                          {'kty': 'RSA', 'e': 'AQAB', 'n': '1'})

    def test_thumbprint_go_jose(self):
        # https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk.go#L155
        # https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk_test.go#L331-L344
        # https://github.com/square/go-jose/blob/4ddd71883fa547d37fbf598071f04512d8bafee3/jwk_test.go#L384
        from acme.jose.jwk import JWKRSA
        key = JWKRSA.json_loads("""{
            "kty": "RSA",
            "kid": "bilbo.baggins@hobbiton.example",
            "use": "sig",
            "n": "n4EPtAOCc9AlkeQHPzHStgAbgs7bTZLwUBZdR8_KuKPEHLd4rHVTeT-O-XV2jRojdNhxJWTDvNd7nqQ0VEiZQHz_AJmSCpMaJMRBSFKrKb2wqVwGU_NsYOYL-QtiWN2lbzcEe6XC0dApr5ydQLrHqkHHig3RBordaZ6Aj-oBHqFEHYpPe7Tpe-OfVfHd1E6cS6M1FZcD1NNLYD5lFHpPI9bTwJlsde3uhGqC0ZCuEHg8lhzwOHrtIQbS0FVbb9k3-tVTU4fg_3L_vniUFAKwuCLqKnS2BYwdq_mzSnbLY7h_qixoR7jig3__kRhuaxwUkRz5iaiQkqgc5gHdrNP5zw",
            "e": "AQAB"
        }""")
        self.assertEqual(
            binascii.hexlify(key.thumbprint()),
            b"f63838e96077ad1fc01c3f8405774dedc0641f558ebb4b40dccf5f9b6d66a932")


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
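`JWKRSA._encode_param` and `_decode_param` above implement the JWA Base64urlUInt representation: a big-endian octet string, base64url-encoded with padding stripped. A stdlib-only sketch of the same round trip (the helper names are illustrative, not part of the module):

```python
import base64
import binascii

def encode_b64uint(value):
    """Integer -> unpadded base64url of its big-endian octets."""
    hexed = hex(value)[2:].rstrip('L')  # 'L' suffix existed on Python 2 longs
    if len(hexed) % 2:                  # unhexlify needs an even digit count
        hexed = '0' + hexed
    return base64.urlsafe_b64encode(
        binascii.unhexlify(hexed)).rstrip(b'=').decode()

def decode_b64uint(data):
    """Unpadded base64url -> integer, as in JWKRSA._decode_param."""
    padded = data + '=' * (-len(data) % 4)  # restore stripped padding
    return int(binascii.hexlify(base64.urlsafe_b64decode(padded)), 16)

print(encode_b64uint(65537))   # 'AQAB' -- the usual RSA public exponent 'e'
print(decode_b64uint('AQAB'))  # 65537
print(encode_b64uint(0))       # 'AA'   -- matches test_encode_param_zero
```

The odd-length zero-padding step mirrors `_leading_zeros` in `jwk.py`; without it `binascii.unhexlify` rejects values whose hex form has an odd digit count.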
acme/acme/jose/jws.py | 432 (new file)
@@ -0,0 +1,432 @@
"""JOSE Web Signature."""
import argparse
import base64
import sys

import OpenSSL
import six

from acme.jose import b64
from acme.jose import errors
from acme.jose import json_util
from acme.jose import jwa
from acme.jose import jwk
from acme.jose import util


class MediaType(object):
    """MediaType field encoder/decoder."""

    PREFIX = 'application/'
    """MIME Media Type and Content Type prefix."""

    @classmethod
    def decode(cls, value):
        """Decoder."""
        # 4.1.10
        if '/' not in value:
            if ';' in value:
                raise errors.DeserializationError('Unexpected semi-colon')
            return cls.PREFIX + value
        return value

    @classmethod
    def encode(cls, value):
        """Encoder."""
        # 4.1.10
        if ';' not in value:
            assert value.startswith(cls.PREFIX)
            return value[len(cls.PREFIX):]
        return value


class Header(json_util.JSONObjectWithFields):
    """JOSE Header.

    .. warning:: This class supports **only** Registered Header
        Parameter Names (as defined in section 4.1 of the
        protocol). If you need Public Header Parameter Names (4.2)
        or Private Header Parameter Names (4.3), you must subclass
        and override :meth:`from_json` and :meth:`to_partial_json`
        appropriately.

    .. warning:: This class does not support any extensions through
        the "crit" (Critical) Header Parameter (4.1.11) and as a
        conforming implementation, :meth:`from_json` treats its
        occurrence as an error. Please subclass if you need
        different behaviour.

    :ivar x5tS256: "x5t#S256"
    :ivar str typ: MIME Media Type, inc. :const:`MediaType.PREFIX`.
    :ivar str cty: Content-Type, inc. :const:`MediaType.PREFIX`.

    """
    alg = json_util.Field(
        'alg', decoder=jwa.JWASignature.from_json, omitempty=True)
    jku = json_util.Field('jku', omitempty=True)
    jwk = json_util.Field('jwk', decoder=jwk.JWK.from_json, omitempty=True)
    kid = json_util.Field('kid', omitempty=True)
    x5u = json_util.Field('x5u', omitempty=True)
    x5c = json_util.Field('x5c', omitempty=True, default=())
    x5t = json_util.Field(
        'x5t', decoder=json_util.decode_b64jose, omitempty=True)
    x5tS256 = json_util.Field(
        'x5t#S256', decoder=json_util.decode_b64jose, omitempty=True)
    typ = json_util.Field('typ', encoder=MediaType.encode,
                          decoder=MediaType.decode, omitempty=True)
    cty = json_util.Field('cty', encoder=MediaType.encode,
                          decoder=MediaType.decode, omitempty=True)
    crit = json_util.Field('crit', omitempty=True, default=())

    def not_omitted(self):
        """Fields that would not be omitted in the JSON object."""
        return dict((name, getattr(self, name))
                    for name, field in six.iteritems(self._fields)
                    if not field.omit(getattr(self, name)))

    def __add__(self, other):
        if not isinstance(other, type(self)):
            raise TypeError('Header cannot be added to: {0}'.format(
                type(other)))

        not_omitted_self = self.not_omitted()
        not_omitted_other = other.not_omitted()

        if set(not_omitted_self).intersection(not_omitted_other):
            raise TypeError('Addition of overlapping headers not defined')

        not_omitted_self.update(not_omitted_other)
        return type(self)(**not_omitted_self)  # pylint: disable=star-args

    def find_key(self):
        """Find key based on header.

        .. todo:: Supports only "jwk" header parameter lookup.

        :returns: (Public) key found in the header.
        :rtype: .JWK

        :raises acme.jose.errors.Error: if key could not be found

        """
        if self.jwk is None:
            raise errors.Error('No key found')
        return self.jwk

    @crit.decoder
    def crit(unused_value):
        # pylint: disable=missing-docstring,no-self-argument,no-self-use
        raise errors.DeserializationError(
            '"crit" is not supported, please subclass')

    # x5c does NOT use JOSE Base64 (4.1.6)

    @x5c.encoder
    def x5c(value):  # pylint: disable=missing-docstring,no-self-argument
        return [base64.b64encode(OpenSSL.crypto.dump_certificate(
            OpenSSL.crypto.FILETYPE_ASN1, cert.wrapped)) for cert in value]

    @x5c.decoder
    def x5c(value):  # pylint: disable=missing-docstring,no-self-argument
        try:
            return tuple(util.ComparableX509(OpenSSL.crypto.load_certificate(
                OpenSSL.crypto.FILETYPE_ASN1,
                base64.b64decode(cert))) for cert in value)
        except OpenSSL.crypto.Error as error:
            raise errors.DeserializationError(error)


class Signature(json_util.JSONObjectWithFields):
    """JWS Signature.

    :ivar combined: Combined Header (protected and unprotected,
        :class:`Header`).
    :ivar unicode protected: JWS protected header (JOSE Base-64 decoded).
    :ivar header: JWS Unprotected Header (:class:`Header`).
    :ivar str signature: The signature.

    """
    header_cls = Header

    __slots__ = ('combined',)
    protected = json_util.Field('protected', omitempty=True, default='')
    header = json_util.Field(
        'header', omitempty=True, default=header_cls(),
        decoder=header_cls.from_json)
    signature = json_util.Field(
        'signature', decoder=json_util.decode_b64jose,
        encoder=json_util.encode_b64jose)

    @protected.encoder
    def protected(value):  # pylint: disable=missing-docstring,no-self-argument
        # wrong type guess (Signature, not bytes) | pylint: disable=no-member
        return json_util.encode_b64jose(value.encode('utf-8'))

    @protected.decoder
    def protected(value):  # pylint: disable=missing-docstring,no-self-argument
        return json_util.decode_b64jose(value).decode('utf-8')

    def __init__(self, **kwargs):
        if 'combined' not in kwargs:
            kwargs = self._with_combined(kwargs)
        super(Signature, self).__init__(**kwargs)
        assert self.combined.alg is not None

    @classmethod
    def _with_combined(cls, kwargs):
        assert 'combined' not in kwargs
        header = kwargs.get('header', cls._fields['header'].default)
        protected = kwargs.get('protected', cls._fields['protected'].default)

        if protected:
            combined = header + cls.header_cls.json_loads(protected)
        else:
            combined = header

        kwargs['combined'] = combined
        return kwargs

    @classmethod
    def _msg(cls, protected, payload):
        return (b64.b64encode(protected.encode('utf-8')) + b'.' +
                b64.b64encode(payload))

    def verify(self, payload, key=None):
        """Verify.

        :param JWK key: Key used for verification.

        """
        key = self.combined.find_key() if key is None else key
        return self.combined.alg.verify(
            key=key.key, sig=self.signature,
            msg=self._msg(self.protected, payload))

    @classmethod
    def sign(cls, payload, key, alg, include_jwk=True,
             protect=frozenset(), **kwargs):
        """Sign.

        :param JWK key: Key for signature.

        """
        assert isinstance(key, alg.kty)

        header_params = kwargs
        header_params['alg'] = alg
        if include_jwk:
            header_params['jwk'] = key.public_key()

        assert set(header_params).issubset(cls.header_cls._fields)
        assert protect.issubset(cls.header_cls._fields)

        protected_params = {}
        for header in protect:
            protected_params[header] = header_params.pop(header)
        if protected_params:
            # pylint: disable=star-args
            protected = cls.header_cls(**protected_params).json_dumps()
        else:
            protected = ''

        header = cls.header_cls(**header_params)  # pylint: disable=star-args
        signature = alg.sign(key.key, cls._msg(protected, payload))

        return cls(protected=protected, header=header, signature=signature)

    def fields_to_partial_json(self):
        fields = super(Signature, self).fields_to_partial_json()
        if not fields['header'].not_omitted():
            del fields['header']
        return fields

    @classmethod
    def fields_from_json(cls, jobj):
        fields = super(Signature, cls).fields_from_json(jobj)
        fields_with_combined = cls._with_combined(fields)
        if 'alg' not in fields_with_combined['combined'].not_omitted():
            raise errors.DeserializationError('alg not present')
        return fields_with_combined


class JWS(json_util.JSONObjectWithFields):
    """JSON Web Signature.

    :ivar str payload: JWS Payload.
    :ivar signatures: JWS Signatures.

    """
    __slots__ = ('payload', 'signatures')

    signature_cls = Signature

    def verify(self, key=None):
        """Verify."""
        return all(sig.verify(self.payload, key) for sig in self.signatures)

    @classmethod
    def sign(cls, payload, **kwargs):
        """Sign."""
        return cls(payload=payload, signatures=(
            cls.signature_cls.sign(payload=payload, **kwargs),))

    @property
    def signature(self):
        """Get a singleton signature.

        :rtype: `signature_cls`

        """
        assert len(self.signatures) == 1
        return self.signatures[0]

    def to_compact(self):
        """Compact serialization.

        :rtype: bytes

        """
        assert len(self.signatures) == 1

        assert 'alg' not in self.signature.header.not_omitted()
        # ... it must be in protected

        return (
            b64.b64encode(self.signature.protected.encode('utf-8')) +
            b'.' +
            b64.b64encode(self.payload) +
            b'.' +
            b64.b64encode(self.signature.signature))

    @classmethod
    def from_compact(cls, compact):
        """Compact deserialization.

        :param bytes compact:

        """
        try:
            protected, payload, signature = compact.split(b'.')
        except ValueError:
            raise errors.DeserializationError(
                'Compact JWS serialization should comprise of exactly'
                ' 3 dot-separated components')

        sig = cls.signature_cls(
            protected=b64.b64decode(protected).decode('utf-8'),
            signature=b64.b64decode(signature))
        return cls(payload=b64.b64decode(payload), signatures=(sig,))

    def to_partial_json(self, flat=True):  # pylint: disable=arguments-differ
        assert self.signatures
        payload = json_util.encode_b64jose(self.payload)

        if flat and len(self.signatures) == 1:
            ret = self.signatures[0].to_partial_json()
            ret['payload'] = payload
            return ret
        else:
            return {
                'payload': payload,
                'signatures': self.signatures,
            }

    @classmethod
    def from_json(cls, jobj):
        if 'signature' in jobj and 'signatures' in jobj:
            raise errors.DeserializationError('Flat mixed with non-flat')
        elif 'signature' in jobj:  # flat
            return cls(payload=json_util.decode_b64jose(jobj.pop('payload')),
                       signatures=(cls.signature_cls.from_json(jobj),))
        else:
            return cls(payload=json_util.decode_b64jose(jobj['payload']),
                       signatures=tuple(cls.signature_cls.from_json(sig)
                                        for sig in jobj['signatures']))


class CLI(object):
    """JWS CLI."""

    @classmethod
    def sign(cls, args):
        """Sign."""
        key = args.alg.kty.load(args.key.read())
        args.key.close()
        if args.protect is None:
            args.protect = []
        if args.compact:
            args.protect.append('alg')

        sig = JWS.sign(payload=sys.stdin.read().encode(), key=key, alg=args.alg,
                       protect=set(args.protect))

        if args.compact:
            six.print_(sig.to_compact().decode('utf-8'))
        else:  # JSON
            six.print_(sig.json_dumps_pretty())

    @classmethod
    def verify(cls, args):
        """Verify."""
        if args.compact:
            sig = JWS.from_compact(sys.stdin.read().encode())
        else:  # JSON
            try:
                sig = JWS.json_loads(sys.stdin.read())
            except errors.Error as error:
                six.print_(error)
                return -1

        if args.key is not None:
            assert args.kty is not None
            key = args.kty.load(args.key.read()).public_key()
            args.key.close()
        else:
            key = None

        sys.stdout.write(sig.payload)
        return not sig.verify(key=key)

    @classmethod
    def _alg_type(cls, arg):
        return jwa.JWASignature.from_json(arg)

    @classmethod
    def _header_type(cls, arg):
        assert arg in Signature.header_cls._fields
        return arg

    @classmethod
    def _kty_type(cls, arg):
        assert arg in jwk.JWK.TYPES
        return jwk.JWK.TYPES[arg]

    @classmethod
    def run(cls, args=sys.argv[1:]):
        """Parse arguments and sign/verify."""
        parser = argparse.ArgumentParser()
        parser.add_argument('--compact', action='store_true')

        subparsers = parser.add_subparsers()
        parser_sign = subparsers.add_parser('sign')
        parser_sign.set_defaults(func=cls.sign)
        parser_sign.add_argument(
            '-k', '--key', type=argparse.FileType('rb'), required=True)
        parser_sign.add_argument(
            '-a', '--alg', type=cls._alg_type, default=jwa.RS256)
        parser_sign.add_argument(
            '-p', '--protect', action='append', type=cls._header_type)

        parser_verify = subparsers.add_parser('verify')
        parser_verify.set_defaults(func=cls.verify)
        parser_verify.add_argument(
            '-k', '--key', type=argparse.FileType('rb'), required=False)
        parser_verify.add_argument(
            '--kty', type=cls._kty_type, required=False)

        parsed = parser.parse_args(args)
        return parsed.func(parsed)


if __name__ == '__main__':
    exit(CLI.run())  # pragma: no cover
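`JWS.to_compact` and `JWS.from_compact` above only frame three unpadded-base64url components with dots; the signing itself happens elsewhere. A stdlib-only sketch of that framing (the signature bytes are a placeholder, and the helper names are illustrative, not the module's API):

```python
import base64

def jose_b64encode(data):
    """Unpadded base64url, as acme.jose.b64.b64encode behaves."""
    return base64.urlsafe_b64encode(data).rstrip(b'=')

def jose_b64decode(data):
    return base64.urlsafe_b64decode(data + b'=' * (-len(data) % 4))

def to_compact(protected, payload, signature):
    # BASE64URL(protected) '.' BASE64URL(payload) '.' BASE64URL(signature)
    return (jose_b64encode(protected.encode('utf-8')) + b'.' +
            jose_b64encode(payload) + b'.' +
            jose_b64encode(signature))

def from_compact(compact):
    try:
        protected, payload, signature = compact.split(b'.')
    except ValueError:
        raise ValueError('Compact JWS serialization should comprise'
                         ' exactly 3 dot-separated components')
    return (jose_b64decode(protected).decode('utf-8'),
            jose_b64decode(payload),
            jose_b64decode(signature))

token = to_compact('{"alg":"RS256"}', b'foo', b'fake-signature')
print(from_compact(token))
# ('{"alg":"RS256"}', b'foo', b'fake-signature')
```

Because the base64url alphabet never contains `.`, splitting on the dot is unambiguous, which is why `from_compact` can rely on a plain `split(b'.')`.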
acme/acme/jose/jws_test.py | 239 (new file)
@@ -0,0 +1,239 @@
"""Tests for acme.jose.jws."""
import base64
import unittest

import mock
import OpenSSL

from acme import test_util

from acme.jose import errors
from acme.jose import json_util
from acme.jose import jwa
from acme.jose import jwk


CERT = test_util.load_comparable_cert('cert.pem')
KEY = jwk.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))


class MediaTypeTest(unittest.TestCase):
    """Tests for acme.jose.jws.MediaType."""

    def test_decode(self):
        from acme.jose.jws import MediaType
        self.assertEqual('application/app', MediaType.decode('application/app'))
        self.assertEqual('application/app', MediaType.decode('app'))
        self.assertRaises(
            errors.DeserializationError, MediaType.decode, 'app;foo')

    def test_encode(self):
        from acme.jose.jws import MediaType
        self.assertEqual('app', MediaType.encode('application/app'))
        self.assertEqual('application/app;foo',
                         MediaType.encode('application/app;foo'))


class HeaderTest(unittest.TestCase):
    """Tests for acme.jose.jws.Header."""

    def setUp(self):
        from acme.jose.jws import Header
        self.header1 = Header(jwk='foo')
        self.header2 = Header(jwk='bar')
        self.crit = Header(crit=('a', 'b'))
        self.empty = Header()

    def test_add_non_empty(self):
        from acme.jose.jws import Header
        self.assertEqual(Header(jwk='foo', crit=('a', 'b')),
                         self.header1 + self.crit)

    def test_add_empty(self):
        self.assertEqual(self.header1, self.header1 + self.empty)
        self.assertEqual(self.header1, self.empty + self.header1)

    def test_add_overlapping_error(self):
        self.assertRaises(TypeError, self.header1.__add__, self.header2)

    def test_add_wrong_type_error(self):
        self.assertRaises(TypeError, self.header1.__add__, 'xxx')

    def test_crit_decode_always_errors(self):
        from acme.jose.jws import Header
        self.assertRaises(errors.DeserializationError, Header.from_json,
                          {'crit': ['a', 'b']})

    def test_x5c_decoding(self):
        from acme.jose.jws import Header
        header = Header(x5c=(CERT, CERT))
        jobj = header.to_partial_json()
        cert_asn1 = OpenSSL.crypto.dump_certificate(
            OpenSSL.crypto.FILETYPE_ASN1, CERT.wrapped)
        cert_b64 = base64.b64encode(cert_asn1)
        self.assertEqual(jobj, {'x5c': [cert_b64, cert_b64]})
        self.assertEqual(header, Header.from_json(jobj))
        jobj['x5c'][0] = base64.b64encode(b'xxx' + cert_asn1)
        self.assertRaises(errors.DeserializationError, Header.from_json, jobj)

    def test_find_key(self):
        self.assertEqual('foo', self.header1.find_key())
        self.assertEqual('bar', self.header2.find_key())
        self.assertRaises(errors.Error, self.crit.find_key)


class SignatureTest(unittest.TestCase):
    """Tests for acme.jose.jws.Signature."""

    def test_from_json(self):
        from acme.jose.jws import Header
        from acme.jose.jws import Signature
        self.assertEqual(
            Signature(signature=b'foo', header=Header(alg=jwa.RS256)),
            Signature.from_json(
                {'signature': 'Zm9v', 'header': {'alg': 'RS256'}}))

    def test_from_json_no_alg_error(self):
        from acme.jose.jws import Signature
        self.assertRaises(errors.DeserializationError,
                          Signature.from_json, {'signature': 'foo'})


class JWSTest(unittest.TestCase):
    """Tests for acme.jose.jws.JWS."""

    def setUp(self):
        self.privkey = KEY
        self.pubkey = self.privkey.public_key()

        from acme.jose.jws import JWS
        self.unprotected = JWS.sign(
            payload=b'foo', key=self.privkey, alg=jwa.RS256)
        self.protected = JWS.sign(
            payload=b'foo', key=self.privkey, alg=jwa.RS256,
            protect=frozenset(['jwk', 'alg']))
        self.mixed = JWS.sign(
            payload=b'foo', key=self.privkey, alg=jwa.RS256,
            protect=frozenset(['alg']))

    def test_pubkey_jwk(self):
        self.assertEqual(self.unprotected.signature.combined.jwk, self.pubkey)
        self.assertEqual(self.protected.signature.combined.jwk, self.pubkey)
        self.assertEqual(self.mixed.signature.combined.jwk, self.pubkey)

    def test_sign_unprotected(self):
        self.assertTrue(self.unprotected.verify())

    def test_sign_protected(self):
        self.assertTrue(self.protected.verify())

    def test_sign_mixed(self):
        self.assertTrue(self.mixed.verify())

    def test_compact_lost_unprotected(self):
        compact = self.mixed.to_compact()
        self.assertEqual(
            b'eyJhbGciOiAiUlMyNTYifQ.Zm9v.OHdxFVj73l5LpxbFp1AmYX4yJM0Pyb'
            b'_893n1zQjpim_eLS5J1F61lkvrCrCDErTEJnBGOGesJ72M7b6Ve1cAJA',
            compact)

        from acme.jose.jws import JWS
        mixed = JWS.from_compact(compact)

        self.assertNotEqual(self.mixed, mixed)
        self.assertEqual(
            set(['alg']), set(mixed.signature.combined.not_omitted()))

    def test_from_compact_missing_components(self):
        from acme.jose.jws import JWS
        self.assertRaises(errors.DeserializationError, JWS.from_compact, b'.')

    def test_json_omitempty(self):
        protected_jobj = self.protected.to_partial_json(flat=True)
        unprotected_jobj = self.unprotected.to_partial_json(flat=True)

        self.assertTrue('protected' not in unprotected_jobj)
        self.assertTrue('header' not in protected_jobj)

        unprotected_jobj['header'] = unprotected_jobj['header'].to_json()

        from acme.jose.jws import JWS
        self.assertEqual(JWS.from_json(protected_jobj), self.protected)
        self.assertEqual(JWS.from_json(unprotected_jobj), self.unprotected)

    def test_json_flat(self):
        jobj_to = {
            'signature': json_util.encode_b64jose(
                self.mixed.signature.signature),
            'payload': json_util.encode_b64jose(b'foo'),
            'header': self.mixed.signature.header,
            'protected': json_util.encode_b64jose(
                self.mixed.signature.protected.encode('utf-8')),
        }
        jobj_from = jobj_to.copy()
        jobj_from['header'] = jobj_from['header'].to_json()

        self.assertEqual(self.mixed.to_partial_json(flat=True), jobj_to)
        from acme.jose.jws import JWS
        self.assertEqual(self.mixed, JWS.from_json(jobj_from))

    def test_json_not_flat(self):
        jobj_to = {
            'signatures': (self.mixed.signature,),
            'payload': json_util.encode_b64jose(b'foo'),
        }
        jobj_from = jobj_to.copy()
        jobj_from['signatures'] = [jobj_to['signatures'][0].to_json()]

        self.assertEqual(self.mixed.to_partial_json(flat=False), jobj_to)
        from acme.jose.jws import JWS
        self.assertEqual(self.mixed, JWS.from_json(jobj_from))

    def test_from_json_mixed_flat(self):
        from acme.jose.jws import JWS
        self.assertRaises(errors.DeserializationError, JWS.from_json,
                          {'signatures': (), 'signature': 'foo'})

    def test_from_json_hashable(self):
        from acme.jose.jws import JWS
        hash(JWS.from_json(self.mixed.to_json()))


class CLITest(unittest.TestCase):

    def setUp(self):
        self.key_path = test_util.vector_path('rsa512_key.pem')

    def test_unverified(self):
        from acme.jose.jws import CLI
        with mock.patch('sys.stdin') as sin:
            sin.read.return_value = '{"payload": "foo", "signature": "xxx"}'
            with mock.patch('sys.stdout'):
                self.assertEqual(-1, CLI.run(['verify']))

    def test_json(self):
        from acme.jose.jws import CLI

        with mock.patch('sys.stdin') as sin:
            sin.read.return_value = 'foo'
            with mock.patch('sys.stdout') as sout:
                CLI.run(['sign', '-k', self.key_path, '-a', 'RS256',
                         '-p', 'jwk'])
                sin.read.return_value = sout.write.mock_calls[0][1][0]
                self.assertEqual(0, CLI.run(['verify']))

    def test_compact(self):
        from acme.jose.jws import CLI

        with mock.patch('sys.stdin') as sin:
            sin.read.return_value = 'foo'
            with mock.patch('sys.stdout') as sout:
                CLI.run(['--compact', 'sign', '-k', self.key_path])
                sin.read.return_value = sout.write.mock_calls[0][1][0]
                self.assertEqual(0, CLI.run([
                    '--compact', 'verify', '--kty', 'RSA',
                    '-k', self.key_path]))


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
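The compact serialization checked in `test_compact_lost_unprotected` above is just three base64url segments joined by dots: protected header, payload, signature. A standalone sketch decoding that exact expected token with only the standard library (the helper name `b64url_decode` is illustrative, not part of acme.jose):

```python
import base64
import json

# Token asserted in test_compact_lost_unprotected above.
compact = (b'eyJhbGciOiAiUlMyNTYifQ.Zm9v.OHdxFVj73l5LpxbFp1AmYX4yJM0Pyb'
           b'_893n1zQjpim_eLS5J1F61lkvrCrCDErTEJnBGOGesJ72M7b6Ve1cAJA')

def b64url_decode(data):
    """Decode base64url data, restoring the padding JWS strips."""
    return base64.urlsafe_b64decode(data + b'=' * (-len(data) % 4))

protected, payload, signature = compact.split(b'.')
print(json.loads(b64url_decode(protected)))  # {'alg': 'RS256'}
print(b64url_decode(payload))                # b'foo'
```

This also shows why the unprotected `jwk` header is "lost": only what is inside the first (protected) segment survives the compact round trip.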
acme/acme/jose/util.py (new file, 226 lines)
@@ -0,0 +1,226 @@
"""JOSE utilities."""
import collections

from cryptography.hazmat.primitives.asymmetric import rsa
import OpenSSL
import six


class abstractclassmethod(classmethod):
    # pylint: disable=invalid-name,too-few-public-methods
    """Descriptor for an abstract classmethod.

    It augments the :mod:`abc` framework with an abstract
    classmethod. This is implemented as :class:`abc.abstractclassmethod`
    in the standard Python library starting with version 3.2.

    This particular implementation, allegedly based on Python 3.3 source
    code, is stolen from
    http://stackoverflow.com/questions/11217878/python-2-7-combine-abc-abstractmethod-and-classmethod.

    """
    __isabstractmethod__ = True

    def __init__(self, target):
        target.__isabstractmethod__ = True
        super(abstractclassmethod, self).__init__(target)


class ComparableX509(object):  # pylint: disable=too-few-public-methods
    """Wrapper for OpenSSL.crypto.X509** objects that supports __eq__.

    :ivar wrapped: Wrapped certificate or certificate request.
    :type wrapped: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.

    """
    def __init__(self, wrapped):
        assert isinstance(wrapped, OpenSSL.crypto.X509) or isinstance(
            wrapped, OpenSSL.crypto.X509Req)
        self.wrapped = wrapped

    def __getattr__(self, name):
        return getattr(self.wrapped, name)

    def _dump(self, filetype=OpenSSL.crypto.FILETYPE_ASN1):
        """Dumps the object into a buffer with the specified encoding.

        :param int filetype: The desired encoding. Should be one of
            `OpenSSL.crypto.FILETYPE_ASN1`,
            `OpenSSL.crypto.FILETYPE_PEM`, or
            `OpenSSL.crypto.FILETYPE_TEXT`.

        :returns: Encoded X509 object.
        :rtype: str

        """
        if isinstance(self.wrapped, OpenSSL.crypto.X509):
            func = OpenSSL.crypto.dump_certificate
        else:  # assert in __init__ makes sure this is X509Req
            func = OpenSSL.crypto.dump_certificate_request
        return func(filetype, self.wrapped)

    def __eq__(self, other):
        if not isinstance(other, self.__class__):
            return NotImplemented
        # pylint: disable=protected-access
        return self._dump() == other._dump()

    def __hash__(self):
        return hash((self.__class__, self._dump()))

    def __ne__(self, other):
        return not self == other

    def __repr__(self):
        return '<{0}({1!r})>'.format(self.__class__.__name__, self.wrapped)


class ComparableKey(object):  # pylint: disable=too-few-public-methods
    """Comparable wrapper for `cryptography` keys.

    See https://github.com/pyca/cryptography/issues/2122.

    """
    __hash__ = NotImplemented

    def __init__(self, wrapped):
        self._wrapped = wrapped

    def __getattr__(self, name):
        return getattr(self._wrapped, name)

    def __eq__(self, other):
        # pylint: disable=protected-access
        if (not isinstance(other, self.__class__) or
                self._wrapped.__class__ is not other._wrapped.__class__):
            return NotImplemented
        elif hasattr(self._wrapped, 'private_numbers'):
            return self.private_numbers() == other.private_numbers()
        elif hasattr(self._wrapped, 'public_numbers'):
            return self.public_numbers() == other.public_numbers()
        else:
            return NotImplemented

    def __ne__(self, other):
        return not self == other

    def __repr__(self):
        return '<{0}({1!r})>'.format(self.__class__.__name__, self._wrapped)

    def public_key(self):
        """Get wrapped public key."""
        return self.__class__(self._wrapped.public_key())


class ComparableRSAKey(ComparableKey):  # pylint: disable=too-few-public-methods
    """Wrapper for `cryptography` RSA keys.

    Wraps around:
    - `cryptography.hazmat.primitives.asymmetric.RSAPrivateKey`
    - `cryptography.hazmat.primitives.asymmetric.RSAPublicKey`

    """

    def __hash__(self):
        # public_numbers() hasn't got stable hash!
        # https://github.com/pyca/cryptography/issues/2143
        if isinstance(self._wrapped, rsa.RSAPrivateKeyWithSerialization):
            priv = self.private_numbers()
            pub = priv.public_numbers
            return hash((self.__class__, priv.p, priv.q, priv.dmp1,
                         priv.dmq1, priv.iqmp, pub.n, pub.e))
        elif isinstance(self._wrapped, rsa.RSAPublicKeyWithSerialization):
            pub = self.public_numbers()
            return hash((self.__class__, pub.n, pub.e))


class ImmutableMap(collections.Mapping, collections.Hashable):
    # pylint: disable=too-few-public-methods
    """Immutable key to value mapping with attribute access."""

    __slots__ = ()
    """Must be overridden in subclasses."""

    def __init__(self, **kwargs):
        if set(kwargs) != set(self.__slots__):
            raise TypeError(
                '__init__() takes exactly the following arguments: {0} '
                '({1} given)'.format(', '.join(self.__slots__),
                                     ', '.join(kwargs) if kwargs else 'none'))
        for slot in self.__slots__:
            object.__setattr__(self, slot, kwargs.pop(slot))

    def update(self, **kwargs):
        """Return updated map."""
        items = dict(self)
        items.update(kwargs)
        return type(self)(**items)  # pylint: disable=star-args

    def __getitem__(self, key):
        try:
            return getattr(self, key)
        except AttributeError:
            raise KeyError(key)

    def __iter__(self):
        return iter(self.__slots__)

    def __len__(self):
        return len(self.__slots__)

    def __hash__(self):
        return hash(tuple(getattr(self, slot) for slot in self.__slots__))

    def __setattr__(self, name, value):
        raise AttributeError("can't set attribute")

    def __repr__(self):
        return '{0}({1})'.format(self.__class__.__name__, ', '.join(
            '{0}={1!r}'.format(key, value)
            for key, value in six.iteritems(self)))


class frozendict(collections.Mapping, collections.Hashable):
    # pylint: disable=invalid-name,too-few-public-methods
    """Frozen dictionary."""
    __slots__ = ('_items', '_keys')

    def __init__(self, *args, **kwargs):
        if kwargs and not args:
            items = dict(kwargs)
        elif len(args) == 1 and isinstance(args[0], collections.Mapping):
            items = args[0]
        else:
            raise TypeError()
        # TODO: support generators/iterators

        object.__setattr__(self, '_items', items)
        object.__setattr__(self, '_keys', tuple(sorted(six.iterkeys(items))))

    def __getitem__(self, key):
        return self._items[key]

    def __iter__(self):
        return iter(self._keys)

    def __len__(self):
        return len(self._items)

    def _sorted_items(self):
        return tuple((key, self[key]) for key in self._keys)

    def __hash__(self):
        return hash(self._sorted_items())

    def __getattr__(self, name):
        try:
            return self._items[name]
        except KeyError:
            raise AttributeError(name)

    def __setattr__(self, name, value):
        raise AttributeError("can't set attribute")

    def __repr__(self):
        return 'frozendict({0})'.format(', '.join('{0}={1!r}'.format(
            key, value) for key, value in self._sorted_items()))
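The `ImmutableMap` in util.py above combines fixed `__slots__`, keyword-only construction, hashability, and a blocked `__setattr__`. A condensed standalone sketch of that pattern (the `Point`/`x`/`y` names are illustrative, not part of acme.jose):

```python
class Point(object):
    """Sketch of the ImmutableMap pattern: fixed __slots__, keyword-only
    construction, value-based equality and hash, no mutation."""
    __slots__ = ('x', 'y')

    def __init__(self, **kwargs):
        if set(kwargs) != set(self.__slots__):
            raise TypeError('takes exactly: {0}'.format(
                ', '.join(self.__slots__)))
        for slot in self.__slots__:
            # object.__setattr__ bypasses our own blocking __setattr__
            object.__setattr__(self, slot, kwargs[slot])

    def update(self, **kwargs):
        """Return a new instance with some slots replaced."""
        items = dict((slot, getattr(self, slot)) for slot in self.__slots__)
        items.update(kwargs)
        return type(self)(**items)

    def __eq__(self, other):
        return (isinstance(other, type(self)) and
                all(getattr(self, s) == getattr(other, s)
                    for s in self.__slots__))

    def __hash__(self):
        # Hash of the slot-value tuple, matching ImmutableMap.__hash__
        return hash(tuple(getattr(self, slot) for slot in self.__slots__))

    def __setattr__(self, name, value):
        raise AttributeError("can't set attribute")

p = Point(x=1, y=2)
print(hash(p) == hash((1, 2)))            # True
print(p.update(x=5) == Point(x=5, y=2))   # True
```

`update` returning a fresh instance instead of mutating in place is what makes these objects safe to use as dict keys and set members.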
acme/acme/jose/util_test.py (new file, 199 lines)
@@ -0,0 +1,199 @@
"""Tests for acme.jose.util."""
import functools
import unittest

import six

from acme import test_util


class ComparableX509Test(unittest.TestCase):
    """Tests for acme.jose.util.ComparableX509."""

    def setUp(self):
        # test_util.load_comparable_{csr,cert} return ComparableX509
        self.req1 = test_util.load_comparable_csr('csr.pem')
        self.req2 = test_util.load_comparable_csr('csr.pem')
        self.req_other = test_util.load_comparable_csr('csr-san.pem')

        self.cert1 = test_util.load_comparable_cert('cert.pem')
        self.cert2 = test_util.load_comparable_cert('cert.pem')
        self.cert_other = test_util.load_comparable_cert('cert-san.pem')

    def test_getattr_proxy(self):
        self.assertTrue(self.cert1.has_expired())

    def test_eq(self):
        self.assertEqual(self.req1, self.req2)
        self.assertEqual(self.cert1, self.cert2)

    def test_ne(self):
        self.assertNotEqual(self.req1, self.req_other)
        self.assertNotEqual(self.cert1, self.cert_other)

    def test_ne_wrong_types(self):
        self.assertNotEqual(self.req1, 5)
        self.assertNotEqual(self.cert1, 5)

    def test_hash(self):
        self.assertEqual(hash(self.req1), hash(self.req2))
        self.assertNotEqual(hash(self.req1), hash(self.req_other))

        self.assertEqual(hash(self.cert1), hash(self.cert2))
        self.assertNotEqual(hash(self.cert1), hash(self.cert_other))

    def test_repr(self):
        for x509 in self.req1, self.cert1:
            self.assertEqual(repr(x509),
                             '<ComparableX509({0!r})>'.format(x509.wrapped))


class ComparableRSAKeyTest(unittest.TestCase):
    """Tests for acme.jose.util.ComparableRSAKey."""

    def setUp(self):
        # test_util.load_rsa_private_key returns ComparableRSAKey
        self.key = test_util.load_rsa_private_key('rsa256_key.pem')
        self.key_same = test_util.load_rsa_private_key('rsa256_key.pem')
        self.key2 = test_util.load_rsa_private_key('rsa512_key.pem')

    def test_getattr_proxy(self):
        self.assertEqual(256, self.key.key_size)

    def test_eq(self):
        self.assertEqual(self.key, self.key_same)

    def test_ne(self):
        self.assertNotEqual(self.key, self.key2)

    def test_ne_different_types(self):
        self.assertNotEqual(self.key, 5)

    def test_ne_not_wrapped(self):
        # pylint: disable=protected-access
        self.assertNotEqual(self.key, self.key_same._wrapped)

    def test_ne_no_serialization(self):
        from acme.jose.util import ComparableRSAKey
        self.assertNotEqual(ComparableRSAKey(5), ComparableRSAKey(5))

    def test_hash(self):
        self.assertTrue(isinstance(hash(self.key), int))
        self.assertEqual(hash(self.key), hash(self.key_same))
        self.assertNotEqual(hash(self.key), hash(self.key2))

    def test_repr(self):
        self.assertTrue(repr(self.key).startswith(
            '<ComparableRSAKey(<cryptography.hazmat.'))

    def test_public_key(self):
        from acme.jose.util import ComparableRSAKey
        self.assertTrue(isinstance(self.key.public_key(), ComparableRSAKey))


class ImmutableMapTest(unittest.TestCase):
    """Tests for acme.jose.util.ImmutableMap."""

    def setUp(self):
        # pylint: disable=invalid-name,too-few-public-methods
        # pylint: disable=missing-docstring
        from acme.jose.util import ImmutableMap

        class A(ImmutableMap):
            __slots__ = ('x', 'y')

        class B(ImmutableMap):
            __slots__ = ('x', 'y')

        self.A = A
        self.B = B

        self.a1 = self.A(x=1, y=2)
        self.a1_swap = self.A(y=2, x=1)
        self.a2 = self.A(x=3, y=4)
        self.b = self.B(x=1, y=2)

    def test_update(self):
        self.assertEqual(self.A(x=2, y=2), self.a1.update(x=2))
        self.assertEqual(self.a2, self.a1.update(x=3, y=4))

    def test_get_missing_item_raises_key_error(self):
        self.assertRaises(KeyError, self.a1.__getitem__, 'z')

    def test_order_of_args_does_not_matter(self):
        self.assertEqual(self.a1, self.a1_swap)

    def test_type_error_on_missing(self):
        self.assertRaises(TypeError, self.A, x=1)
        self.assertRaises(TypeError, self.A, y=2)

    def test_type_error_on_unrecognized(self):
        self.assertRaises(TypeError, self.A, x=1, z=2)
        self.assertRaises(TypeError, self.A, x=1, y=2, z=3)

    def test_get_attr(self):
        self.assertEqual(1, self.a1.x)
        self.assertEqual(2, self.a1.y)
        self.assertEqual(1, self.a1_swap.x)
        self.assertEqual(2, self.a1_swap.y)

    def test_set_attr_raises_attribute_error(self):
        self.assertRaises(
            AttributeError, functools.partial(self.a1.__setattr__, 'x'), 10)

    def test_equal(self):
        self.assertEqual(self.a1, self.a1)
        self.assertEqual(self.a2, self.a2)
        self.assertNotEqual(self.a1, self.a2)

    def test_hash(self):
        self.assertEqual(hash((1, 2)), hash(self.a1))

    def test_unhashable(self):
        self.assertRaises(TypeError, self.A(x=1, y={}).__hash__)

    def test_repr(self):
        self.assertEqual('A(x=1, y=2)', repr(self.a1))
        self.assertEqual('A(x=1, y=2)', repr(self.a1_swap))
        self.assertEqual('B(x=1, y=2)', repr(self.b))
        self.assertEqual("B(x='foo', y='bar')", repr(self.B(x='foo', y='bar')))


class frozendictTest(unittest.TestCase):  # pylint: disable=invalid-name
    """Tests for acme.jose.util.frozendict."""

    def setUp(self):
        from acme.jose.util import frozendict
        self.fdict = frozendict(x=1, y='2')

    def test_init_dict(self):
        from acme.jose.util import frozendict
        self.assertEqual(self.fdict, frozendict({'x': 1, 'y': '2'}))

    def test_init_other_raises_type_error(self):
        from acme.jose.util import frozendict
        # specifically fail for generators...
        self.assertRaises(TypeError, frozendict, six.iteritems({'a': 'b'}))

    def test_len(self):
        self.assertEqual(2, len(self.fdict))

    def test_hash(self):
        self.assertTrue(isinstance(hash(self.fdict), int))

    def test_getattr_proxy(self):
        self.assertEqual(1, self.fdict.x)
        self.assertEqual('2', self.fdict.y)

    def test_getattr_raises_attribute_error(self):
        self.assertRaises(AttributeError, self.fdict.__getattr__, 'z')

    def test_setattr_immutable(self):
        self.assertRaises(AttributeError, self.fdict.__setattr__, 'z', 3)

    def test_repr(self):
        self.assertEqual("frozendict(x=1, y='2')", repr(self.fdict))


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
@@ -1,18 +1,14 @@
"""ACME-specific JWS.

The JWS implementation in josepy only implements the base JOSE standard. In
order to support the new header fields defined in ACME, this module defines some
ACME-specific classes that layer on top of josepy.
"""
import josepy as jose
"""ACME JOSE JWS."""
from acme import jose


class Header(jose.Header):
    """ACME-specific JOSE Header. Implements nonce, kid, and url.
    """ACME JOSE Header.

    .. todo:: Implement ``acmePath``.

    """
    nonce = jose.Field('nonce', omitempty=True, encoder=jose.encode_b64jose)
    kid = jose.Field('kid', omitempty=True)
    url = jose.Field('url', omitempty=True)

    @nonce.decoder
    def nonce(value):  # pylint: disable=missing-docstring,no-self-argument
@@ -24,7 +20,7 @@ class Header(jose.Header):


class Signature(jose.Signature):
    """ACME-specific Signature. Uses ACME-specific Header for custom fields."""
    """ACME Signature."""
    __slots__ = jose.Signature._orig_slots  # pylint: disable=no-member

    # TODO: decoder/encoder should accept cls? Otherwise, subclassing
@@ -38,17 +34,11 @@ class Signature(jose.Signature):


class JWS(jose.JWS):
    """ACME-specific JWS. Includes nonce, url, and kid in protected header."""
    """ACME JWS."""
    signature_cls = Signature
    __slots__ = jose.JWS._orig_slots
    __slots__ = jose.JWS._orig_slots  # pylint: disable=no-member

    @classmethod
    # pylint: disable=arguments-differ
    def sign(cls, payload, key, alg, nonce, url=None, kid=None):
        # Per ACME spec, jwk and kid are mutually exclusive, so only include a
        # jwk field if kid is not provided.
        include_jwk = kid is None
    def sign(cls, payload, key, alg, nonce):  # pylint: disable=arguments-differ
        return super(JWS, cls).sign(payload, key=key, alg=alg,
                                    protect=frozenset(['nonce', 'url', 'kid', 'jwk', 'alg']),
                                    nonce=nonce, url=url, kid=kid,
                                    include_jwk=include_jwk)
                                    protect=frozenset(['nonce']), nonce=nonce)
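In both versions of the `JWS.sign` diff above, the key idea is that the `protect=frozenset([...])` fields end up in the protected header, which is what actually gets signed: the signing input is base64url(protected) + '.' + base64url(payload). A dependency-free sketch of that mechanism (HS256/HMAC stands in for the module's RS256 so no key files are needed; `sign_hs256` and `b64url` are illustrative names, not acme APIs):

```python
import base64
import hashlib
import hmac
import json

def b64url(data):
    """base64url without padding, as JWS uses."""
    return base64.urlsafe_b64encode(data).rstrip(b'=')

def sign_hs256(payload, key, nonce):
    # The protected header carries alg plus the anti-replay nonce,
    # mirroring protect=frozenset(['nonce']) in the diff above.
    protected = json.dumps({'alg': 'HS256', 'nonce': nonce}, sort_keys=True)
    signing_input = b64url(protected.encode('utf-8')) + b'.' + b64url(payload)
    sig = hmac.new(key, signing_input, hashlib.sha256).digest()
    return signing_input + b'.' + b64url(sig)

token = sign_hs256(b'foo', b'secret', 'Tm9uY2U')
print(token.split(b'.')[1])  # b'Zm9v' (base64url of b'foo')
```

Because the nonce sits inside the signed protected header, a server can reject replayed requests simply by checking it after signature verification.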
@@ -1,9 +1,9 @@
"""Tests for acme.jws."""
import unittest

import josepy as jose
from acme import jose
from acme import test_util

import test_util

KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))

@@ -37,30 +37,16 @@ class JWSTest(unittest.TestCase):
        self.privkey = KEY
        self.pubkey = self.privkey.public_key()
        self.nonce = jose.b64encode(b'Nonce')
        self.url = 'hi'
        self.kid = 'baaaaa'

    def test_kid_serialize(self):
    def test_it(self):
        from acme.jws import JWS
        jws = JWS.sign(payload=b'foo', key=self.privkey,
                       alg=jose.RS256, nonce=self.nonce,
                       url=self.url, kid=self.kid)
                       alg=jose.RS256, nonce=self.nonce)
        self.assertEqual(jws.signature.combined.nonce, self.nonce)
        self.assertEqual(jws.signature.combined.url, self.url)
        self.assertEqual(jws.signature.combined.kid, self.kid)
        self.assertEqual(jws.signature.combined.jwk, None)
        # TODO: check that nonce is in protected header

        self.assertEqual(jws, JWS.from_json(jws.to_json()))

    def test_jwk_serialize(self):
        from acme.jws import JWS
        jws = JWS.sign(payload=b'foo', key=self.privkey,
                       alg=jose.RS256, nonce=self.nonce,
                       url=self.url)
        self.assertEqual(jws.signature.combined.kid, None)
        self.assertEqual(jws.signature.combined.jwk, self.pubkey)


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
@@ -1,17 +0,0 @@
"""Shim class to not have to depend on typing module in prod."""
import sys


class TypingClass(object):
    """Ignore import errors by getting anything"""
    def __getattr__(self, name):
        return None

try:
    # mypy doesn't respect modifying sys.modules
    from typing import *  # pylint: disable=wildcard-import, unused-wildcard-import
    # pylint: disable=unused-import
    from typing import Collection, IO  # type: ignore
    # pylint: enable=unused-import
except ImportError:
    sys.modules[__name__] = TypingClass()
@@ -1,73 +1,13 @@
"""ACME protocol messages."""
import json

import josepy as jose
import six
import collections

from acme import challenges
from acme import errors
from acme import fields
from acme import jws
from acme import jose
from acme import util

try:
    from collections.abc import Hashable  # pylint: disable=no-name-in-module
except ImportError:  # pragma: no cover
    from collections import Hashable


OLD_ERROR_PREFIX = "urn:acme:error:"
ERROR_PREFIX = "urn:ietf:params:acme:error:"

ERROR_CODES = {
    'accountDoesNotExist': 'The request specified an account that does not exist',
    'alreadyRevoked': 'The request specified a certificate to be revoked that has' \
                      ' already been revoked',
    'badCSR': 'The CSR is unacceptable (e.g., due to a short key)',
    'badNonce': 'The client sent an unacceptable anti-replay nonce',
    'badPublicKey': 'The JWS was signed by a public key the server does not support',
    'badRevocationReason': 'The revocation reason provided is not allowed by the server',
    'badSignatureAlgorithm': 'The JWS was signed with an algorithm the server does not support',
    'caa': 'Certification Authority Authorization (CAA) records forbid the CA from issuing' \
           ' a certificate',
    'compound': 'Specific error conditions are indicated in the "subproblems" array',
    'connection': ('The server could not connect to the client to verify the'
                   ' domain'),
    'dns': 'There was a problem with a DNS query during identifier validation',
    'dnssec': 'The server could not validate a DNSSEC signed domain',
    'incorrectResponse': 'Response received didn\'t match the challenge\'s requirements',
    # deprecate invalidEmail
    'invalidEmail': 'The provided email for a registration was invalid',
    'invalidContact': 'The provided contact URI was invalid',
    'malformed': 'The request message was malformed',
    'rejectedIdentifier': 'The server will not issue certificates for the identifier',
    'orderNotReady': 'The request attempted to finalize an order that is not ready to be finalized',
    'rateLimited': 'There were too many requests of a given type',
    'serverInternal': 'The server experienced an internal error',
    'tls': 'The server experienced a TLS error during domain verification',
    'unauthorized': 'The client lacks sufficient authorization',
    'unsupportedContact': 'A contact URL for an account used an unsupported protocol scheme',
    'unknownHost': 'The server could not resolve a domain name',
    'unsupportedIdentifier': 'An identifier is of an unsupported type',
    'externalAccountRequired': 'The server requires external account binding',
}

ERROR_TYPE_DESCRIPTIONS = dict(
    (ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items())

ERROR_TYPE_DESCRIPTIONS.update(dict(  # add errors with old prefix, deprecate me
    (OLD_ERROR_PREFIX + name, desc) for name, desc in ERROR_CODES.items()))


def is_acme_error(err):
    """Check if argument is an ACME error."""
    if isinstance(err, Error) and (err.typ is not None):
        return (ERROR_PREFIX in err.typ) or (OLD_ERROR_PREFIX in err.typ)
    return False


@six.python_2_unicode_compatible
class Error(jose.JSONObjectWithFields, errors.Error):
    """ACME error.

@@ -78,24 +18,31 @@ class Error(jose.JSONObjectWithFields, errors.Error):
    :ivar unicode detail:

    """
    ERROR_TYPE_DESCRIPTIONS = dict(
        ('urn:acme:error:' + name, description) for name, description in (
            ('badCSR', 'The CSR is unacceptable (e.g., due to a short key)'),
            ('badNonce', 'The client sent an unacceptable anti-replay nonce'),
            ('connection', 'The server could not connect to the client to '
                           'verify the domain'),
            ('dnssec', 'The server could not validate a DNSSEC signed domain'),
            ('invalidEmail',
             'The provided email for a registration was invalid'),
            ('invalidContact',
             'The provided contact URI was invalid'),
            ('malformed', 'The request message was malformed'),
            ('rateLimited', 'There were too many requests of a given type'),
            ('serverInternal', 'The server experienced an internal error'),
            ('tls', 'The server experienced a TLS error during domain '
                    'verification'),
            ('unauthorized', 'The client lacks sufficient authorization'),
            ('unknownHost', 'The server could not resolve a domain name'),
        )
    )

    typ = jose.Field('type', omitempty=True, default='about:blank')
    title = jose.Field('title', omitempty=True)
    detail = jose.Field('detail', omitempty=True)

    @classmethod
    def with_code(cls, code, **kwargs):
        """Create an Error instance with an ACME Error code.

        :unicode code: An ACME error code, like 'dnssec'.
        :kwargs: kwargs to pass to Error.

        """
        if code not in ERROR_CODES:
            raise ValueError("The supplied code: %s is not a known ACME error"
                             " code" % code)
        typ = ERROR_PREFIX + code
        return cls(typ=typ, **kwargs)

    @property
    def description(self):
        """Hardcoded error description based on its type.

@@ -104,49 +51,33 @@ class Error(jose.JSONObjectWithFields, errors.Error):
        :rtype: unicode

        """
        return ERROR_TYPE_DESCRIPTIONS.get(self.typ)

    @property
    def code(self):
        """ACME error code.

        Basically self.typ without the ERROR_PREFIX.

        :returns: error code if standard ACME code or ``None``.
        :rtype: unicode

        """
        code = str(self.typ).split(':')[-1]
        if code in ERROR_CODES:
            return code
        return None
        return self.ERROR_TYPE_DESCRIPTIONS.get(self.typ)

    def __str__(self):
        return b' :: '.join(
            part.encode('ascii', 'backslashreplace') for part in
        return ' :: '.join(
            part for part in
            (self.typ, self.description, self.detail, self.title)
            if part is not None).decode()
            if part is not None)


class _Constant(jose.JSONDeSerializable, Hashable):  # type: ignore
class _Constant(jose.JSONDeSerializable, collections.Hashable):
    """ACME constant."""
    __slots__ = ('name',)
    POSSIBLE_NAMES = NotImplemented

    def __init__(self, name):
        super(_Constant, self).__init__()
        self.POSSIBLE_NAMES[name] = self  # pylint: disable=unsupported-assignment-operation
        self.POSSIBLE_NAMES[name] = self
        self.name = name
|
||||
def to_partial_json(self):
|
||||
return self.name
|
||||
|
||||
@classmethod
|
||||
def from_json(cls, jobj):
|
||||
if jobj not in cls.POSSIBLE_NAMES: # pylint: disable=unsupported-membership-test
|
||||
def from_json(cls, value):
|
||||
if value not in cls.POSSIBLE_NAMES:
|
||||
raise jose.DeserializationError(
|
||||
'{0} not recognized'.format(cls.__name__))
|
||||
return cls.POSSIBLE_NAMES[jobj]
|
||||
return cls.POSSIBLE_NAMES[value]
|
||||
|
||||
def __repr__(self):
|
||||
return '{0}({1})'.format(self.__class__.__name__, self.name)
|
||||
@@ -163,20 +94,18 @@ class _Constant(jose.JSONDeSerializable, Hashable):  # type: ignore

 class Status(_Constant):
     """ACME "status" field."""
-    POSSIBLE_NAMES = {}  # type: dict
+    POSSIBLE_NAMES = {}
 STATUS_UNKNOWN = Status('unknown')
 STATUS_PENDING = Status('pending')
 STATUS_PROCESSING = Status('processing')
 STATUS_VALID = Status('valid')
 STATUS_INVALID = Status('invalid')
 STATUS_REVOKED = Status('revoked')
-STATUS_READY = Status('ready')
 STATUS_DEACTIVATED = Status('deactivated')


 class IdentifierType(_Constant):
     """ACME identifier type."""
-    POSSIBLE_NAMES = {}  # type: dict
+    POSSIBLE_NAMES = {}
 IDENTIFIER_FQDN = IdentifierType('dns')  # IdentifierDNS in Boulder
@@ -194,34 +123,13 @@ class Identifier(jose.JSONObjectWithFields):
 class Directory(jose.JSONDeSerializable):
     """Directory."""

-    _REGISTERED_TYPES = {}  # type: dict
+    _REGISTERED_TYPES = {}

     class Meta(jose.JSONObjectWithFields):
         """Directory Meta."""
-        _terms_of_service = jose.Field('terms-of-service', omitempty=True)
-        _terms_of_service_v2 = jose.Field('termsOfService', omitempty=True)
+        terms_of_service = jose.Field('terms-of-service', omitempty=True)
         website = jose.Field('website', omitempty=True)
-        caa_identities = jose.Field('caaIdentities', omitempty=True)
-        external_account_required = jose.Field('externalAccountRequired', omitempty=True)
-
-        def __init__(self, **kwargs):
-            kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
-            super(Directory.Meta, self).__init__(**kwargs)
-
-        @property
-        def terms_of_service(self):
-            """URL for the CA TOS"""
-            return self._terms_of_service or self._terms_of_service_v2
-
-        def __iter__(self):
-            # When iterating over fields, use the external name 'terms_of_service' instead of
-            # the internal '_terms_of_service'.
-            for name in super(Directory.Meta, self).__iter__():
-                yield name[1:] if name == '_terms_of_service' else name
-
-        def _internal_name(self, name):
-            return '_' + name if name == 'terms_of_service' else name
+        caa_identities = jose.Field('caa-identities', omitempty=True)

     @classmethod
     def _canon_key(cls, key):
@@ -245,7 +153,7 @@ class Directory(jose.JSONDeSerializable):
         try:
             return self[name.replace('_', '-')]
         except KeyError as error:
-            raise AttributeError(str(error) + ': ' + name)
+            raise AttributeError(str(error))

     def __getitem__(self, name):
         try:
@@ -284,31 +192,17 @@ class ResourceBody(jose.JSONObjectWithFields):
     """ACME Resource Body."""


-class ExternalAccountBinding(object):
-    """ACME External Account Binding"""
-
-    @classmethod
-    def from_data(cls, account_public_key, kid, hmac_key, directory):
-        """Create External Account Binding Resource from contact details, kid and hmac."""
-
-        key_json = json.dumps(account_public_key.to_partial_json()).encode()
-        decoded_hmac_key = jose.b64.b64decode(hmac_key)
-        url = directory["newAccount"]
-
-        eab = jws.JWS.sign(key_json, jose.jwk.JWKOct(key=decoded_hmac_key),
-                           jose.jwa.HS256, None,
-                           url, kid)
-
-        return eab.to_partial_json()
-
-
 class Registration(ResourceBody):
     """Registration Resource Body.

-    :ivar josepy.jwk.JWK key: Public key.
+    :ivar acme.jose.jwk.JWK key: Public key.
     :ivar tuple contact: Contact information following ACME spec,
         `tuple` of `unicode`.
     :ivar unicode agreement:
+    :ivar unicode authorizations: URI where
+        `messages.Registration.Authorizations` can be found.
+    :ivar unicode certificates: URI where
+        `messages.Registration.Certificates` can be found.

     """
     # on new-reg key server ignores 'key' and populates it based on
@@ -316,32 +210,42 @@ class Registration(ResourceBody):
     key = jose.Field('key', omitempty=True, decoder=jose.JWK.from_json)
     contact = jose.Field('contact', omitempty=True, default=())
     agreement = jose.Field('agreement', omitempty=True)
     status = jose.Field('status', omitempty=True)
-    terms_of_service_agreed = jose.Field('termsOfServiceAgreed', omitempty=True)
-    only_return_existing = jose.Field('onlyReturnExisting', omitempty=True)
-    external_account_binding = jose.Field('externalAccountBinding', omitempty=True)
+    authorizations = jose.Field('authorizations', omitempty=True)
+    certificates = jose.Field('certificates', omitempty=True)
+
+    class Authorizations(jose.JSONObjectWithFields):
+        """Authorizations granted to Account in the process of registration.
+
+        :ivar tuple authorizations: URIs to Authorization Resources.
+
+        """
+        authorizations = jose.Field('authorizations')
+
+    class Certificates(jose.JSONObjectWithFields):
+        """Certificates granted to Account in the process of registration.
+
+        :ivar tuple certificates: URIs to Certificate Resources.
+
+        """
+        certificates = jose.Field('certificates')

     phone_prefix = 'tel:'
     email_prefix = 'mailto:'

     @classmethod
-    def from_data(cls, phone=None, email=None, external_account_binding=None, **kwargs):
+    def from_data(cls, phone=None, email=None, **kwargs):
         """Create registration resource from contact details."""
         details = list(kwargs.pop('contact', ()))
         if phone is not None:
             details.append(cls.phone_prefix + phone)
         if email is not None:
-            details.extend([cls.email_prefix + mail for mail in email.split(',')])
+            details.append(cls.email_prefix + email)
         kwargs['contact'] = tuple(details)
-
-        if external_account_binding:
-            kwargs['external_account_binding'] = external_account_binding
-
         return cls(**kwargs)

     def _filter_contact(self, prefix):
         return tuple(
-            detail[len(prefix):] for detail in self.contact  # pylint: disable=not-an-iterable
+            detail[len(prefix):] for detail in self.contact
             if detail.startswith(prefix))

     @property
@@ -372,12 +276,12 @@ class RegistrationResource(ResourceWithURI):
     """Registration Resource.

     :ivar acme.messages.Registration body:
-    :ivar unicode new_authzr_uri: Deprecated. Do not use.
+    :ivar unicode new_authzr_uri: URI found in the 'next' ``Link`` header
     :ivar unicode terms_of_service: URL for the CA TOS.

     """
     body = jose.Field('body', decoder=Registration.from_json)
-    new_authzr_uri = jose.Field('new_authzr_uri', omitempty=True)
+    new_authzr_uri = jose.Field('new_authzr_uri')
     terms_of_service = jose.Field('terms_of_service', omitempty=True)

@@ -399,25 +303,13 @@ class ChallengeBody(ResourceBody):

     """
     __slots__ = ('chall',)
-    # ACMEv1 has a "uri" field in challenges. ACMEv2 has a "url" field. This
-    # challenge object supports either one, but should be accessed through the
-    # name "uri". In Client.answer_challenge, whichever one is set will be
-    # used.
-    _uri = jose.Field('uri', omitempty=True, default=None)
-    _url = jose.Field('url', omitempty=True, default=None)
+    uri = jose.Field('uri')
     status = jose.Field('status', decoder=Status.from_json,
                         omitempty=True, default=STATUS_PENDING)
     validated = fields.RFC3339Field('validated', omitempty=True)
     error = jose.Field('error', decoder=Error.from_json,
                        omitempty=True, default=None)

-    def __init__(self, **kwargs):
-        kwargs = dict((self._internal_name(k), v) for k, v in kwargs.items())
-        super(ChallengeBody, self).__init__(**kwargs)
-
-    def encode(self, name):
-        return super(ChallengeBody, self).encode(self._internal_name(name))
-
     def to_partial_json(self):
         jobj = super(ChallengeBody, self).to_partial_json()
         jobj.update(self.chall.to_partial_json())
@@ -429,23 +321,9 @@ class ChallengeBody(ResourceBody):
         jobj_fields['chall'] = challenges.Challenge.from_json(jobj)
         return jobj_fields

-    @property
-    def uri(self):
-        """The URL of this challenge."""
-        return self._url or self._uri
-
     def __getattr__(self, name):
         return getattr(self.chall, name)

-    def __iter__(self):
-        # When iterating over fields, use the external name 'uri' instead of
-        # the internal '_uri'.
-        for name in super(ChallengeBody, self).__iter__():
-            yield name[1:] if name == '_uri' else name
-
-    def _internal_name(self, name):
-        return '_' + name if name == 'uri' else name
-
 class ChallengeResource(Resource):
     """Challenge Resource.
@@ -458,10 +336,10 @@ class ChallengeResource(Resource):
     authzr_uri = jose.Field('authzr_uri')

     @property
-    def uri(self):
-        """The URL of the challenge body."""
-        # pylint: disable=function-redefined,no-member
-        return self.body.uri
+    def uri(self):  # pylint: disable=missing-docstring,no-self-argument
+        # bug? 'method already defined line None'
+        # pylint: disable=function-redefined
+        return self.body.uri  # pylint: disable=no-member


 class Authorization(ResourceBody):
@@ -475,7 +353,7 @@ class Authorization(ResourceBody):
     :ivar datetime.datetime expires:

     """
-    identifier = jose.Field('identifier', decoder=Identifier.from_json, omitempty=True)
+    identifier = jose.Field('identifier', decoder=Identifier.from_json)
     challenges = jose.Field('challenges', omitempty=True)
     combinations = jose.Field('combinations', omitempty=True)

@@ -485,7 +363,6 @@ class Authorization(ResourceBody):
     # be absent'... then acme-spec gives example with 'expires'
     # present... That's confusing!
     expires = fields.RFC3339Field('expires', omitempty=True)
-    wildcard = jose.Field('wildcard', omitempty=True)

     @challenges.decoder
     def challenges(value):  # pylint: disable=missing-docstring,no-self-argument
@@ -495,7 +372,7 @@ class Authorization(ResourceBody):
     def resolved_combinations(self):
         """Combinations with challenges instead of indices."""
         return tuple(tuple(self.challenges[idx] for idx in combo)
-                     for combo in self.combinations)  # pylint: disable=not-an-iterable
+                     for combo in self.combinations)

 @Directory.register
@@ -505,28 +382,22 @@ class NewAuthorization(Authorization):
     resource = fields.Resource(resource_type)


-class UpdateAuthorization(Authorization):
-    """Update authorization."""
-    resource_type = 'authz'
-    resource = fields.Resource(resource_type)
-
-
 class AuthorizationResource(ResourceWithURI):
     """Authorization Resource.

     :ivar acme.messages.Authorization body:
-    :ivar unicode new_cert_uri: Deprecated. Do not use.
+    :ivar unicode new_cert_uri: URI found in the 'next' ``Link`` header

     """
     body = jose.Field('body', decoder=Authorization.from_json)
-    new_cert_uri = jose.Field('new_cert_uri', omitempty=True)
+    new_cert_uri = jose.Field('new_cert_uri')


 @Directory.register
 class CertificateRequest(jose.JSONObjectWithFields):
     """ACME new-cert request.

-    :ivar josepy.util.ComparableX509 csr:
+    :ivar acme.jose.util.ComparableX509 csr:
         `OpenSSL.crypto.X509Req` wrapped in `.ComparableX509`

     """
@@ -538,7 +409,7 @@ class CertificateRequest(jose.JSONObjectWithFields):
 class CertificateResource(ResourceWithURI):
     """Certificate Resource.

-    :ivar josepy.util.ComparableX509 body:
+    :ivar acme.jose.util.ComparableX509 body:
         `OpenSSL.crypto.X509` wrapped in `.ComparableX509`
     :ivar unicode cert_chain_uri: URI found in the 'up' ``Link`` header
     :ivar tuple authzrs: `tuple` of `AuthorizationResource`.
@@ -560,50 +431,3 @@ class Revocation(jose.JSONObjectWithFields):
     resource = fields.Resource(resource_type)
     certificate = jose.Field(
         'certificate', decoder=jose.decode_cert, encoder=jose.encode_cert)
     reason = jose.Field('reason')
-
-
-class Order(ResourceBody):
-    """Order Resource Body.
-
-    :ivar list of .Identifier: List of identifiers for the certificate.
-    :ivar acme.messages.Status status:
-    :ivar list of str authorizations: URLs of authorizations.
-    :ivar str certificate: URL to download certificate as a fullchain PEM.
-    :ivar str finalize: URL to POST to to request issuance once all
-        authorizations have "valid" status.
-    :ivar datetime.datetime expires: When the order expires.
-    :ivar .Error error: Any error that occurred during finalization, if applicable.
-    """
-    identifiers = jose.Field('identifiers', omitempty=True)
-    status = jose.Field('status', decoder=Status.from_json,
-                        omitempty=True)
-    authorizations = jose.Field('authorizations', omitempty=True)
-    certificate = jose.Field('certificate', omitempty=True)
-    finalize = jose.Field('finalize', omitempty=True)
-    expires = fields.RFC3339Field('expires', omitempty=True)
-    error = jose.Field('error', omitempty=True, decoder=Error.from_json)
-
-    @identifiers.decoder
-    def identifiers(value):  # pylint: disable=missing-docstring,no-self-argument
-        return tuple(Identifier.from_json(identifier) for identifier in value)
-
-
-class OrderResource(ResourceWithURI):
-    """Order Resource.
-
-    :ivar acme.messages.Order body:
-    :ivar str csr_pem: The CSR this Order will be finalized with.
-    :ivar list of acme.messages.AuthorizationResource authorizations:
-        Fully-fetched AuthorizationResource objects.
-    :ivar str fullchain_pem: The fetched contents of the certificate URL
-        produced once the order was finalized, if it's present.
-    """
-    body = jose.Field('body', decoder=Order.from_json)
-    csr_pem = jose.Field('csr_pem', omitempty=True)
-    authorizations = jose.Field('authorizations')
-    fullchain_pem = jose.Field('fullchain_pem', omitempty=True)
-
-
-@Directory.register
-class NewOrder(Order):
-    """New order."""
-    resource_type = 'new-order'
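For orientation, the prefix handling that `with_code`, `description`, and `is_acme_error` implement in the diff above can be sketched standalone. This is a minimal re-implementation for illustration, not the `acme` package itself, and `ERROR_CODES` is truncated to a single hypothetical entry:

```python
# Minimal standalone sketch of the ACME error-prefix logic shown above.
# ERROR_CODES here holds only one entry for illustration.
ERROR_PREFIX = 'urn:ietf:params:acme:error:'
OLD_ERROR_PREFIX = 'urn:acme:error:'
ERROR_CODES = {'malformed': 'The request message was malformed'}

# Descriptions are registered under both the current and the legacy prefix,
# mirroring the module-level ERROR_TYPE_DESCRIPTIONS.update(...) above.
ERROR_TYPE_DESCRIPTIONS = dict(
    (prefix + name, desc)
    for prefix in (ERROR_PREFIX, OLD_ERROR_PREFIX)
    for name, desc in ERROR_CODES.items())


def with_code(code):
    """Build a full error type URN from a bare ACME error code."""
    if code not in ERROR_CODES:
        raise ValueError(
            "The supplied code: %s is not a known ACME error code" % code)
    return ERROR_PREFIX + code


def description(typ):
    """Hardcoded description for a type URN, or None if unknown."""
    return ERROR_TYPE_DESCRIPTIONS.get(typ)


def is_acme_error(typ):
    """True if the type URN carries either the new or the legacy prefix."""
    return ERROR_PREFIX in typ or OLD_ERROR_PREFIX in typ
```

Looking up a legacy-prefixed type and a current-prefixed type then yields the same description, which is exactly why `Error.code` can strip whichever prefix is present.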
@@ -1,12 +1,12 @@
 """Tests for acme.messages."""
 import unittest

-import josepy as jose
 import mock

 from acme import challenges
-from acme.magic_typing import Dict  # pylint: disable=unused-import, no-name-in-module
-import test_util
+from acme import jose
+from acme import test_util


 CERT = test_util.load_comparable_cert('cert.der')
 CSR = test_util.load_comparable_csr('csr.der')
@@ -17,15 +17,16 @@ class ErrorTest(unittest.TestCase):
     """Tests for acme.messages.Error."""

     def setUp(self):
-        from acme.messages import Error, ERROR_PREFIX
-        self.error = Error.with_code('malformed', detail='foo', title='title')
+        from acme.messages import Error
+        self.error = Error(
+            detail='foo', typ='urn:acme:error:malformed', title='title')
         self.jobj = {
             'detail': 'foo',
             'title': 'some title',
-            'type': ERROR_PREFIX + 'malformed',
+            'type': 'urn:acme:error:malformed',
         }
         self.error_custom = Error(typ='custom', detail='bar')
+        self.empty_error = Error()
         self.jobj_cusom = {'type': 'custom', 'detail': 'bar'}

     def test_default_typ(self):
         from acme.messages import Error
@@ -40,39 +41,15 @@ class ErrorTest(unittest.TestCase):
         hash(Error.from_json(self.error.to_json()))

     def test_description(self):
-        self.assertEqual('The request message was malformed', self.error.description)
+        self.assertEqual(
+            'The request message was malformed', self.error.description)
         self.assertTrue(self.error_custom.description is None)

-    def test_code(self):
-        from acme.messages import Error
-        self.assertEqual('malformed', self.error.code)
-        self.assertEqual(None, self.error_custom.code)
-        self.assertEqual(None, Error().code)
-
     def test_is_acme_error(self):
-        from acme.messages import is_acme_error, Error
+        from acme.messages import is_acme_error
         self.assertTrue(is_acme_error(self.error))
         self.assertFalse(is_acme_error(self.error_custom))
-        self.assertFalse(is_acme_error(Error()))
+        self.assertFalse(is_acme_error(self.empty_error))
         self.assertFalse(is_acme_error("must pet all the {dogs|rabbits}"))

-    def test_unicode_error(self):
-        from acme.messages import Error, is_acme_error
-        arabic_error = Error.with_code(
-            'malformed', detail=u'\u0639\u062f\u0627\u0644\u0629', title='title')
-        self.assertTrue(is_acme_error(arabic_error))
-
-    def test_with_code(self):
-        from acme.messages import Error, is_acme_error
-        self.assertTrue(is_acme_error(Error.with_code('badCSR')))
-        self.assertRaises(ValueError, Error.with_code, 'not an ACME error code')
-
     def test_str(self):
         self.assertEqual(
-            str(self.error),
-            u"{0.typ} :: {0.description} :: {0.detail} :: {0.title}"
-            .format(self.error))
+            'urn:acme:error:malformed :: The request message was '
+            'malformed :: foo :: title', str(self.error))
         self.assertEqual('custom :: bar', str(self.error_custom))


 class ConstantTest(unittest.TestCase):
@@ -82,7 +59,7 @@ class ConstantTest(unittest.TestCase):
         from acme.messages import _Constant

         class MockConstant(_Constant):  # pylint: disable=missing-docstring
-            POSSIBLE_NAMES = {}  # type: Dict
+            POSSIBLE_NAMES = {}

         self.MockConstant = MockConstant  # pylint: disable=invalid-name
         self.const_a = MockConstant('a')
@@ -154,7 +131,7 @@ class DirectoryTest(unittest.TestCase):
             'meta': {
                 'terms-of-service': 'https://example.com/acme/terms',
                 'website': 'https://www.example.com/',
-                'caaIdentities': ['example.com'],
+                'caa-identities': ['example.com'],
             },
         })

@@ -162,31 +139,6 @@ class DirectoryTest(unittest.TestCase):
         from acme.messages import Directory
         Directory.from_json({'foo': 'bar'})

-    def test_iter_meta(self):
-        result = False
-        for k in self.dir.meta:
-            if k == 'terms_of_service':
-                result = self.dir.meta[k] == 'https://example.com/acme/terms'
-        self.assertTrue(result)
-
-
-class ExternalAccountBindingTest(unittest.TestCase):
-    def setUp(self):
-        from acme.messages import Directory
-        self.key = jose.jwk.JWKRSA(key=KEY.public_key())
-        self.kid = "kid-for-testing"
-        self.hmac_key = "hmac-key-for-testing"
-        self.dir = Directory({
-            'newAccount': 'http://url/acme/new-account',
-        })
-
-    def test_from_data(self):
-        from acme.messages import ExternalAccountBinding
-        eab = ExternalAccountBinding.from_data(self.key, self.kid, self.hmac_key, self.dir)
-
-        self.assertEqual(len(eab), 3)
-        self.assertEqual(sorted(eab.keys()), sorted(['protected', 'payload', 'signature']))
-

 class RegistrationTest(unittest.TestCase):
     """Tests for acme.messages.Registration."""
@@ -201,7 +153,8 @@ class RegistrationTest(unittest.TestCase):

         from acme.messages import Registration
         self.reg = Registration(key=key, contact=contact, agreement=agreement)
-        self.reg_none = Registration()
+        self.reg_none = Registration(authorizations='uri/authorizations',
+                                     certificates='uri/certificates')

         self.jobj_to = {
             'contact': contact,
@@ -219,22 +172,6 @@ class RegistrationTest(unittest.TestCase):
             'mailto:admin@foo.com',
         ))

-    def test_new_registration_from_data_with_eab(self):
-        from acme.messages import NewRegistration, ExternalAccountBinding, Directory
-        key = jose.jwk.JWKRSA(key=KEY.public_key())
-        kid = "kid-for-testing"
-        hmac_key = "hmac-key-for-testing"
-        directory = Directory({
-            'newAccount': 'http://url/acme/new-account',
-        })
-        eab = ExternalAccountBinding.from_data(key, kid, hmac_key, directory)
-        reg = NewRegistration.from_data(email='admin@foo.com', external_account_binding=eab)
-        self.assertEqual(reg.contact, (
-            'mailto:admin@foo.com',
-        ))
-        self.assertEqual(sorted(reg.external_account_binding.keys()),
-                         sorted(['protected', 'payload', 'signature']))
-
     def test_phones(self):
         self.assertEqual(('1234',), self.reg.phones)

@@ -271,12 +208,14 @@ class RegistrationResourceTest(unittest.TestCase):
         from acme.messages import RegistrationResource
         self.regr = RegistrationResource(
             body=mock.sentinel.body, uri=mock.sentinel.uri,
+            new_authzr_uri=mock.sentinel.new_authzr_uri,
             terms_of_service=mock.sentinel.terms_of_service)

     def test_to_partial_json(self):
         self.assertEqual(self.regr.to_json(), {
             'body': mock.sentinel.body,
             'uri': mock.sentinel.uri,
+            'new_authzr_uri': mock.sentinel.new_authzr_uri,
             'terms_of_service': mock.sentinel.terms_of_service,
         })

@@ -301,7 +240,8 @@ class ChallengeBodyTest(unittest.TestCase):
         from acme.messages import Error
         from acme.messages import STATUS_INVALID
         self.status = STATUS_INVALID
-        error = Error.with_code('serverInternal', detail='Unable to communicate with DNS server')
+        error = Error(typ='urn:acme:error:serverInternal',
+                      detail='Unable to communicate with DNS server')
         self.challb = ChallengeBody(
             uri='http://challb', chall=self.chall, status=self.status,
             error=error)
@@ -316,13 +256,10 @@ class ChallengeBodyTest(unittest.TestCase):
         self.jobj_from = self.jobj_to.copy()
         self.jobj_from['status'] = 'invalid'
         self.jobj_from['error'] = {
-            'type': 'urn:ietf:params:acme:error:serverInternal',
+            'type': 'urn:acme:error:serverInternal',
             'detail': 'Unable to communicate with DNS server',
         }

     def test_encode(self):
         self.assertEqual(self.challb.encode('uri'), self.challb.uri)

     def test_to_partial_json(self):
         self.assertEqual(self.jobj_to, self.challb.to_partial_json())

@@ -392,7 +329,9 @@ class AuthorizationResourceTest(unittest.TestCase):
         from acme.messages import AuthorizationResource
         authzr = AuthorizationResource(
             uri=mock.sentinel.uri,
-            body=mock.sentinel.body)
+            body=mock.sentinel.body,
+            new_cert_uri=mock.sentinel.new_cert_uri,
+        )
         self.assertTrue(isinstance(authzr, jose.JSONDeSerializable))


@@ -438,34 +377,5 @@ class RevocationTest(unittest.TestCase):
         hash(Revocation.from_json(self.rev.to_json()))


-class OrderResourceTest(unittest.TestCase):
-    """Tests for acme.messages.OrderResource."""
-
-    def setUp(self):
-        from acme.messages import OrderResource
-        self.regr = OrderResource(
-            body=mock.sentinel.body, uri=mock.sentinel.uri)
-
-    def test_to_partial_json(self):
-        self.assertEqual(self.regr.to_json(), {
-            'body': mock.sentinel.body,
-            'uri': mock.sentinel.uri,
-            'authorizations': None,
-        })
-
-
-class NewOrderTest(unittest.TestCase):
-    """Tests for acme.messages.NewOrder."""
-
-    def setUp(self):
-        from acme.messages import NewOrder
-        self.reg = NewOrder(
-            identifiers=mock.sentinel.identifiers)
-
-    def test_to_partial_json(self):
-        self.assertEqual(self.reg.to_json(), {
-            'identifiers': mock.sentinel.identifiers,
-        })
-

 if __name__ == '__main__':
     unittest.main()  # pragma: no cover
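The internal-name aliasing that `Directory.Meta` and `ChallengeBody` use above (and that the removed `test_iter_meta` exercises) can be sketched in plain Python. This is a hypothetical minimal `Meta` class standing in for the jose-backed one, purely to show the renaming trick:

```python
# Hypothetical minimal sketch of the field-aliasing pattern above: a private
# '_terms_of_service' slot is exposed under the public name 'terms_of_service'
# both when constructing and when iterating over field names.
class Meta:
    def __init__(self, **kwargs):
        # Map public kwarg names to their internal storage names.
        self._fields = {self._internal_name(k): v for k, v in kwargs.items()}

    @staticmethod
    def _internal_name(name):
        # Only 'terms_of_service' gets a private alias; everything else
        # is stored under its own name.
        return '_' + name if name == 'terms_of_service' else name

    def __iter__(self):
        # When iterating over fields, yield the external name
        # 'terms_of_service' instead of the internal '_terms_of_service'.
        for name in self._fields:
            yield name[1:] if name == '_terms_of_service' else name

    def __getitem__(self, name):
        return self._fields[self._internal_name(name)]
```

Callers never see the underscore-prefixed name, so the class can later grow a second internal field (as `Directory.Meta` does with `_terms_of_service_v2`) without changing its public interface.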
@@ -1,37 +1,35 @@
 """Support for standalone client challenge solvers. """
 import argparse
 import collections
 import functools
 import logging
 import socket
-import threading
+import os
 import sys

-from six.moves import BaseHTTPServer  # type: ignore # pylint: disable=import-error
+from six.moves import BaseHTTPServer  # pylint: disable=import-error
 from six.moves import http_client  # pylint: disable=import-error
-from six.moves import socketserver  # type: ignore # pylint: disable=import-error
+from six.moves import socketserver  # pylint: disable=import-error

 import OpenSSL

 from acme import challenges
 from acme import crypto_util
-from acme.magic_typing import List  # pylint: disable=unused-import, no-name-in-module


 logger = logging.getLogger(__name__)

 # six.moves.* | pylint: disable=no-member,attribute-defined-outside-init
-# pylint: disable=no-init
+# pylint: disable=too-few-public-methods,no-init


 class TLSServer(socketserver.TCPServer):
     """Generic TLS Server."""

     def __init__(self, *args, **kwargs):
-        self.ipv6 = kwargs.pop("ipv6", False)
-        if self.ipv6:
-            self.address_family = socket.AF_INET6
-        else:
-            self.address_family = socket.AF_INET
         self.certs = kwargs.pop("certs", {})
         self.method = kwargs.pop(
             # pylint: disable=protected-access
-            "method", crypto_util._DEFAULT_SSL_METHOD)
+            "method", crypto_util._DEFAULT_TLSSNI01_SSL_METHOD)
         self.allow_reuse_address = kwargs.pop("allow_reuse_address", True)
         socketserver.TCPServer.__init__(self, *args, **kwargs)

@@ -44,113 +42,41 @@ class TLSServer(socketserver.TCPServer):
         return socketserver.TCPServer.server_bind(self)


-class ACMEServerMixin:
+class ACMEServerMixin:  # pylint: disable=old-style-class
     """ACME server common settings mixin."""
     # TODO: c.f. #858
     server_version = "ACME client standalone challenge solver"
     allow_reuse_address = True


-class BaseDualNetworkedServers(object):
-    """Base class for a pair of IPv6 and IPv4 servers that tries to do everything
-    it's asked for both servers, but where failures in one server don't
-    affect the other.
-
-    If two servers are instantiated, they will serve on the same port.
-    """
-
-    def __init__(self, ServerClass, server_address, *remaining_args, **kwargs):
-        port = server_address[1]
-        self.threads = []  # type: List[threading.Thread]
-        self.servers = []  # type: List[ACMEServerMixin]
-
-        # Must try True first.
-        # Ubuntu, for example, will fail to bind to IPv4 if we've already bound
-        # to IPv6. But that's ok, since it will accept IPv4 connections on the IPv6
-        # socket. On the other hand, FreeBSD will successfully bind to IPv4 on the
-        # same port, which means that server will accept the IPv4 connections.
-        # If Python is compiled without IPv6, we'll error out but (probably) successfully
-        # create the IPv4 server.
-        for ip_version in [True, False]:
-            try:
-                kwargs["ipv6"] = ip_version
-                new_address = (server_address[0],) + (port,) + server_address[2:]
-                new_args = (new_address,) + remaining_args
-                server = ServerClass(*new_args, **kwargs)
-                logger.debug(
-                    "Successfully bound to %s:%s using %s", new_address[0],
-                    new_address[1], "IPv6" if ip_version else "IPv4")
-            except socket.error:
-                if self.servers:
-                    # Already bound using IPv6.
-                    logger.debug(
-                        "Certbot wasn't able to bind to %s:%s using %s, this "
-                        "is often expected due to the dual stack nature of "
-                        "IPv6 socket implementations.",
-                        new_address[0], new_address[1],
-                        "IPv6" if ip_version else "IPv4")
-                else:
-                    logger.debug(
-                        "Failed to bind to %s:%s using %s", new_address[0],
-                        new_address[1], "IPv6" if ip_version else "IPv4")
-            else:
-                self.servers.append(server)
-                # If two servers are set up and port 0 was passed in, ensure we always
-                # bind to the same port for both servers.
-                port = server.socket.getsockname()[1]
-        if not self.servers:
-            raise socket.error("Could not bind to IPv4 or IPv6.")
-
-    def serve_forever(self):
-        """Wraps socketserver.TCPServer.serve_forever"""
-        for server in self.servers:
-            thread = threading.Thread(
-                target=server.serve_forever)
-            thread.start()
-            self.threads.append(thread)
-
-    def getsocknames(self):
-        """Wraps socketserver.TCPServer.socket.getsockname"""
-        return [server.socket.getsockname() for server in self.servers]
-
-    def shutdown_and_server_close(self):
-        """Wraps socketserver.TCPServer.shutdown, socketserver.TCPServer.server_close, and
-        threading.Thread.join"""
-        for server in self.servers:
-            server.shutdown()
-            server.server_close()
-        for thread in self.threads:
-            thread.join()
-        self.threads = []
+class TLSSNI01Server(TLSServer, ACMEServerMixin):
+    """TLSSNI01 Server."""
+
+    def __init__(self, server_address, certs):
+        TLSServer.__init__(
+            self, server_address, BaseRequestHandlerWithLogging, certs=certs)


-class HTTPServer(BaseHTTPServer.HTTPServer):
-    """Generic HTTP Server."""
-
-    def __init__(self, *args, **kwargs):
-        self.ipv6 = kwargs.pop("ipv6", False)
-        if self.ipv6:
-            self.address_family = socket.AF_INET6
-        else:
-            self.address_family = socket.AF_INET
-        BaseHTTPServer.HTTPServer.__init__(self, *args, **kwargs)
+class BaseRequestHandlerWithLogging(socketserver.BaseRequestHandler):
+    """BaseRequestHandler with logging."""
+
+    def log_message(self, format, *args):  # pylint: disable=redefined-builtin
+        """Log arbitrary message."""
+        logger.debug("%s - - %s", self.client_address[0], format % args)
+
+    def handle(self):
+        """Handle request."""
+        self.log_message("Incoming request")
+        socketserver.BaseRequestHandler.handle(self)


-class HTTP01Server(HTTPServer, ACMEServerMixin):
+class HTTP01Server(BaseHTTPServer.HTTPServer, ACMEServerMixin):
     """HTTP01 Server."""

-    def __init__(self, server_address, resources, ipv6=False):
-        HTTPServer.__init__(
+    def __init__(self, server_address, resources):
+        BaseHTTPServer.HTTPServer.__init__(
             self, server_address, HTTP01RequestHandler.partial_init(
-                simple_http_resources=resources), ipv6=ipv6)
-
-
-class HTTP01DualNetworkedServers(BaseDualNetworkedServers):
-    """HTTP01Server Wrapper. Tries everything for both. Failures for one don't
-    affect the other."""
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
BaseDualNetworkedServers.__init__(self, HTTP01Server, *args, **kwargs)
|
||||
simple_http_resources=resources))
|
||||
|
||||
|
||||
class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
@@ -167,7 +93,7 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
|
||||
def __init__(self, *args, **kwargs):
|
||||
self.simple_http_resources = kwargs.pop("simple_http_resources", set())
|
||||
BaseHTTPServer.BaseHTTPRequestHandler.__init__(self, *args, **kwargs)
|
||||
socketserver.BaseRequestHandler.__init__(self, *args, **kwargs)
|
||||
|
||||
def log_message(self, format, *args): # pylint: disable=redefined-builtin
|
||||
"""Log arbitrary message."""
|
||||
@@ -226,3 +152,39 @@ class HTTP01RequestHandler(BaseHTTPServer.BaseHTTPRequestHandler):
|
||||
"""
|
||||
return functools.partial(
|
||||
cls, simple_http_resources=simple_http_resources)
|
||||
|
||||
|
||||
def simple_tls_sni_01_server(cli_args, forever=True):
|
||||
"""Run simple standalone TLSSNI01 server."""
|
||||
logging.basicConfig(level=logging.DEBUG)
|
||||
|
||||
parser = argparse.ArgumentParser()
|
||||
parser.add_argument(
|
||||
"-p", "--port", default=0, help="Port to serve at. By default "
|
||||
"picks random free port.")
|
||||
args = parser.parse_args(cli_args[1:])
|
||||
|
||||
certs = {}
|
||||
|
||||
_, hosts, _ = next(os.walk('.'))
|
||||
for host in hosts:
|
||||
with open(os.path.join(host, "cert.pem")) as cert_file:
|
||||
cert_contents = cert_file.read()
|
||||
with open(os.path.join(host, "key.pem")) as key_file:
|
||||
key_contents = key_file.read()
|
||||
certs[host.encode()] = (
|
||||
OpenSSL.crypto.load_privatekey(
|
||||
OpenSSL.crypto.FILETYPE_PEM, key_contents),
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, cert_contents))
|
||||
|
||||
server = TLSSNI01Server(('', int(args.port)), certs=certs)
|
||||
logger.info("Serving at https://%s:%s...", *server.socket.getsockname()[:2])
|
||||
if forever: # pragma: no cover
|
||||
server.serve_forever()
|
||||
else:
|
||||
server.handle_request()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(simple_tls_sni_01_server(sys.argv)) # pragma: no cover
|
||||
|
||||
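The dual-family bind loop above — try IPv6 first, then IPv4, pinning the second attempt to whatever port the first bind got — can be sketched standalone with plain `socketserver`. This is a minimal, hypothetical illustration: the class names below are made up for the sketch and are not part of the acme package.

```python
import socket
import socketserver


class _NullHandler(socketserver.BaseRequestHandler):
    def handle(self):
        pass  # accept the connection and drop it


class _TCP6Server(socketserver.TCPServer):
    address_family = socket.AF_INET6  # IPv6 first, as in the loop above


servers = []
port = 0  # 0 = let the OS pick a free port on the first successful bind
for server_class in (_TCP6Server, socketserver.TCPServer):
    try:
        server = server_class(("", port), _NullHandler)
    except OSError:
        # Expected on dual-stack hosts: the IPv6 socket already accepts
        # IPv4 connections, so the second bind may fail harmlessly.
        continue
    servers.append(server)
    # Pin the second bind attempt to whatever port the first one received.
    port = server.socket.getsockname()[1]

bound_ports = {s.socket.getsockname()[1] for s in servers}
for s in servers:
    s.server_close()
```

Whether one or two servers end up bound, they all share a single port, which is what the `port = server.socket.getsockname()[1]` line in the real loop guarantees.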
acme/acme/standalone_test.py (new file, 155 lines)
@@ -0,0 +1,155 @@
"""Tests for acme.standalone."""
import os
import shutil
import threading
import tempfile
import time
import unittest

from six.moves import http_client  # pylint: disable=import-error
from six.moves import socketserver  # pylint: disable=import-error

import requests

from acme import challenges
from acme import crypto_util
from acme import errors
from acme import jose
from acme import test_util


class TLSServerTest(unittest.TestCase):
    """Tests for acme.standalone.TLSServer."""

    def test_bind(self):  # pylint: disable=no-self-use
        from acme.standalone import TLSServer
        server = TLSServer(
            ('', 0), socketserver.BaseRequestHandler, bind_and_activate=True)
        server.server_close()  # pylint: disable=no-member


class TLSSNI01ServerTest(unittest.TestCase):
    """Test for acme.standalone.TLSSNI01Server."""

    def setUp(self):
        self.certs = {b'localhost': (
            test_util.load_pyopenssl_private_key('rsa512_key.pem'),
            test_util.load_cert('cert.pem'),
        )}
        from acme.standalone import TLSSNI01Server
        self.server = TLSSNI01Server(("", 0), certs=self.certs)
        # pylint: disable=no-member
        self.thread = threading.Thread(target=self.server.serve_forever)
        self.thread.start()

    def tearDown(self):
        self.server.shutdown()  # pylint: disable=no-member
        self.thread.join()

    def test_it(self):
        host, port = self.server.socket.getsockname()[:2]
        cert = crypto_util.probe_sni(
            b'localhost', host=host, port=port, timeout=1)
        self.assertEqual(jose.ComparableX509(cert),
                         jose.ComparableX509(self.certs[b'localhost'][1]))


class HTTP01ServerTest(unittest.TestCase):
    """Tests for acme.standalone.HTTP01Server."""

    def setUp(self):
        self.account_key = jose.JWK.load(
            test_util.load_vector('rsa1024_key.pem'))
        self.resources = set()

        from acme.standalone import HTTP01Server
        self.server = HTTP01Server(('', 0), resources=self.resources)

        # pylint: disable=no-member
        self.port = self.server.socket.getsockname()[1]
        self.thread = threading.Thread(target=self.server.serve_forever)
        self.thread.start()

    def tearDown(self):
        self.server.shutdown()  # pylint: disable=no-member
        self.thread.join()

    def test_index(self):
        response = requests.get(
            'http://localhost:{0}'.format(self.port), verify=False)
        self.assertEqual(
            response.text, 'ACME client standalone challenge solver')
        self.assertTrue(response.ok)

    def test_404(self):
        response = requests.get(
            'http://localhost:{0}/foo'.format(self.port), verify=False)
        self.assertEqual(response.status_code, http_client.NOT_FOUND)

    def _test_http01(self, add):
        chall = challenges.HTTP01(token=(b'x' * 16))
        response, validation = chall.response_and_validation(self.account_key)

        from acme.standalone import HTTP01RequestHandler
        resource = HTTP01RequestHandler.HTTP01Resource(
            chall=chall, response=response, validation=validation)
        if add:
            self.resources.add(resource)
        return resource.response.simple_verify(
            resource.chall, 'localhost', self.account_key.public_key(),
            port=self.port)

    def test_http01_found(self):
        self.assertTrue(self._test_http01(add=True))

    def test_http01_not_found(self):
        self.assertFalse(self._test_http01(add=False))


class TestSimpleTLSSNI01Server(unittest.TestCase):
    """Tests for acme.standalone.simple_tls_sni_01_server."""

    def setUp(self):
        # mirror ../examples/standalone
        self.test_cwd = tempfile.mkdtemp()
        localhost_dir = os.path.join(self.test_cwd, 'localhost')
        os.makedirs(localhost_dir)
        shutil.copy(test_util.vector_path('cert.pem'), localhost_dir)
        shutil.copy(test_util.vector_path('rsa512_key.pem'),
                    os.path.join(localhost_dir, 'key.pem'))

        from acme.standalone import simple_tls_sni_01_server
        self.port = 1234
        self.thread = threading.Thread(
            target=simple_tls_sni_01_server, kwargs={
                'cli_args': ('xxx', '--port', str(self.port)),
                'forever': False,
            },
        )
        self.old_cwd = os.getcwd()
        os.chdir(self.test_cwd)
        self.thread.start()

    def tearDown(self):
        os.chdir(self.old_cwd)
        self.thread.join()
        shutil.rmtree(self.test_cwd)

    def test_it(self):
        max_attempts = 5
        while max_attempts:
            max_attempts -= 1
            try:
                cert = crypto_util.probe_sni(
                    b'localhost', b'0.0.0.0', self.port)
            except errors.Error:
                self.assertTrue(max_attempts > 0, "Timeout!")
                time.sleep(1)  # wait until thread starts
            else:
                self.assertEqual(jose.ComparableX509(cert),
                                 test_util.load_comparable_cert('cert.pem'))
                break


if __name__ == "__main__":
    unittest.main()  # pragma: no cover
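The bounded retry in `TestSimpleTLSSNI01Server.test_it` — probe the server, back off, try again until the background thread is actually accepting connections — is a generally useful pattern when testing threaded servers. A standalone sketch with a plain TCP server (the handler and variable names here are illustrative, not from acme):

```python
import socket
import socketserver
import threading
import time


class _AcceptOnlyHandler(socketserver.BaseRequestHandler):
    def handle(self):
        pass  # accept the connection and close it


server = socketserver.TCPServer(("127.0.0.1", 0), _AcceptOnlyHandler)
port = server.socket.getsockname()[1]
thread = threading.Thread(target=server.serve_forever)
thread.start()

# Bounded retry, in the spirit of test_it: probe, sleep, try again.
connected = False
attempts = 5
while attempts and not connected:
    attempts -= 1
    try:
        with socket.create_connection(("127.0.0.1", port), timeout=1):
            connected = True
    except OSError:
        time.sleep(0.2)  # wait until the server thread starts

server.shutdown()
server.server_close()
thread.join()
```

Bounding the attempts keeps a broken server from hanging the test suite, while the sleep-and-retry absorbs the startup race between the main thread and the server thread.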
acme/acme/test_util.py (new file, 113 lines)
@@ -0,0 +1,113 @@
"""Test utilities.

.. warning:: This module is not part of the public API.

"""
import os
import pkg_resources
import unittest

from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import serialization
import OpenSSL

from acme import errors
from acme import jose
from acme import util


def vector_path(*names):
    """Path to a test vector."""
    return pkg_resources.resource_filename(
        __name__, os.path.join('testdata', *names))


def load_vector(*names):
    """Load contents of a test vector."""
    # luckily, resource_string opens file in binary mode
    return pkg_resources.resource_string(
        __name__, os.path.join('testdata', *names))


def _guess_loader(filename, loader_pem, loader_der):
    _, ext = os.path.splitext(filename)
    if ext.lower() == '.pem':
        return loader_pem
    elif ext.lower() == '.der':
        return loader_der
    else:  # pragma: no cover
        raise ValueError("Loader could not be recognized based on extension")


def load_cert(*names):
    """Load certificate."""
    loader = _guess_loader(
        names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
    return OpenSSL.crypto.load_certificate(loader, load_vector(*names))


def load_comparable_cert(*names):
    """Load ComparableX509 cert."""
    return jose.ComparableX509(load_cert(*names))


def load_csr(*names):
    """Load certificate request."""
    loader = _guess_loader(
        names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
    return OpenSSL.crypto.load_certificate_request(loader, load_vector(*names))


def load_comparable_csr(*names):
    """Load ComparableX509 certificate request."""
    return jose.ComparableX509(load_csr(*names))


def load_rsa_private_key(*names):
    """Load RSA private key."""
    loader = _guess_loader(names[-1], serialization.load_pem_private_key,
                           serialization.load_der_private_key)
    return jose.ComparableRSAKey(loader(
        load_vector(*names), password=None, backend=default_backend()))


def load_pyopenssl_private_key(*names):
    """Load pyOpenSSL private key."""
    loader = _guess_loader(
        names[-1], OpenSSL.crypto.FILETYPE_PEM, OpenSSL.crypto.FILETYPE_ASN1)
    return OpenSSL.crypto.load_privatekey(loader, load_vector(*names))


def requirement_available(requirement):
    """Checks if requirement can be imported.

    :rtype: bool
    :returns: ``True`` iff requirement can be imported

    """
    try:
        util.activate(requirement)
    except errors.DependencyError:  # pragma: no cover
        return False
    return True  # pragma: no cover


def skip_unless(condition, reason):  # pragma: no cover
    """Skip tests unless a condition holds.

    This implements the basic functionality of unittest.skipUnless
    which is only available on Python 2.7+.

    :param bool condition: If ``False``, the test will be skipped
    :param str reason: the reason for skipping the test

    :rtype: callable
    :returns: decorator that hides tests unless condition is ``True``

    """
    if hasattr(unittest, "skipUnless"):
        return unittest.skipUnless(condition, reason)
    elif condition:
        return lambda cls: cls
    else:
        return lambda cls: None
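The extension-based dispatch in `_guess_loader`, which every `load_*` helper above routes through, can be shown in isolation. In this sketch the loader values are placeholder strings rather than the real `OpenSSL.crypto.FILETYPE_*` constants:

```python
import os


def guess_loader(filename, loader_pem, loader_der):
    """Pick a loader based on file extension (mirrors _guess_loader above)."""
    _, ext = os.path.splitext(filename)
    if ext.lower() == '.pem':
        return loader_pem
    elif ext.lower() == '.der':
        return loader_der
    raise ValueError("Loader could not be recognized based on extension")


# Dispatch is case-insensitive on the extension only; the rest of the
# filename is ignored.
pem_loader = guess_loader('cert.pem', 'PEM_LOADER', 'DER_LOADER')
der_loader = guess_loader('CERT.DER', 'PEM_LOADER', 'DER_LOADER')
```

Keeping the dispatch in one helper means each `load_*` function only names its two candidate loaders and never repeats the extension logic.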
acme/acme/testdata/rsa2048_key.pem (vendored, new file, 27 lines)
@@ -0,0 +1,27 @@
-----BEGIN RSA PRIVATE KEY-----
MIIEowIBAAKCAQEA8HwZMHeImB/iM8/n8CTCR4KeYQB2gLGO3v8xLms+PWH3Zbxc
dVtEn25Y34scIh+iOuEXBcSBalBddLHKBGVN3nCfmpupoLm52xgRG44q9OWODpg4
FSi4afqVw2agMx0RHi0v3GVcdpqB83UW42kK1ESZHUuq7mxLg8u3IMYZFm6Amsf+
YQjBbDNn8NczJOFhsExP2EdM5ykgM1Om8aqTqqPMgPub68/r4Sym+BjLnvRq5Qtz
h/jCfOBIIpAwg3lj7l8OyE3kkD3ALtuiuminNUqLHEkUaLq/Xiv8V8mvnrhG7h3Q
+L1Xc707P0dz5YM5XxTMhmUE1cae/lQ0KbNrpwIDAQABAoIBAAiDXCDrGlrIRimv
YnaN1pLRfOnSKl/D6VrbjdIm2b0yip9/W4aMBJHgRiUjt4s9s3CCJ1585lftIGHR
KWWecHM/aWb/u7GE4Z9v6qsfDUY+GhlKKjIVjvGxfTu9lk446TI4R0l2DR/luFP2
ASlrvoZlJ0ZyN0rZapLv0zvFx32Tukd+3rcMmXfHl7aRGMZG1YTKNmBJ4d9iJ6cP
HG3fgSzLQMPLNO/20MzbXdREG5FNQtwaMuFnIcVbtMCvc/71lQQEfANMLCUweEed
YWGOjgDeh+731nJsopel+2TSTgnf5VhcFrgChZZdqeKvP+HbXjTE2VkWo7BrzoM7
xICYBwECgYEA/ZF/JOjZfwIMUdikv8vldJzRMdFryl4NJWnh4NeThNOEpUcpxnyK
wyMnnQaGJa51u9EEnzl0sZ2h2ODjD6KFpz6fkWaVRq5SWalVPAoKZGaoPZV3IUOI
8Tm0xkXho+A/FUUEcxCLME+3V9EdPfHaVRJOrbfDyxvNhsj4w9F0aAkCgYEA8sp7
XTrolOknJGv4Qt1w6gcm5+cMtLaRfi8ZHPHujl2x9eWE8/s2818az7jc0Xr/G4HQ
NeU+3Es4BblEckSHmhUZhx26cZgkLSIIDofEtaEc6u8CyWfxsWvn3l4T3kMdeSLC
9UoLk59AH2tkMIh8vzV8LSisLJa341lMdgryQi8CgYAlJKr7PSCe+i3Tz2hSsAts
iYwbQBIKErzaPihYRzvUuSc1DreP26535y5mUg5UdrnISVXj/Qaa/fw3SLn6EFSD
qyi0o9I6CE8H00YpBU+AZYk/fCV3Oe1VaJ6SbKog1zhmZTXBpSq+aO7ybi9aY5MX
4xajW8fSeMAifk3yYTwsAQKBgErcEcOCOVpItU/uloKPYpRWFjHktK83p46fmP+q
vOJak1d9KExOBfhuN4caucNBSE1D7l3fzE0CSEjDgg41gRYKMW/Ow8DopybfWlqY
lBdokNEDVvmgug35dmnC2h9q1DiYdkJJTV57+Lp3U1H/k28lX59Q7h1lb1eDHic7
YszzAoGBAOx05dhOiYbzAJSTQu3oBHFn4mTYIqCcDO6cQrEJwPKAq7mAhT0yOk9N
CrqRV/1aes665829cyTwcAZl6nqbzHv5XjX5+g6vmooCb4oCkq49rumHjoQdrX8D
RR5b+Spkc1jo4rctCcExzSkgo+K5N3oBVYznecje7O7Z0/qiJE/8
-----END RSA PRIVATE KEY-----
@@ -1,7 +1,25 @@
"""ACME utilities."""
import pkg_resources
import six

from acme import errors


def map_keys(dikt, func):
    """Map dictionary keys."""
    return dict((func(key), value) for key, value in six.iteritems(dikt))


def activate(requirement):
    """Make requirement importable.

    :param str requirement: the distribution and version to activate

    :raises acme.errors.DependencyError: if cannot activate requirement

    """
    try:
        for distro in pkg_resources.require(requirement):  # pylint: disable=not-callable
            distro.activate()
    except (pkg_resources.DistributionNotFound, pkg_resources.VersionConflict):
        raise errors.DependencyError('{0} is unavailable'.format(requirement))
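`map_keys` simply rebuilds a dict through a key function; on Python 3 the six-based version above reduces to a dict comprehension. A minimal sketch (the comprehension form is an equivalent rewrite, not the code shipped in acme):

```python
def map_keys(dikt, func):
    """Map dictionary keys through func, keeping the values."""
    return {func(key): value for key, value in dikt.items()}


upper = map_keys({'a': 1, 'b': 2}, str.upper)
shifted = map_keys({1: 2, 3: 4}, lambda x: x + 1)  # the case from MapKeysTest
```

If `func` maps two keys to the same value, the later one wins, as with any dict construction.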
@@ -1,6 +1,8 @@
"""Tests for acme.util."""
import unittest

from acme import errors


class MapKeysTest(unittest.TestCase):
    """Tests for acme.util.map_keys."""
@@ -12,5 +14,21 @@ class MapKeysTest(unittest.TestCase):
        self.assertEqual({2: 2, 4: 4}, map_keys({1: 2, 3: 4}, lambda x: x + 1))


class ActivateTest(unittest.TestCase):
    """Tests for acme.util.activate."""

    @classmethod
    def _call(cls, *args, **kwargs):
        from acme.util import activate
        return activate(*args, **kwargs)

    def test_failure(self):
        self.assertRaises(errors.DependencyError, self._call, 'acme>99.0.0')

    def test_success(self):
        self._call('acme')
        import acme as unused_acme


if __name__ == '__main__':
    unittest.main()  # pragma: no cover
@@ -1,7 +1,10 @@
JOSE
----

The ``acme.jose`` module was moved to its own package "josepy_".
Please refer to its documentation there.
.. automodule:: acme.jose
   :members:

.. _josepy: https://josepy.readthedocs.io/
.. toctree::
   :glob:

   jose/*
acme/docs/api/jose/base64.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
JOSE Base64
-----------

.. automodule:: acme.jose.b64
   :members:

acme/docs/api/jose/errors.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
Errors
------

.. automodule:: acme.jose.errors
   :members:

acme/docs/api/jose/interfaces.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
Interfaces
----------

.. automodule:: acme.jose.interfaces
   :members:

acme/docs/api/jose/json_util.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
JSON utilities
--------------

.. automodule:: acme.jose.json_util
   :members:

acme/docs/api/jose/jwa.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
JSON Web Algorithms
-------------------

.. automodule:: acme.jose.jwa
   :members:

acme/docs/api/jose/jwk.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
JSON Web Key
------------

.. automodule:: acme.jose.jwk
   :members:

acme/docs/api/jose/jws.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
JSON Web Signature
------------------

.. automodule:: acme.jose.jws
   :members:

acme/docs/api/jose/util.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
Utilities
---------

.. automodule:: acme.jose.util
   :members:

acme/docs/api/other.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
Other ACME objects
------------------

.. automodule:: acme.other
   :members:
@@ -12,9 +12,10 @@
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os
import shlex
import sys


here = os.path.abspath(os.path.dirname(__file__))

@@ -38,6 +39,7 @@ extensions = [
    'sphinx.ext.todo',
    'sphinx.ext.coverage',
    'sphinx.ext.viewcode',
    'sphinxcontrib.programoutput',
]

autodoc_member_order = 'bysource'

@@ -307,5 +309,4 @@ texinfo_documents = [

intersphinx_mapping = {
    'python': ('https://docs.python.org/', None),
    'josepy': ('https://josepy.readthedocs.io/en/latest/', None),
}
@@ -16,6 +16,13 @@ Contents:
.. automodule:: acme
   :members:


Example client:

.. include:: ../examples/example_client.py
   :code: python


Indices and tables
==================
Some files were not shown because too many files have changed in this diff.