Compare commits

201 commits: mutable-va...require_is (d3a4a3c23a ... bf5475fa74)
@@ -1,8 +1,8 @@
# Configuring Azure Pipelines with Certbot

Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:
* `.azure-pipelines/main.yml` is the main one, executed on PRs for master, and pushes to master,
* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for master.
* `.azure-pipelines/main.yml` is the main one, executed on PRs for main, and pushes to main,
* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for main.

Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common jobs configuration that can be reused in several pipelines.

@@ -64,7 +64,7 @@ Azure Pipeline needs RW on code, RO on metadata, RW on checks, commit statuses,
RW access here is required to allow update of the pipelines YAML files from Azure DevOps interface, and to
update the status of builds and PRs on GitHub side when Azure Pipelines are triggered.
Note however that no admin access is defined here: this means that Azure Pipelines cannot do anything with
protected branches, like master, and cannot modify the security context around this on GitHub.
protected branches, like main, and cannot modify the security context around this on GitHub.
Access can be defined for all or only selected repositories, which is nice.
```

@@ -91,11 +91,11 @@ grant permissions from Azure Pipelines to GitHub in order to setup a GitHub OAut
then are way too large (admin level on almost everything), while the classic approach does not add any more
permissions, and works perfectly well.__

- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, master in default branch.
- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, main in default branch.
- Click on YAML option for "Select a template"
- Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
  "YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
- Click "Save & queue", choose the master branch to build the first pipeline, and click "Save and run" button.
- Click "Save & queue", choose the main branch to build the first pipeline, and click "Save and run" button.

_Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
@@ -1,9 +1,9 @@
|
||||
# We run the test suite on commits to master so codecov gets coverage data
|
||||
# about the master branch and can use it to track coverage changes.
|
||||
# We run the test suite on commits to main so codecov gets coverage data
|
||||
# about the main branch and can use it to track coverage changes.
|
||||
trigger:
|
||||
- master
|
||||
- main
|
||||
pr:
|
||||
- master
|
||||
- main
|
||||
- '*.x'
|
||||
|
||||
variables:
|
||||
|
||||
@@ -1,4 +1,4 @@
|
||||
# Nightly pipeline running each day for master.
|
||||
# Nightly pipeline running each day for main.
|
||||
trigger: none
|
||||
pr: none
|
||||
schedules:
|
||||
@@ -6,7 +6,7 @@ schedules:
|
||||
displayName: Nightly build
|
||||
branches:
|
||||
include:
|
||||
- master
|
||||
- main
|
||||
always: true
|
||||
|
||||
variables:
|
||||
@@ -15,5 +15,5 @@ variables:
|
||||
|
||||
stages:
|
||||
- template: templates/stages/test-and-package-stage.yml
|
||||
- template: templates/stages/deploy-stage.yml
|
||||
- template: templates/stages/nightly-deploy-stage.yml
|
||||
- template: templates/stages/notify-failure-stage.yml
|
||||
|
||||
@@ -13,7 +13,5 @@ variables:
|
||||
stages:
|
||||
- template: templates/stages/test-and-package-stage.yml
|
||||
- template: templates/stages/changelog-stage.yml
|
||||
- template: templates/stages/deploy-stage.yml
|
||||
parameters:
|
||||
snapReleaseChannel: beta
|
||||
- template: templates/stages/release-deploy-stage.yml
|
||||
- template: templates/stages/notify-failure-stage.yml
|
||||
|
||||
@@ -72,3 +72,57 @@ jobs:
|
||||
tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
|
||||
done
|
||||
displayName: Publish to Snap store
|
||||
# The credentials used in the following jobs are for the shared
|
||||
# certbotbot account on Docker Hub. The credentials are stored
|
||||
# in a service account which was created by following the
|
||||
# instructions at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
|
||||
# The name given to this service account must match the value
|
||||
# given to containerRegistry below. The authentication used when
|
||||
# creating this service account was a personal access token
|
||||
# rather than a password to bypass 2FA. When Brad set this up,
|
||||
# Azure Pipelines failed to verify the credentials with an error
|
||||
# like "access is forbidden with a JWT issued from a personal
|
||||
# access token", but after saving them without verification, the
|
||||
# access token worked when the pipeline actually ran. "Grant
|
||||
# access to all pipelines" should also be checked on the service
|
||||
# account. The access token can be deleted on Docker Hub if
|
||||
# these credentials need to be revoked.
|
||||
- job: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
strategy:
|
||||
matrix:
|
||||
arm32v6:
|
||||
DOCKER_ARCH: arm32v6
|
||||
arm64v8:
|
||||
DOCKER_ARCH: arm64v8
|
||||
amd64:
|
||||
DOCKER_ARCH: amd64
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: docker_$(DOCKER_ARCH)
|
||||
path: $(Build.SourcesDirectory)
|
||||
displayName: Retrieve Docker images
|
||||
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
|
||||
displayName: Load Docker images
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_images.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Deploy the Docker images by architecture
|
||||
- job: publish_docker_multiarch
|
||||
dependsOn: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
steps:
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_manifests.sh $(dockerTag) all
|
||||
displayName: Deploy the Docker multiarch manifests
|
||||
@@ -4,7 +4,7 @@ jobs:
|
||||
- name: IMAGE_NAME
|
||||
value: ubuntu-22.04
|
||||
- name: PYTHON_VERSION
|
||||
value: 3.11
|
||||
value: 3.12
|
||||
- group: certbot-common
|
||||
strategy:
|
||||
matrix:
|
||||
@@ -14,43 +14,39 @@ jobs:
|
||||
linux-py310:
|
||||
PYTHON_VERSION: 3.10
|
||||
TOXENV: py310
|
||||
linux-py311:
|
||||
PYTHON_VERSION: 3.11
|
||||
TOXENV: py311
|
||||
linux-isolated:
|
||||
TOXENV: 'isolated-{acme,certbot,apache,cloudflare,digitalocean,dnsimple,dnsmadeeasy,gehirn,google,linode,luadns,nsone,ovh,rfc2136,route53,sakuracloud,nginx}'
|
||||
linux-boulder-v2-integration-certbot-oldest:
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: 'isolated-acme,isolated-certbot,isolated-apache,isolated-cloudflare,isolated-digitalocean,isolated-dnsimple,isolated-dnsmadeeasy,isolated-gehirn,isolated-google,isolated-linode,isolated-luadns,isolated-nsone,isolated-ovh,isolated-rfc2136,isolated-route53,isolated-sakuracloud,isolated-nginx'
|
||||
linux-integration-certbot-oldest:
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration-certbot-oldest
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v2-integration-nginx-oldest:
|
||||
PYTHON_VERSION: 3.8
|
||||
linux-integration-nginx-oldest:
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration-nginx-oldest
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v2-py38-integration:
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v2-py39-integration:
|
||||
linux-py39-integration:
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v2-py310-integration:
|
||||
linux-py310-integration:
|
||||
PYTHON_VERSION: 3.10
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
linux-boulder-v2-py311-integration:
|
||||
linux-py311-integration:
|
||||
PYTHON_VERSION: 3.11
|
||||
TOXENV: integration
|
||||
ACME_SERVER: boulder-v2
|
||||
# python 3.12 integration tests are not run here because they're run as
|
||||
# part of the standard test suite
|
||||
nginx-compat:
|
||||
TOXENV: nginx_compat
|
||||
linux-integration-rfc2136:
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
PYTHON_VERSION: 3.8
|
||||
PYTHON_VERSION: 3.12
|
||||
TOXENV: integration-dns-rfc2136
|
||||
le-modification:
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
TOXENV: modification
|
||||
farmtest-apache2:
|
||||
PYTHON_VERSION: 3.8
|
||||
PYTHON_VERSION: 3.12
|
||||
TOXENV: test-farm-apache2
|
||||
pool:
|
||||
vmImage: $(IMAGE_NAME)
|
||||
|
||||
@@ -55,65 +55,6 @@ jobs:
|
||||
- bash: |
|
||||
set -e && tools/docker/test.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Run integration tests for Docker images
|
||||
- job: installer_build
|
||||
pool:
|
||||
vmImage: windows-2019
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.9
|
||||
architecture: x64
|
||||
addToPath: true
|
||||
- script: |
|
||||
python -m venv venv
|
||||
venv\Scripts\python tools\pip_install.py -e windows-installer
|
||||
displayName: Prepare Windows installer build environment
|
||||
- script: |
|
||||
venv\Scripts\construct-windows-installer
|
||||
displayName: Build Certbot installer
|
||||
- task: CopyFiles@2
|
||||
inputs:
|
||||
sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
|
||||
contents: '*.exe'
|
||||
targetFolder: $(Build.ArtifactStagingDirectory)
|
||||
- task: PublishPipelineArtifact@1
|
||||
inputs:
|
||||
path: $(Build.ArtifactStagingDirectory)
|
||||
# If we change the artifact's name, it should also be changed in tools/create_github_release.py
|
||||
artifact: windows-installer
|
||||
displayName: Publish Windows installer
|
||||
- job: installer_run
|
||||
dependsOn: installer_build
|
||||
strategy:
|
||||
matrix:
|
||||
win2019:
|
||||
imageName: windows-2019
|
||||
pool:
|
||||
vmImage: $(imageName)
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.9
|
||||
addToPath: true
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: windows-installer
|
||||
path: $(Build.SourcesDirectory)/bin
|
||||
displayName: Retrieve Windows installer
|
||||
- script: |
|
||||
python -m venv venv
|
||||
venv\Scripts\python tools\pip_install.py -e certbot-ci
|
||||
env:
|
||||
PIP_NO_BUILD_ISOLATION: no
|
||||
displayName: Prepare Certbot-CI
|
||||
- script: |
|
||||
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
|
||||
venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win_amd64.exe
|
||||
displayName: Run windows installer integration tests
|
||||
- script: |
|
||||
set PATH=%ProgramFiles%\Certbot\bin;%PATH%
|
||||
venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
|
||||
displayName: Run certbot integration tests
|
||||
- job: snaps_build
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
@@ -135,7 +76,7 @@ jobs:
|
||||
displayName: Install dependencies
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
versionSpec: 3.12
|
||||
addToPath: true
|
||||
- task: DownloadSecureFile@1
|
||||
name: credentials
|
||||
@@ -166,7 +107,7 @@ jobs:
|
||||
steps:
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
versionSpec: 3.12
|
||||
addToPath: true
|
||||
- script: |
|
||||
set -e
|
||||
@@ -186,7 +127,7 @@ jobs:
|
||||
displayName: Install Certbot snap
|
||||
- script: |
|
||||
set -e
|
||||
venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
|
||||
venv/bin/python -m tox run -e integration-external,apacheconftest-external-with-pebble
|
||||
displayName: Run tox
|
||||
- job: snap_dns_run
|
||||
dependsOn: snaps_build
|
||||
@@ -200,7 +141,7 @@ jobs:
|
||||
displayName: Install dependencies
|
||||
- task: UsePythonVersion@0
|
||||
inputs:
|
||||
versionSpec: 3.8
|
||||
versionSpec: 3.12
|
||||
addToPath: true
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
|
||||
@@ -1,42 +1,26 @@
|
||||
jobs:
|
||||
- job: test
|
||||
variables:
|
||||
PYTHON_VERSION: 3.11
|
||||
PYTHON_VERSION: 3.12
|
||||
strategy:
|
||||
matrix:
|
||||
macos-py38-cover:
|
||||
IMAGE_NAME: macOS-12
|
||||
PYTHON_VERSION: 3.8
|
||||
macos-cover:
|
||||
IMAGE_NAME: macOS-15
|
||||
TOXENV: cover
|
||||
# As of pip 23.1.0, builds started failing on macOS unless this flag was set.
|
||||
# See https://github.com/certbot/certbot/pull/9717#issuecomment-1610861794.
|
||||
PIP_USE_PEP517: "true"
|
||||
macos-cover:
|
||||
IMAGE_NAME: macOS-12
|
||||
TOXENV: cover
|
||||
# See explanation under macos-py38-cover.
|
||||
PIP_USE_PEP517: "true"
|
||||
windows-py38:
|
||||
IMAGE_NAME: windows-2019
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: py-win
|
||||
windows-py39-cover:
|
||||
IMAGE_NAME: windows-2019
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: cover-win
|
||||
windows-integration-certbot:
|
||||
IMAGE_NAME: windows-2019
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: integration-certbot
|
||||
linux-oldest:
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
PYTHON_VERSION: 3.8
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: oldest
|
||||
linux-py38:
|
||||
linux-py39:
|
||||
# linux unit tests with the oldest python we support
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: py38
|
||||
PYTHON_VERSION: 3.9
|
||||
TOXENV: py39
|
||||
linux-cover:
|
||||
# linux unit+cover tests with the newest python we support
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
TOXENV: cover
|
||||
linux-lint:
|
||||
@@ -47,9 +31,7 @@ jobs:
|
||||
TOXENV: mypy
|
||||
linux-integration:
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
PYTHON_VERSION: 3.8
|
||||
TOXENV: integration
|
||||
ACME_SERVER: pebble
|
||||
apache-compat:
|
||||
IMAGE_NAME: ubuntu-22.04
|
||||
TOXENV: apache_compat
|
||||
@@ -65,6 +47,6 @@ jobs:
|
||||
- template: ../steps/tox-steps.yml
|
||||
- job: test_sphinx_builds
|
||||
pool:
|
||||
vmImage: ubuntu-20.04
|
||||
vmImage: ubuntu-22.04
|
||||
steps:
|
||||
- template: ../steps/sphinx-steps.yml
|
||||
|
||||
@@ -1,67 +0,0 @@
|
||||
parameters:
|
||||
# We do not define acceptable values for this parameter here as it is passed
|
||||
# through to ../jobs/snap-deploy-job.yml which does its own sanity checking.
|
||||
- name: snapReleaseChannel
|
||||
type: string
|
||||
default: edge
|
||||
|
||||
stages:
|
||||
- stage: Deploy
|
||||
jobs:
|
||||
- template: ../jobs/snap-deploy-job.yml
|
||||
parameters:
|
||||
snapReleaseChannel: ${{ parameters.snapReleaseChannel }}
|
||||
# The credentials used in the following jobs are for the shared
|
||||
# certbotbot account on Docker Hub. The credentials are stored
|
||||
# in a service account which was created by following the
|
||||
# instructions at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
|
||||
# The name given to this service account must match the value
|
||||
# given to containerRegistry below. The authentication used when
|
||||
# creating this service account was a personal access token
|
||||
# rather than a password to bypass 2FA. When Brad set this up,
|
||||
# Azure Pipelines failed to verify the credentials with an error
|
||||
# like "access is forbidden with a JWT issued from a personal
|
||||
# access token", but after saving them without verification, the
|
||||
# access token worked when the pipeline actually ran. "Grant
|
||||
# access to all pipelines" should also be checked on the service
|
||||
# account. The access token can be deleted on Docker Hub if
|
||||
# these credentials need to be revoked.
|
||||
- job: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
strategy:
|
||||
matrix:
|
||||
arm32v6:
|
||||
DOCKER_ARCH: arm32v6
|
||||
arm64v8:
|
||||
DOCKER_ARCH: arm64v8
|
||||
amd64:
|
||||
DOCKER_ARCH: amd64
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: docker_$(DOCKER_ARCH)
|
||||
path: $(Build.SourcesDirectory)
|
||||
displayName: Retrieve Docker images
|
||||
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
|
||||
displayName: Load Docker images
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_images.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Deploy the Docker images by architecture
|
||||
- job: publish_docker_multiarch
|
||||
dependsOn: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
steps:
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_manifests.sh $(dockerTag) all
|
||||
displayName: Deploy the Docker multiarch manifests
|
||||
@@ -0,0 +1,6 @@
|
||||
stages:
|
||||
- stage: Deploy
|
||||
jobs:
|
||||
- template: ../jobs/common-deploy-jobs.yml
|
||||
parameters:
|
||||
snapReleaseChannel: edge
|
||||
.azure-pipelines/templates/stages/release-deploy-stage.yml (new file, 38 lines)
@@ -0,0 +1,38 @@
|
||||
stages:
|
||||
- stage: Deploy
|
||||
jobs:
|
||||
- template: ../jobs/common-deploy-jobs.yml
|
||||
parameters:
|
||||
snapReleaseChannel: beta
|
||||
- job: create_github_release
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: changelog
|
||||
path: '$(Pipeline.Workspace)'
|
||||
- task: GitHubRelease@1
|
||||
inputs:
|
||||
# this "github-releases" credential is what azure pipelines calls a
|
||||
# "service connection". it was created using the instructions at
|
||||
# https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#github-service-connection
|
||||
# with a fine-grained personal access token from github to limit
|
||||
# the permissions given to azure pipelines. the connection on azure
|
||||
# needs permissions for the "release" pipeline (and maybe the
|
||||
# "full-test-suite" pipeline to simplify testing it). information
|
||||
# on how to set up these permissions can be found at
|
||||
# https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#secure-a-service-connection.
|
||||
# the github token that is used needs "contents:write" and
|
||||
# "workflows:write" permissions for the certbot repo
|
||||
#
|
||||
# as of writing this, the current token will expire on 3/15/2025.
|
||||
# when recreating it, you may also want to create it using the
|
||||
# shared "certbotbot" github account so the credentials aren't tied
|
||||
# to any one dev's github account and their access to the certbot
|
||||
# repo
|
||||
gitHubConnection: github-releases
|
||||
title: ${{ format('Certbot {0}', replace(variables['Build.SourceBranchName'], 'v', '')) }}
|
||||
releaseNotesFilePath: '$(Pipeline.Workspace)/release_notes.md'
|
||||
assets: '$(Build.SourcesDirectory)/packages/{*.tar.gz,SHA256SUMS*}'
|
||||
addChangeLog: false
|
||||
@@ -44,7 +44,7 @@ steps:
|
||||
export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
|
||||
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
|
||||
env
|
||||
python3 -m tox
|
||||
python3 -m tox run
|
||||
env:
|
||||
AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
|
||||
AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
|
||||
|
||||
.github/ISSUE_TEMPLATE/config.yml (new file, 1 line)
@@ -0,0 +1 @@
|
||||
blank_issues_enabled: false
|
||||
.github/pull_request_template.md (2 changed lines)
@@ -1,6 +1,6 @@
|
||||
## Pull Request Checklist
|
||||
|
||||
- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due our limited resources and need to prioritize how we spend them.
|
||||
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
|
||||
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `main` section of `certbot/CHANGELOG.md` to include a description of the change being made.
|
||||
- [ ] Add or update any documentation as needed to support the changes in this PR.
|
||||
- [ ] Include your name in `AUTHORS.md` if you like.
|
||||
|
||||
.github/workflows/merged.yaml (25 changed lines)
@@ -8,25 +8,14 @@ on:
|
||||
jobs:
|
||||
if_merged:
|
||||
# Forked repos can not access Mattermost secret.
|
||||
if: github.event.pull_request.merged == true && !github.event.pull_request.head.repo.fork
|
||||
if: github.event.pull_request.merged == true && !github.event.pull_request.head.repo.fork
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Create Mattermost Message
|
||||
# https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions#example-of-a-script-injection-attack
|
||||
env:
|
||||
NUMBER: ${{ github.event.number }}
|
||||
PR_URL: https://github.com/${{ github.repository }}/pull/${{ github.event.number }}
|
||||
REPO: ${{ github.repository }}
|
||||
USER: ${{ github.actor }}
|
||||
TITLE: ${{ github.event.pull_request.title }}
|
||||
run: |
|
||||
jq --null-input \
|
||||
--arg number "$NUMBER" \
|
||||
--arg pr_url "$PR_URL" \
|
||||
--arg repo "$REPO" \
|
||||
--arg user "$USER" \
|
||||
--arg title "$TITLE" \
|
||||
'{ "text": "[\($repo)] | [\($title) #\($number)](\($pr_url)) was merged into master by \($user)" }' > mattermost.json
|
||||
- uses: mattermost/action-mattermost-notify@master
|
||||
env:
|
||||
with:
|
||||
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_MERGE_WEBHOOK }}
|
||||
TEXT: >
|
||||
[${{ github.repository }}] |
|
||||
[${{ github.event.pull_request.title }}
|
||||
#${{ github.event.number }}](https://github.com/${{ github.repository }}/pull/${{ github.event.number }})
|
||||
was merged into main by ${{ github.actor }}
|
||||
|
||||
.github/workflows/notify_weekly.yaml (24 changed lines)
@@ -2,8 +2,9 @@ name: Weekly Github Update
|
||||
|
||||
on:
|
||||
schedule:
|
||||
# Every week on Thursday @ 13:00
|
||||
- cron: "0 13 * * 4"
|
||||
# Every week on Thursday @ 10:00
|
||||
- cron: "0 10 * * 4"
|
||||
workflow_dispatch:
|
||||
jobs:
|
||||
send-mattermost-message:
|
||||
runs-on: ubuntu-latest
|
||||
@@ -11,15 +12,14 @@ jobs:
|
||||
steps:
|
||||
- name: Create Mattermost Message
|
||||
run: |
|
||||
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
|
||||
MERGED_URL="https://github.com/pulls?q=merged%3A%3E${DATE}+org%3Acertbot"
|
||||
UPDATED_URL="https://github.com/pulls?q=updated%3A%3E${DATE}+org%3Acertbot"
|
||||
echo "{\"text\":\"## Updates Across Certbot Repos\n\n
|
||||
- Certbot team members SHOULD look at: [link]($MERGED_URL)\n\n
|
||||
- Certbot team members MAY also want to look at: [link]($UPDATED_URL)\n\n
|
||||
- Want to Discuss something today? Place it [here](https://docs.google.com/document/d/17YMUbtC1yg6MfiTMwT8zVm9LmO-cuGVBom0qFn8XJBM/edit?usp=sharing) and we can meet today on Zoom.\n\n
|
||||
- The key words SHOULD and MAY in this message are to be interpreted as described in [RFC 8147](https://www.rfc-editor.org/rfc/rfc8174). \"
|
||||
}" > mattermost.json
|
||||
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
|
||||
echo "ASSIGNED_PRS=https://github.com/pulls?q=is%3Apr+is%3Aopen+updated%3A%3E%3D${DATE}+assignee%3A*+user%3Acertbot" >> $GITHUB_ENV
|
||||
echo "UPDATED_URL=https://github.com/issues?q=is%3Aissue+is%3Aopen+sort%3Acomments-desc+updated%3A%3E%3D${DATE}+user%3Acertbot" >> $GITHUB_ENV
|
||||
- uses: mattermost/action-mattermost-notify@master
|
||||
env:
|
||||
with:
|
||||
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_WEBHOOK_URL }}
|
||||
MATTERMOST_CHANNEL: private-certbot
|
||||
TEXT: |
|
||||
## Updates In the Past Week
|
||||
- Most commented in the last week: [link](${{ env.UPDATED_URL }})
|
||||
- Updated (assigned) PRs in the last week: [link](${{ env.ASSIGNED_PRS }})
|
||||
|
||||
.github/workflows/stale.yml (1 changed line)
@@ -3,6 +3,7 @@ on:
|
||||
schedule:
|
||||
# Run 1:24AM every night
|
||||
- cron: '24 1 * * *'
|
||||
workflow_dispatch:
|
||||
permissions:
|
||||
issues: write
|
||||
jobs:
|
||||
|
||||
@@ -69,7 +69,7 @@ ignored-modules=
|
||||
# CERTBOT COMMENT
|
||||
# This is needed for pylint to import linter_plugin.py since
|
||||
# https://github.com/PyCQA/pylint/pull/3396.
|
||||
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
|
||||
init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(next(pylint.config.find_default_config_files())))"
|
||||
|
||||
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
|
||||
# number of processors available to use, and will cap the count on Windows to
|
||||
@@ -266,8 +266,8 @@ valid-metaclass-classmethod-first-arg=cls
|
||||
[EXCEPTIONS]
|
||||
|
||||
# Exceptions that will emit a warning when caught.
|
||||
overgeneral-exceptions=BaseException,
|
||||
Exception
|
||||
overgeneral-exceptions=builtins.BaseException,
|
||||
builtins.Exception
|
||||
|
||||
|
||||
[FORMAT]
|
||||
@@ -524,7 +524,7 @@ ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace,
|
||||
# List of module names for which member attributes should not be checked
|
||||
# (useful for modules/projects where namespaces are manipulated during runtime
|
||||
# and thus existing member attributes cannot be deduced by static analysis
|
||||
ignored-modules=pkg_resources,confargparse,argparse
|
||||
ignored-modules=confargparse,argparse
|
||||
|
||||
# Show a hint with possible names when a member name was not found. The aspect
|
||||
# of finding the hint is based on edit distance.
|
||||
|
||||
@@ -94,6 +94,7 @@ Authors
|
||||
* [Felix Yan](https://github.com/felixonmars)
|
||||
* [Filip Ochnik](https://github.com/filipochnik)
|
||||
* [Florian Klink](https://github.com/flokli)
|
||||
* [Francesco Colista](https://github.com/fcolista)
|
||||
* [Francois Marier](https://github.com/fmarier)
|
||||
* [Frank](https://github.com/Frankkkkk)
|
||||
* [Frederic BLANC](https://github.com/fblanc)
|
||||
@@ -138,6 +139,7 @@ Authors
|
||||
* [John Reed](https://github.com/leerspace)
|
||||
* [Jonas Berlin](https://github.com/xkr47)
|
||||
* [Jonathan Herlin](https://github.com/Jonher937)
|
||||
* [Jonathan Vanasco](https://github.com/jvanasco)
|
||||
* [Jon Walsh](https://github.com/code-tree)
|
||||
* [Joona Hoikkala](https://github.com/joohoi)
|
||||
* [Josh McCullough](https://github.com/JoshMcCullough)
|
||||
@@ -165,6 +167,7 @@ Authors
|
||||
* [Luca Ebach](https://github.com/lucebac)
|
||||
* [Luca Olivetti](https://github.com/olivluca)
|
||||
* [Luke Rogers](https://github.com/lukeroge)
|
||||
* [Lukhnos Liu](https://github.com/lukhnos)
|
||||
* [Maarten](https://github.com/mrtndwrd)
|
||||
* [Mads Jensen](https://github.com/atombrella)
|
||||
* [Maikel Martens](https://github.com/krukas)
|
||||
|
||||
@@ -14,14 +14,14 @@ build:
|
||||
|
||||
# Build documentation in the "docs/" directory with Sphinx
|
||||
sphinx:
|
||||
configuration: docs/conf.py
|
||||
configuration: acme/docs/conf.py
|
||||
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
|
||||
# builder: "dirhtml"
|
||||
# Fail on all warnings to avoid broken references
|
||||
fail_on_warning: true
|
||||
|
||||
# Optionally build your docs in additional formats such as PDF and ePub
|
||||
# formats:
|
||||
formats:
|
||||
- pdf
|
||||
- epub
|
||||
|
||||
@@ -30,4 +30,4 @@ sphinx:
|
||||
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
|
||||
python:
|
||||
install:
|
||||
- requirements: ../tools/requirements.txt
|
||||
- requirements: acme/readthedocs.org.requirements.txt
|
||||
@@ -24,6 +24,7 @@ from acme.client import ClientV2
|
||||
|
||||
CERT_SAN_PEM = test_util.load_vector('cert-san.pem')
|
||||
CSR_MIXED_PEM = test_util.load_vector('csr-mixed.pem')
|
||||
CSR_NO_SANS_PEM = test_util.load_vector('csr-nosans.pem')
|
||||
KEY = jose.JWKRSA.load(test_util.load_vector('rsa512_key.pem'))
|
||||
|
||||
DIRECTORY_V2 = messages.Directory({
|
||||
@@ -97,6 +98,10 @@ class ClientV2Test(unittest.TestCase):
|
||||
body=self.order,
|
||||
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
|
||||
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_MIXED_PEM)
|
||||
self.orderr2 = messages.OrderResource(
|
||||
body=self.order,
|
||||
uri='https://www.letsencrypt-demo.org/acme/acct/1/order/1',
|
||||
authorizations=[self.authzr, self.authzr2], csr_pem=CSR_NO_SANS_PEM)
|
||||
|
||||
def test_new_account(self):
|
||||
self.response.status_code = http_client.CREATED
|
||||
@@ -158,6 +163,10 @@ class ClientV2Test(unittest.TestCase):
|
||||
mock_post_as_get.side_effect = (authz_response, authz_response2)
|
||||
assert self.client.new_order(CSR_MIXED_PEM) == self.orderr
|
||||
|
||||
with mock.patch('acme.client.ClientV2._post_as_get') as mock_post_as_get:
|
||||
mock_post_as_get.side_effect = (authz_response, authz_response2)
|
||||
assert self.client.new_order(CSR_NO_SANS_PEM) == self.orderr2
|
||||
|
||||
def test_answer_challege(self):
|
||||
self.response.links['up'] = {'url': self.challr.authzr_uri}
|
||||
self.response.json.return_value = self.challr.body.to_json()
|
||||
|
||||
@@ -12,11 +12,21 @@ import unittest
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import pytest
|
||||
from cryptography import x509
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa, x25519
|
||||
|
||||
from acme import errors
|
||||
from acme._internal.tests import test_util
|
||||
|
||||
|
||||
class FormatTest(unittest.TestCase):
|
||||
def test_to_cryptography_encoding(self):
|
||||
from acme.crypto_util import Format
|
||||
assert Format.DER.to_cryptography_encoding() == serialization.Encoding.DER
|
||||
assert Format.PEM.to_cryptography_encoding() == serialization.Encoding.PEM
|
||||
|
||||
|
||||
class SSLSocketAndProbeSNITest(unittest.TestCase):
|
||||
"""Tests for acme.crypto_util.SSLSocket/probe_sni."""
|
||||
|
||||
@@ -177,48 +187,6 @@ class PyOpenSSLCertOrReqSANTest(unittest.TestCase):
|
||||
['chicago-cubs.venafi.example', 'cubs.venafi.example']
|
||||
|
||||
|
||||
class PyOpenSSLCertOrReqSANIPTest(unittest.TestCase):
|
||||
"""Test for acme.crypto_util._pyopenssl_cert_or_req_san_ip."""
|
||||
|
||||
@classmethod
|
||||
def _call(cls, loader, name):
|
||||
# pylint: disable=protected-access
|
||||
from acme.crypto_util import _pyopenssl_cert_or_req_san_ip
|
||||
return _pyopenssl_cert_or_req_san_ip(loader(name))
|
||||
|
||||
def _call_cert(self, name):
|
||||
return self._call(test_util.load_cert, name)
|
||||
|
||||
def _call_csr(self, name):
|
||||
return self._call(test_util.load_csr, name)
|
||||
|
||||
def test_cert_no_sans(self):
|
||||
assert self._call_cert('cert.pem') == []
|
||||
|
||||
def test_csr_no_sans(self):
|
||||
assert self._call_csr('csr-nosans.pem') == []
|
||||
|
||||
def test_cert_domain_sans(self):
|
||||
assert self._call_cert('cert-san.pem') == []
|
||||
|
||||
def test_csr_domain_sans(self):
|
||||
assert self._call_csr('csr-san.pem') == []
|
||||
|
||||
def test_cert_ip_two_sans(self):
|
||||
assert self._call_cert('cert-ipsans.pem') == ['192.0.2.145', '203.0.113.1']
|
||||
|
||||
def test_csr_ip_two_sans(self):
|
||||
assert self._call_csr('csr-ipsans.pem') == ['192.0.2.145', '203.0.113.1']
|
||||
|
||||
def test_csr_ipv6_sans(self):
|
||||
assert self._call_csr('csr-ipv6sans.pem') == \
|
||||
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5']
|
||||
|
||||
def test_cert_ipv6_sans(self):
|
||||
assert self._call_cert('cert-ipv6sans.pem') == \
|
||||
['0:0:0:0:0:0:0:1', 'A3BE:32F3:206E:C75D:956:CEE:9858:5EC5']
|
||||
|
||||
|
||||
class GenSsCertTest(unittest.TestCase):
|
||||
"""Test for gen_ss_cert (generation of self-signed cert)."""
|
||||
|
||||
@@ -250,79 +218,74 @@ class MakeCSRTest(unittest.TestCase):
|
||||
|
||||
@classmethod
|
||||
def _call_with_key(cls, *args, **kwargs):
|
||||
privkey = OpenSSL.crypto.PKey()
|
||||
privkey.generate_key(OpenSSL.crypto.TYPE_RSA, 2048)
|
||||
privkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM, privkey)
|
||||
privkey = rsa.generate_private_key(public_exponent=65537, key_size=2048)
|
||||
privkey_pem = privkey.private_bytes(
|
||||
serialization.Encoding.PEM,
|
||||
serialization.PrivateFormat.PKCS8,
|
||||
serialization.NoEncryption(),
|
||||
)
|
||||
from acme.crypto_util import make_csr
|
||||
|
||||
return make_csr(privkey_pem, *args, **kwargs)
|
||||
|
||||
def test_make_csr(self):
|
||||
csr_pem = self._call_with_key(["a.example", "b.example"])
|
||||
assert b'--BEGIN CERTIFICATE REQUEST--' in csr_pem
|
||||
assert b'--END CERTIFICATE REQUEST--' in csr_pem
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
assert len(csr.get_extensions()) == 1
|
||||
assert csr.get_extensions()[0].get_data() == \
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
critical=False,
|
||||
value=b'DNS:a.example, DNS:b.example',
|
||||
).get_data()
|
||||
assert b"--BEGIN CERTIFICATE REQUEST--" in csr_pem
|
||||
assert b"--END CERTIFICATE REQUEST--" in csr_pem
|
||||
csr = x509.load_pem_x509_csr(csr_pem)
|
||||
|
||||
assert len(csr.extensions) == 1
|
||||
assert list(
|
||||
csr.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
|
||||
) == [
|
||||
x509.DNSName("a.example"),
|
||||
x509.DNSName("b.example"),
|
||||
]
|
||||
|
||||
def test_make_csr_ip(self):
|
||||
csr_pem = self._call_with_key(["a.example"], False, [ipaddress.ip_address('127.0.0.1'), ipaddress.ip_address('::1')])
|
||||
assert b'--BEGIN CERTIFICATE REQUEST--' in csr_pem
|
||||
assert b'--END CERTIFICATE REQUEST--' in csr_pem
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
assert len(csr.get_extensions()) == 1
|
||||
assert csr.get_extensions()[0].get_data() == \
|
||||
OpenSSL.crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
critical=False,
|
||||
value=b'DNS:a.example, IP:127.0.0.1, IP:::1',
|
||||
).get_data()
|
||||
# for IP san it's actually need to be octet-string,
|
||||
# but somewhere downstream thankfully handle it for us
|
||||
csr_pem = self._call_with_key(
|
||||
["a.example"],
|
||||
False,
|
||||
[ipaddress.ip_address("127.0.0.1"), ipaddress.ip_address("::1")],
|
||||
)
|
||||
assert b"--BEGIN CERTIFICATE REQUEST--" in csr_pem
|
||||
assert b"--END CERTIFICATE REQUEST--" in csr_pem
|
||||
|
||||
csr = x509.load_pem_x509_csr(csr_pem)
|
||||
|
||||
assert len(csr.extensions) == 1
|
||||
assert list(
|
||||
csr.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
|
||||
) == [
|
||||
x509.DNSName("a.example"),
|
||||
x509.IPAddress(ipaddress.ip_address("127.0.0.1")),
|
||||
x509.IPAddress(ipaddress.ip_address("::1")),
|
||||
]
|
||||
|
||||
def test_make_csr_must_staple(self):
|
||||
csr_pem = self._call_with_key(["a.example"], must_staple=True)
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
csr = x509.load_pem_x509_csr(csr_pem)
|
||||
|
||||
# In pyopenssl 0.13 (used with TOXENV=py27-oldest), csr objects don't
|
||||
# have a get_extensions() method, so we skip this test if the method
|
||||
# isn't available.
|
||||
if hasattr(csr, 'get_extensions'):
|
||||
assert len(csr.get_extensions()) == 2
|
||||
# NOTE: Ideally we would filter by the TLS Feature OID, but
|
||||
# OpenSSL.crypto.X509Extension doesn't give us the extension's raw OID,
|
||||
# and the shortname field is just "UNDEF"
|
||||
must_staple_exts = [e for e in csr.get_extensions()
|
||||
if e.get_data() == b"0\x03\x02\x01\x05"]
|
||||
assert len(must_staple_exts) == 1, \
|
||||
"Expected exactly one Must Staple extension"
|
||||
assert len(csr.extensions) == 2
|
||||
assert list(csr.extensions.get_extension_for_class(x509.TLSFeature).value) == [
|
||||
x509.TLSFeatureType.status_request
|
||||
]
|
||||
|
||||
def test_make_csr_without_hostname(self):
|
||||
with pytest.raises(ValueError):
|
||||
self._call_with_key()
|
||||
|
||||
def test_make_csr_correct_version(self):
|
||||
csr_pem = self._call_with_key(["a.example"])
|
||||
csr = OpenSSL.crypto.load_certificate_request(
|
||||
OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
def test_make_csr_invalid_key_type(self):
|
||||
privkey = x25519.X25519PrivateKey.generate()
|
||||
privkey_pem = privkey.private_bytes(
|
||||
serialization.Encoding.PEM,
|
||||
serialization.PrivateFormat.PKCS8,
|
||||
serialization.NoEncryption(),
|
||||
)
|
||||
from acme.crypto_util import make_csr
|
||||
|
||||
assert csr.get_version() == 0, \
|
||||
"Expected CSR version to be v1 (encoded as 0), per RFC 2986, section 4"
|
||||
with pytest.raises(ValueError):
|
||||
make_csr(privkey_pem, ["a.example"])
|
||||
|
||||
|
||||
class DumpPyopensslChainTest(unittest.TestCase):
|
||||
|
||||
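The rewritten `MakeCSRTest` cases above drive `acme.crypto_util.make_csr` and inspect the result purely through `cryptography` objects. As a hedged usage sketch (the signature `make_csr(private_key_pem, domains=None, must_staple=False, ipaddrs=None)` and the PKCS#8 PEM key requirement are taken from the diff; the domain names are placeholders):

```python
# Sketch only: generate a key with `cryptography`, build a CSR via
# acme.crypto_util.make_csr, and inspect its SANs - the same flow the
# updated tests use.
import ipaddress

from cryptography import x509
from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

from acme.crypto_util import make_csr

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
key_pem = key.private_bytes(
    serialization.Encoding.PEM,
    serialization.PrivateFormat.PKCS8,
    serialization.NoEncryption(),
)

csr_pem = make_csr(
    key_pem,
    domains=["a.example", "b.example"],
    must_staple=True,
    ipaddrs=[ipaddress.ip_address("127.0.0.1")],
)

csr = x509.load_pem_x509_csr(csr_pem)
san = csr.extensions.get_extension_for_class(x509.SubjectAlternativeName).value
print(san.get_values_for_type(x509.DNSName))    # ['a.example', 'b.example']
print(san.get_values_for_type(x509.IPAddress))  # [IPv4Address('127.0.0.1')]
```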
@@ -21,7 +21,7 @@ class HeaderTest(unittest.TestCase):
|
||||
except (ValueError, TypeError):
|
||||
assert True
|
||||
else:
|
||||
assert False # pragma: no cover
|
||||
pytest.fail("Exception from jose.b64decode wasn't raised") # pragma: no cover
|
||||
|
||||
def test_nonce_decoder(self):
|
||||
from acme.jws import Header
|
||||
|
||||
@@ -193,7 +193,7 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
|
||||
|
||||
from acme.standalone import BaseDualNetworkedServers
|
||||
|
||||
mock_bind.side_effect = socket.error(EADDRINUSE, "Fake addr in use error")
|
||||
mock_bind.side_effect = OSError(EADDRINUSE, "Fake addr in use error")
|
||||
|
||||
with pytest.raises(socket.error) as exc_info:
|
||||
BaseDualNetworkedServers(
|
||||
|
||||
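The swap from `socket.error` to `OSError` here (and in the `crypto_util` hunks further down) is behavior-preserving: `socket.error` has been a plain alias of `OSError` since Python 3.3, which is also why the surrounding `pytest.raises(socket.error)` still catches the new exception.

```python
import socket

# socket.error is just another name for OSError on Python 3, so raising
# OSError(...) is still caught by `except socket.error` / pytest.raises(socket.error).
assert socket.error is OSError
```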
@@ -3,6 +3,7 @@
|
||||
.. warning:: This module is not part of the public API.
|
||||
|
||||
"""
|
||||
import importlib.resources
|
||||
import os
|
||||
import sys
|
||||
|
||||
@@ -12,16 +13,11 @@ import josepy as jose
|
||||
from josepy.util import ComparableECKey
|
||||
from OpenSSL import crypto
|
||||
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
|
||||
|
||||
def load_vector(*names):
|
||||
"""Load contents of a test vector."""
|
||||
# luckily, resource_string opens file in binary mode
|
||||
vector_ref = importlib_resources.files(__package__).joinpath('testdata', *names)
|
||||
vector_ref = importlib.resources.files(__package__).joinpath('testdata', *names)
|
||||
return vector_ref.read_bytes()
|
||||
|
||||
|
||||
|
||||
acme/acme/_internal/tests/testdata/csr-mixed.pem (28 changed lines)
@@ -1,16 +1,16 @@
|
||||
-----BEGIN CERTIFICATE REQUEST-----
|
||||
MIICdjCCAV4CAQIwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAMXq
|
||||
v1y8EIcCbaUIzCtOcLkLS0MJ35oS+6DmV5WB1A0cIk6YrjsHIsY2lwMm13BWIvmw
|
||||
tY+Y6n0rr7eViNx5ZRGHpHEI/TL3Neb+VefTydL5CgvK3dd4ex2kSbTaed3fmpOx
|
||||
qMajEduwNcZPCcmoEXPkfrCP8w2vKQUkQ+JRPcdX1nTuzticeRP5B7YCmJsmxkEh
|
||||
Y0tzzZ+NIRDARoYNofefY86h3e5q66gtJxccNchmIM3YQahhg5n3Xoo8hGfM/TIc
|
||||
R7ncCBCLO6vtqo0QFva/NQODrgOmOsmgvqPkUWQFdZfWM8yIaU826dktx0CPB78t
|
||||
TudnJ1rBRvGsjHMsZikCAwEAAaAxMC8GCSqGSIb3DQEJDjEiMCAwHgYDVR0RBBcw
|
||||
FYINYS5leGVtcGxlLmNvbYcEwAACbzANBgkqhkiG9w0BAQsFAAOCAQEAdGMcRCxq
|
||||
1X09gn1TNdMt64XUv+wdJCKDaJ+AgyIJj7QvVw8H5k7dOnxS4I+a/yo4jE+LDl2/
|
||||
AuHcBLFEI4ddewdJSMrTNZjuRYuOdr3KP7fL7MffICSBi45vw5EOXg0tnjJCEiKu
|
||||
6gcJgbLSP5JMMd7Haf33Q/VWsmHofR3VwOMdrnakwAU3Ff5WTuXTNVhL1kT/uLFX
|
||||
yW1ru6BF4unwNqSR2UeulljpNfRBsiN4zJK11W6n9KT0NkBr9zY5WCM4sW7i8k9V
|
||||
TeypWGo3jBKzYAGeuxZsB97U77jZ2lrGdBLZKfbcjnTeRVqCvCRrui4El7UGYFmj
|
||||
7s6OJyWx5DSV8w==
|
||||
MIICdjCCAV4CAQAwADCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANoV
|
||||
T1pdvRUUBOqvm7M2ebLEHV7higUH7qAGUZEkfP6W4YriYVY+IHrH1svNPSa+oPTK
|
||||
7weDNmT11ehWnGyECIM9z2r2Hi9yVV0ycxh4hWQ4Nt8BAKZwCwaXpyWm7Gj6m2Ez
|
||||
pSN5Dd67g5YAQBrUUh1+RRbFi9c0Ls/6ZOExMvfg8kqt4c2sXCgH1IFnxvvOjBYo
|
||||
p7xh0x3L1Akyax0tw8qgQp/z5mkupmVDNJYPFmbzFPMNyDR61ed6QUTDg7P4UAuF
|
||||
kejLLzFvz5YaO7vC+huaTuPhInAhpzqpr4yU97KIjos2/83Itu/Cv8U1RAeEeRTk
|
||||
h0WjUfltoem/5f8bIdsCAwEAAaAxMC8GCSqGSIb3DQEJDjEiMCAwHgYDVR0RBBcw
|
||||
FYINYS5leGVtcGxlLmNvbYcEwAACbzANBgkqhkiG9w0BAQsFAAOCAQEAQ7n/hYen
|
||||
5INHlcslHPYCQ/BAbX6Ou+Y8hUu8puWNVpE2OM95L2C87jbWwTmCRnkFBwtyoNqo
|
||||
j3DXVW2RYv8y/exq7V6Y5LtpHTgwfugINJ3XlcVzA4Vnf1xqOxv3kwejkq74RuXn
|
||||
xd5N28srgiFqb0e4tOAWVI8Tw27bgBqjoXl0QDFPZpctqUia5bcDJ9WzNSM7VaO1
|
||||
CBNGHBRz+zL8sqoqJA4HV58tjcgzl+1RtGM+iUHxXpnH+aCNKWIUINrAzIm4Sm00
|
||||
93RJjhb1kdNR0BC7ikWVbAWaVviHdvATK/RfpmhWDqfEaNgBpvT91GnkhpzctSFD
|
||||
ro0yCUUXXrIr0w==
|
||||
-----END CERTIFICATE REQUEST-----
|
||||
|
||||
@@ -15,6 +15,7 @@ from typing import Type
|
||||
from typing import TypeVar
|
||||
from typing import Union
|
||||
|
||||
from cryptography import x509
|
||||
from cryptography.hazmat.primitives import hashes
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
@@ -89,7 +90,7 @@ class _TokenChallenge(Challenge):
|
||||
:ivar bytes token:
|
||||
|
||||
"""
|
||||
TOKEN_SIZE = 128 / 8 # Based on the entropy value from the spec
|
||||
TOKEN_SIZE = 128 // 8 # Based on the entropy value from the spec
|
||||
"""Minimum size of the :attr:`token` in bytes."""
|
||||
|
||||
# TODO: acme-spec doesn't specify token as base64-encoded value
|
||||
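The `/` to `//` change above is about types, not values: in Python 3 true division always returns a float, while floor division keeps `TOKEN_SIZE` an `int`.

```python
>>> 128 / 8
16.0
>>> 128 // 8
16
```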
@@ -460,34 +461,39 @@ class TLSALPN01Response(KeyAuthorizationChallengeResponse):
|
||||
return crypto_util.probe_sni(host=host.encode(), port=port, name=domain.encode(),
|
||||
alpn_protocols=[self.ACME_TLS_1_PROTOCOL])
|
||||
|
||||
def verify_cert(self, domain: str, cert: crypto.X509) -> bool:
|
||||
def verify_cert(self, domain: str, cert: Union[x509.Certificate, crypto.X509]) -> bool:
|
||||
"""Verify tls-alpn-01 challenge certificate.
|
||||
|
||||
:param str domain: Domain name being validated.
|
||||
:param OpensSSL.crypto.X509 cert: Challenge certificate.
|
||||
:param cert: Challenge certificate.
|
||||
:type cert: `cryptography.x509.Certificate` or `OpenSSL.crypto.X509`
|
||||
|
||||
:returns: Whether the certificate was successfully verified.
|
||||
:rtype: bool
|
||||
|
||||
"""
|
||||
# pylint: disable=protected-access
|
||||
names = crypto_util._pyopenssl_cert_or_req_all_names(cert)
|
||||
# Type ignore needed due to
|
||||
# https://github.com/pyca/pyopenssl/issues/730.
|
||||
logger.debug('Certificate %s. SANs: %s',
|
||||
cert.digest('sha256'), names)
|
||||
if not isinstance(cert, x509.Certificate):
|
||||
cert = cert.to_cryptography()
|
||||
|
||||
names = crypto_util.get_names_from_subject_and_extensions(
|
||||
cert.subject, cert.extensions
|
||||
)
|
||||
logger.debug(
|
||||
"Certificate %s. SANs: %s", cert.fingerprint(hashes.SHA256()), names
|
||||
)
|
||||
if len(names) != 1 or names[0].lower() != domain.lower():
|
||||
return False
|
||||
|
||||
for i in range(cert.get_extension_count()):
|
||||
ext = cert.get_extension(i)
|
||||
# FIXME: assume this is the ACME extension. Currently there is no
|
||||
# way to get full OID of an unknown extension from pyopenssl.
|
||||
if ext.get_short_name() == b'UNDEF':
|
||||
data = ext.get_data()
|
||||
return data == self.h
|
||||
try:
|
||||
ext = cert.extensions.get_extension_for_oid(
|
||||
x509.ObjectIdentifier(self.ID_PE_ACME_IDENTIFIER_V1.decode())
|
||||
)
|
||||
except x509.ExtensionNotFound:
|
||||
return False
|
||||
|
||||
return False
|
||||
# This is for the type checker.
|
||||
assert isinstance(ext.value, x509.UnrecognizedExtension)
|
||||
return ext.value.value == self.h
|
||||
|
||||
# pylint: disable=too-many-arguments
|
||||
def simple_verify(self, chall: 'TLSALPN01', domain: str, account_public_key: jose.JWK,
|
||||
|
||||
@@ -12,10 +12,11 @@ from typing import List
|
||||
from typing import Mapping
|
||||
from typing import Optional
|
||||
from typing import Set
|
||||
from typing import Text
|
||||
from typing import Tuple
|
||||
from typing import Union
|
||||
|
||||
from cryptography import x509
|
||||
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
import requests
|
||||
@@ -122,18 +123,21 @@ class ClientV2:
|
||||
:returns: The newly created order.
|
||||
:rtype: OrderResource
|
||||
"""
|
||||
csr = OpenSSL.crypto.load_certificate_request(OpenSSL.crypto.FILETYPE_PEM, csr_pem)
|
||||
# pylint: disable=protected-access
|
||||
dnsNames = crypto_util._pyopenssl_cert_or_req_all_names(csr)
|
||||
ipNames = crypto_util._pyopenssl_cert_or_req_san_ip(csr)
|
||||
# ipNames is now []string
|
||||
csr = x509.load_pem_x509_csr(csr_pem)
|
||||
dnsNames = crypto_util.get_names_from_subject_and_extensions(csr.subject, csr.extensions)
|
||||
try:
|
||||
san_ext = csr.extensions.get_extension_for_class(x509.SubjectAlternativeName)
|
||||
except x509.ExtensionNotFound:
|
||||
ipNames = []
|
||||
else:
|
||||
ipNames = san_ext.value.get_values_for_type(x509.IPAddress)
|
||||
identifiers = []
|
||||
for name in dnsNames:
|
||||
identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_FQDN,
|
||||
value=name))
|
||||
for ips in ipNames:
|
||||
for ip in ipNames:
|
||||
identifiers.append(messages.Identifier(typ=messages.IDENTIFIER_IP,
|
||||
value=ips))
|
||||
value=str(ip)))
|
||||
order = messages.NewOrder(identifiers=identifiers)
|
||||
response = self._post(self.directory['newOrder'], order)
|
||||
body = messages.Order.from_json(response.json())
|
||||
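The reworked `ClientV2.new_order` above pulls identifiers straight from the CSR with `cryptography` instead of the removed pyOpenSSL helpers. A standalone sketch of that extraction pattern (illustrative only; the real method also folds in names from `get_names_from_subject_and_extensions` and wraps the results in `messages.Identifier` objects):

```python
# Illustrative sketch of the SAN handling in the new new_order():
# DNS names stay strings, while IP SANs are ipaddress objects and must be
# converted with str() before being sent as ACME "ip" identifiers.
from typing import List, Tuple

from cryptography import x509


def identifiers_from_csr(csr_pem: bytes) -> List[Tuple[str, str]]:
    csr = x509.load_pem_x509_csr(csr_pem)
    try:
        san = csr.extensions.get_extension_for_class(
            x509.SubjectAlternativeName).value
    except x509.ExtensionNotFound:
        return []
    dns = [("dns", name) for name in san.get_values_for_type(x509.DNSName)]
    ips = [("ip", str(ip)) for ip in san.get_values_for_type(x509.IPAddress)]
    return dns + ips
```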
@@ -517,7 +521,7 @@ class ClientNetwork:
|
||||
self.account = account
|
||||
self.alg = alg
|
||||
self.verify_ssl = verify_ssl
|
||||
self._nonces: Set[Text] = set()
|
||||
self._nonces: Set[str] = set()
|
||||
self.user_agent = user_agent
|
||||
self.session = requests.Session()
|
||||
self._default_timeout = timeout
|
||||
|
||||
@@ -1,11 +1,12 @@
|
||||
"""Crypto utilities."""
|
||||
import binascii
|
||||
import contextlib
|
||||
import enum
|
||||
import ipaddress
|
||||
import logging
|
||||
import os
|
||||
import re
|
||||
import socket
|
||||
import typing
|
||||
from typing import Any
|
||||
from typing import Callable
|
||||
from typing import List
|
||||
@@ -16,6 +17,9 @@ from typing import Set
|
||||
from typing import Tuple
|
||||
from typing import Union
|
||||
|
||||
from cryptography import x509
|
||||
from cryptography.hazmat.primitives import hashes, serialization
|
||||
from cryptography.hazmat.primitives.asymmetric import dsa, rsa, ec, ed25519, ed448
|
||||
import josepy as jose
|
||||
from OpenSSL import crypto
|
||||
from OpenSSL import SSL
|
||||
@@ -34,6 +38,24 @@ logger = logging.getLogger(__name__)
|
||||
_DEFAULT_SSL_METHOD = SSL.SSLv23_METHOD
|
||||
|
||||
|
||||
class Format(enum.IntEnum):
|
||||
"""File format to be used when parsing or serializing X.509 structures.
|
||||
|
||||
Backwards compatible with the `FILETYPE_ASN1` and `FILETYPE_PEM` constants
|
||||
from pyOpenSSL.
|
||||
"""
|
||||
DER = crypto.FILETYPE_ASN1
|
||||
PEM = crypto.FILETYPE_PEM
|
||||
|
||||
def to_cryptography_encoding(self) -> serialization.Encoding:
|
||||
"""Converts the Format to the corresponding cryptography `Encoding`.
|
||||
"""
|
||||
if self == Format.DER:
|
||||
return serialization.Encoding.DER
|
||||
else:
|
||||
return serialization.Encoding.PEM
|
||||
|
||||
|
||||
class _DefaultCertSelection:
|
||||
def __init__(self, certs: Mapping[bytes, Tuple[crypto.PKey, crypto.X509]]):
|
||||
self.certs = certs
|
||||
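The new `Format` enum above is deliberately backwards compatible: its members carry the numeric values of pyOpenSSL's `FILETYPE_*` constants while adding a bridge to `cryptography`'s `Encoding`. A minimal sketch based only on the definition shown:

```python
# Format members keep the pyOpenSSL FILETYPE_* integer values, and
# to_cryptography_encoding() maps them onto cryptography's Encoding enum.
from OpenSSL import crypto
from cryptography.hazmat.primitives import serialization

from acme.crypto_util import Format

assert Format.PEM == crypto.FILETYPE_PEM
assert Format.DER == crypto.FILETYPE_ASN1
assert Format.PEM.to_cryptography_encoding() == serialization.Encoding.PEM
assert Format.DER.to_cryptography_encoding() == serialization.Encoding.DER
```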
@@ -73,13 +95,9 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
raise ValueError("Neither cert_selection or certs specified.")
|
||||
if cert_selection and certs:
|
||||
raise ValueError("Both cert_selection and certs specified.")
|
||||
actual_cert_selection: Union[_DefaultCertSelection,
|
||||
Optional[Callable[[SSL.Connection],
|
||||
Optional[Tuple[crypto.PKey,
|
||||
crypto.X509]]]]] = cert_selection
|
||||
if actual_cert_selection is None:
|
||||
actual_cert_selection = _DefaultCertSelection(certs if certs else {})
|
||||
self.cert_selection = actual_cert_selection
|
||||
if cert_selection is None:
|
||||
cert_selection = _DefaultCertSelection(certs if certs else {})
|
||||
self.cert_selection = cert_selection
|
||||
|
||||
def __getattr__(self, name: str) -> Any:
|
||||
return getattr(self.sock, name)
|
||||
@@ -131,7 +149,7 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
# in the standard library. This is useful when this object is
|
||||
# used by code which expects a standard socket such as
|
||||
# socketserver in the standard library.
|
||||
raise socket.error(error)
|
||||
raise OSError(error)
|
||||
|
||||
def accept(self) -> Tuple[FakeConnection, Any]: # pylint: disable=missing-function-docstring
|
||||
sock, addr = self.sock.accept()
|
||||
@@ -155,7 +173,7 @@ class SSLSocket: # pylint: disable=too-few-public-methods
|
||||
except SSL.Error as error:
|
||||
# _pick_certificate_cb might have returned without
|
||||
# creating SSL context (wrong server name)
|
||||
raise socket.error(error)
|
||||
raise OSError(error)
|
||||
|
||||
return ssl_sock, addr
|
||||
except:
|
||||
@@ -203,7 +221,7 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
|
||||
)
|
||||
socket_tuple: Tuple[bytes, int] = (host, port)
|
||||
sock = socket.create_connection(socket_tuple, **socket_kwargs) # type: ignore[arg-type]
|
||||
except socket.error as error:
|
||||
except OSError as error:
|
||||
raise errors.Error(error)
|
||||
|
||||
with contextlib.closing(sock) as client:
|
||||
@@ -222,77 +240,118 @@ def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300, #
|
||||
return cert
|
||||
|
||||
|
||||
def make_csr(private_key_pem: bytes, domains: Optional[Union[Set[str], List[str]]] = None,
|
||||
must_staple: bool = False,
|
||||
ipaddrs: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None
|
||||
) -> bytes:
|
||||
def make_csr(
|
||||
private_key_pem: bytes,
|
||||
domains: Optional[Union[Set[str], List[str]]] = None,
|
||||
must_staple: bool = False,
|
||||
ipaddrs: Optional[List[Union[ipaddress.IPv4Address, ipaddress.IPv6Address]]] = None,
|
||||
) -> bytes:
|
||||
"""Generate a CSR containing domains or IPs as subjectAltNames.
|
||||
|
||||
Parameters are ordered this way for backwards compatibility when called using positional
|
||||
arguments.
|
||||
|
||||
:param buffer private_key_pem: Private key, in PEM PKCS#8 format.
|
||||
:param list domains: List of DNS names to include in subjectAltNames of CSR.
|
||||
:param bool must_staple: Whether to include the TLS Feature extension (aka
|
||||
OCSP Must Staple: https://tools.ietf.org/html/rfc7633).
|
||||
:param list ipaddrs: List of IP addresses (ipaddress.IPv4Address or ipaddress.IPv6Address)
names to include in subjectAltNames of CSR.
params ordered this way for backward compatibility when called by positional argument.
names to include in subjectAltNames of CSR.
|
||||
|
||||
:returns: buffer PEM-encoded Certificate Signing Request.
|
||||
|
||||
"""
|
||||
private_key = crypto.load_privatekey(
|
||||
crypto.FILETYPE_PEM, private_key_pem)
|
||||
csr = crypto.X509Req()
|
||||
sanlist = []
|
||||
# if domain or ip list not supplied make it empty list so it's easier to iterate
|
||||
private_key = serialization.load_pem_private_key(private_key_pem, password=None)
|
||||
# There are a few things that aren't valid for x509 signing. mypy
|
||||
# complains if we don't check.
|
||||
if not isinstance(
|
||||
private_key,
|
||||
(
|
||||
dsa.DSAPrivateKey,
|
||||
rsa.RSAPrivateKey,
|
||||
ec.EllipticCurvePrivateKey,
|
||||
ed25519.Ed25519PrivateKey,
|
||||
ed448.Ed448PrivateKey,
|
||||
),
|
||||
):
|
||||
raise ValueError(f"Invalid private key type: {type(private_key)}")
|
||||
if domains is None:
|
||||
domains = []
|
||||
if ipaddrs is None:
|
||||
ipaddrs = []
|
||||
if len(domains)+len(ipaddrs) == 0:
|
||||
raise ValueError("At least one of domains or ipaddrs parameter need to be not empty")
|
||||
for address in domains:
|
||||
sanlist.append('DNS:' + address)
|
||||
for ips in ipaddrs:
|
||||
sanlist.append('IP:' + ips.exploded)
|
||||
# make sure it's ASCII encoded
san_string = ', '.join(sanlist).encode('ascii')
# for an IP SAN it actually needs to be an octet string,
# but something downstream thankfully handles it for us
|
||||
extensions = [
|
||||
crypto.X509Extension(
|
||||
b'subjectAltName',
|
||||
if len(domains) + len(ipaddrs) == 0:
|
||||
raise ValueError(
|
||||
"At least one of domains or ipaddrs parameter need to be not empty"
|
||||
)
|
||||
|
||||
builder = (
|
||||
x509.CertificateSigningRequestBuilder()
|
||||
.subject_name(x509.Name([]))
|
||||
.add_extension(
|
||||
x509.SubjectAlternativeName(
|
||||
[x509.DNSName(d) for d in domains]
|
||||
+ [x509.IPAddress(i) for i in ipaddrs]
|
||||
),
|
||||
critical=False,
|
||||
value=san_string
|
||||
),
|
||||
]
|
||||
)
|
||||
)
|
||||
if must_staple:
|
||||
extensions.append(crypto.X509Extension(
|
||||
b"1.3.6.1.5.5.7.1.24",
|
||||
builder = builder.add_extension(
|
||||
# "status_request" is the feature commonly known as OCSP
|
||||
# Must-Staple
|
||||
x509.TLSFeature([x509.TLSFeatureType.status_request]),
|
||||
critical=False,
|
||||
value=b"DER:30:03:02:01:05"))
|
||||
csr.add_extensions(extensions)
|
||||
csr.set_pubkey(private_key)
|
||||
# RFC 2986 Section 4.1 only defines version 0
|
||||
csr.set_version(0)
|
||||
csr.sign(private_key, 'sha256')
|
||||
return crypto.dump_certificate_request(
|
||||
crypto.FILETYPE_PEM, csr)
|
||||
)
|
||||
|
||||
csr = builder.sign(private_key, hashes.SHA256())
|
||||
return csr.public_bytes(serialization.Encoding.PEM)
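A short usage sketch of the reworked make_csr (the key generation below is illustrative; only the make_csr call itself reflects the signature shown above):

    from cryptography.hazmat.primitives.asymmetric import rsa

    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    key_pem = key.private_bytes(serialization.Encoding.PEM,
                                serialization.PrivateFormat.PKCS8,
                                serialization.NoEncryption())
    # DNS and IP SANs can be mixed; at least one of the two lists must be non-empty.
    csr_pem = make_csr(key_pem, domains=["example.com"],
                       ipaddrs=[ipaddress.IPv4Address("192.0.2.1")],
                       must_staple=True)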
|
||||
|
||||
|
||||
def get_names_from_subject_and_extensions(
|
||||
subject: x509.Name, exts: x509.Extensions
|
||||
) -> List[str]:
|
||||
"""Gets all DNS SAN names as well as the first Common Name from subject.
|
||||
|
||||
:param subject: Name of the x509 object, which may include Common Name
|
||||
:type subject: `cryptography.x509.Name`
|
||||
:param exts: Extensions of the x509 object, which may include SANs
|
||||
:type exts: `cryptography.x509.Extensions`
|
||||
|
||||
:returns: List of DNS Subject Alternative Names and first Common Name
|
||||
:rtype: `list` of `str`
|
||||
"""
|
||||
# We know these are always `str` because `bytes` is only possible for
|
||||
# other OIDs.
|
||||
cns = [
|
||||
typing.cast(str, c.value)
|
||||
for c in subject.get_attributes_for_oid(x509.NameOID.COMMON_NAME)
|
||||
]
|
||||
try:
|
||||
san_ext = exts.get_extension_for_class(x509.SubjectAlternativeName)
|
||||
except x509.ExtensionNotFound:
|
||||
dns_names = []
|
||||
else:
|
||||
dns_names = san_ext.value.get_values_for_type(x509.DNSName)
|
||||
|
||||
if not cns:
|
||||
return dns_names
|
||||
else:
|
||||
# We only include the first CN, if there are multiple. This matches
|
||||
# the behavior of the previous implementation using pyOpenSSL.
|
||||
return [cns[0]] + [d for d in dns_names if d != cns[0]]
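For example, given a PEM certificate in cert_pem (the loading call is standard cryptography API, not part of this diff):

    loaded = x509.load_pem_x509_certificate(cert_pem)
    names = get_names_from_subject_and_extensions(loaded.subject, loaded.extensions)
    # e.g. ["example.com", "www.example.com"], with the first CN (if any) listed first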
|
||||
|
||||
|
||||
def _pyopenssl_cert_or_req_all_names(loaded_cert_or_req: Union[crypto.X509, crypto.X509Req]
|
||||
) -> List[str]:
|
||||
# despite its name, this only outputs DNS names; other identifier types are ignored
|
||||
common_name = loaded_cert_or_req.get_subject().CN
|
||||
sans = _pyopenssl_cert_or_req_san(loaded_cert_or_req)
|
||||
|
||||
if common_name is None:
|
||||
return sans
|
||||
return [common_name] + [d for d in sans if d != common_name]
|
||||
cert_or_req = loaded_cert_or_req.to_cryptography()
|
||||
return get_names_from_subject_and_extensions(
|
||||
cert_or_req.subject, cert_or_req.extensions
|
||||
)
|
||||
|
||||
|
||||
def _pyopenssl_cert_or_req_san(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
|
||||
"""Get Subject Alternative Names from certificate or CSR using pyOpenSSL.
|
||||
|
||||
.. todo:: Implement directly in PyOpenSSL!
|
||||
|
||||
.. note:: Although this is `acme` internal API, it is used by
|
||||
`letsencrypt`.
|
||||
|
||||
@@ -303,67 +362,13 @@ def _pyopenssl_cert_or_req_san(cert_or_req: Union[crypto.X509, crypto.X509Req])
|
||||
:rtype: `list` of `str`
|
||||
|
||||
"""
|
||||
# This function finds SANs that are DNS names
|
||||
exts = cert_or_req.to_cryptography().extensions
|
||||
try:
|
||||
san_ext = exts.get_extension_for_class(x509.SubjectAlternativeName)
|
||||
except x509.ExtensionNotFound:
|
||||
return []
|
||||
|
||||
# constants based on PyOpenSSL certificate/CSR text dump
|
||||
part_separator = ":"
|
||||
prefix = "DNS" + part_separator
|
||||
|
||||
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
|
||||
|
||||
return [part.split(part_separator)[1]
|
||||
for part in sans_parts if part.startswith(prefix)]
|
||||
|
||||
|
||||
def _pyopenssl_cert_or_req_san_ip(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
|
||||
"""Get Subject Alternative Names IPs from certificate or CSR using pyOpenSSL.
|
||||
|
||||
:param cert_or_req: Certificate or CSR.
|
||||
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
|
||||
|
||||
:returns: A list of Subject Alternative Names that are IP Addresses.
|
||||
:rtype: `list` of `str`. Note that these are returned as strings, not ipaddress objects.
|
||||
|
||||
"""
|
||||
|
||||
# constants based on PyOpenSSL certificate/CSR text dump
|
||||
part_separator = ":"
|
||||
prefix = "IP Address" + part_separator
|
||||
|
||||
sans_parts = _pyopenssl_extract_san_list_raw(cert_or_req)
|
||||
|
||||
return [part[len(prefix):] for part in sans_parts if part.startswith(prefix)]
|
||||
|
||||
|
||||
def _pyopenssl_extract_san_list_raw(cert_or_req: Union[crypto.X509, crypto.X509Req]) -> List[str]:
|
||||
"""Get raw SAN string from cert or csr, parse it as UTF-8 and return.
|
||||
|
||||
:param cert_or_req: Certificate or CSR.
|
||||
:type cert_or_req: `OpenSSL.crypto.X509` or `OpenSSL.crypto.X509Req`.
|
||||
|
||||
:returns: raw SAN strings, parsed from bytes as UTF-8
|
||||
:rtype: `list` of `str`
|
||||
|
||||
"""
|
||||
# This function finds SANs by dumping the certificate/CSR to text and
|
||||
# searching for "X509v3 Subject Alternative Name" in the text. This method
|
||||
# is used because in PyOpenSSL versions <0.17 the `_subjectAltNameString` method is
# not able to parse IP addresses in the subjectAltName string.
|
||||
|
||||
if isinstance(cert_or_req, crypto.X509):
|
||||
# pylint: disable=line-too-long
|
||||
text = crypto.dump_certificate(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
|
||||
else:
|
||||
text = crypto.dump_certificate_request(crypto.FILETYPE_TEXT, cert_or_req).decode('utf-8')
|
||||
# WARNING: this function does not support multiple SANs extensions.
|
||||
# Multiple X509v3 extensions of the same type are disallowed by RFC 5280.
|
||||
raw_san = re.search(r"X509v3 Subject Alternative Name:(?: critical)?\s*(.*)", text)
|
||||
|
||||
parts_separator = ", "
|
||||
# WARNING: this function assumes that no SAN can include
|
||||
# parts_separator, hence the split!
|
||||
sans_parts = [] if raw_san is None else raw_san.group(1).split(parts_separator)
|
||||
return sans_parts
|
||||
return san_ext.value.get_values_for_type(x509.DNSName)
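The same cryptography extension API also covers the IP-address case previously handled by the removed text-dump helpers; a sketch of one possible replacement (an assumption, not code from this diff):

    exts = cert_or_req.to_cryptography().extensions
    try:
        san = exts.get_extension_for_class(x509.SubjectAlternativeName)
    except x509.ExtensionNotFound:
        ip_sans = []
    else:
        # get_values_for_type returns ipaddress objects; stringify them to match
        # the old helper's `list` of `str` return type.
        ip_sans = [str(ip) for ip in san.value.get_values_for_type(x509.IPAddress)]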
|
||||
|
||||
|
||||
def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
|
||||
@@ -433,7 +438,7 @@ def gen_ss_cert(key: crypto.PKey, domains: Optional[List[str]] = None,
|
||||
|
||||
|
||||
def dump_pyopenssl_chain(chain: Union[List[jose.ComparableX509], List[crypto.X509]],
|
||||
filetype: int = crypto.FILETYPE_PEM) -> bytes:
|
||||
filetype: Union[Format, int] = Format.PEM) -> bytes:
|
||||
"""Dump certificate chain into a bundle.
|
||||
|
||||
:param list chain: List of `OpenSSL.crypto.X509` (or wrapped in
|
||||
@@ -446,12 +451,14 @@ def dump_pyopenssl_chain(chain: Union[List[jose.ComparableX509], List[crypto.X50
|
||||
# XXX: returns empty string when no chain is available, which
|
||||
# shuts up RenewableCert, but might not be the best solution...
|
||||
|
||||
filetype = Format(filetype)
|
||||
def _dump_cert(cert: Union[jose.ComparableX509, crypto.X509]) -> bytes:
|
||||
if isinstance(cert, jose.ComparableX509):
|
||||
if isinstance(cert.wrapped, crypto.X509Req):
|
||||
raise errors.Error("Unexpected CSR provided.") # pragma: no cover
|
||||
cert = cert.wrapped
|
||||
return crypto.dump_certificate(filetype, cert)
|
||||
|
||||
return cert.to_cryptography().public_bytes(filetype.to_cryptography_encoding())
|
||||
|
||||
# assumes that OpenSSL.crypto.dump_certificate includes ending
|
||||
# newline character
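A sketch of the conversion _dump_cert now performs (it assumes the Format enum and its to_cryptography_encoding() helper defined elsewhere in this module, plus a PEM certificate in cert_pem):

    openssl_cert = crypto.load_certificate(crypto.FILETYPE_PEM, cert_pem)
    # Round-trip through cryptography rather than crypto.dump_certificate.
    pem_bytes = openssl_cert.to_cryptography().public_bytes(
        Format.PEM.to_cryptography_encoding())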
|
||||
|
||||
@@ -29,7 +29,7 @@ class Header(jose.Header):
|
||||
|
||||
class Signature(jose.Signature):
|
||||
"""ACME-specific Signature. Uses ACME-specific Header for customer fields."""
|
||||
__slots__ = jose.Signature._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access,no-member
|
||||
__slots__ = jose.Signature._orig_slots # pylint: disable=protected-access,no-member
|
||||
|
||||
# TODO: decoder/encoder should accept cls? Otherwise, subclassing
|
||||
# JSONObjectWithFields is tricky...
|
||||
@@ -44,7 +44,7 @@ class Signature(jose.Signature):
|
||||
class JWS(jose.JWS):
|
||||
"""ACME-specific JWS. Includes none, url, and kid in protected header."""
|
||||
signature_cls = Signature
|
||||
__slots__ = jose.JWS._orig_slots # type: ignore[attr-defined] # pylint: disable=protected-access
|
||||
__slots__ = jose.JWS._orig_slots # pylint: disable=protected-access
|
||||
|
||||
@classmethod
|
||||
# type: ignore[override] # pylint: disable=arguments-differ
|
||||
|
||||
@@ -98,7 +98,7 @@ class BaseDualNetworkedServers:
|
||||
logger.debug(
|
||||
"Successfully bound to %s:%s using %s", new_address[0],
|
||||
new_address[1], "IPv6" if ip_version else "IPv4")
|
||||
except socket.error as e:
|
||||
except OSError as e:
|
||||
last_socket_err = e
|
||||
if self.servers:
|
||||
# Already bound using IPv6.
|
||||
@@ -121,7 +121,7 @@ class BaseDualNetworkedServers:
|
||||
if last_socket_err:
|
||||
raise last_socket_err
|
||||
else: # pragma: no cover
|
||||
raise socket.error("Could not bind to IPv4 or IPv6.")
|
||||
raise OSError("Could not bind to IPv4 or IPv6.")
|
||||
|
||||
def serve_forever(self) -> None:
|
||||
"""Wraps socketserver.TCPServer.serve_forever"""
|
||||
|
||||
acme/docs/api/crypto_util.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
|
||||
Crypto_util
|
||||
-----------
|
||||
|
||||
.. automodule:: acme.crypto_util
|
||||
:members:
|
||||
acme/docs/api/jws.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
|
||||
JWS
|
||||
---
|
||||
|
||||
.. automodule:: acme.jws
|
||||
:members:
|
||||
acme/docs/api/util.rst (new file, 5 lines)
@@ -0,0 +1,5 @@
|
||||
Util
|
||||
----
|
||||
|
||||
.. automodule:: acme.util
|
||||
:members:
|
||||
@@ -3,6 +3,6 @@ usage: jws [-h] [--compact] {sign,verify} ...
|
||||
positional arguments:
|
||||
{sign,verify}
|
||||
|
||||
optional arguments:
|
||||
options:
|
||||
-h, --help show this help message and exit
|
||||
--compact
|
||||
|
||||
@@ -28,6 +28,7 @@ Workflow:
|
||||
from contextlib import contextmanager
|
||||
|
||||
from cryptography.hazmat.backends import default_backend
|
||||
from cryptography.hazmat.primitives import serialization
|
||||
from cryptography.hazmat.primitives.asymmetric import rsa
|
||||
import josepy as jose
|
||||
import OpenSSL
|
||||
@@ -68,10 +69,9 @@ def new_csr_comp(domain_name, pkey_pem=None):
|
||||
"""Create certificate signing request."""
|
||||
if pkey_pem is None:
|
||||
# Create private key.
|
||||
pkey = OpenSSL.crypto.PKey()
|
||||
pkey.generate_key(OpenSSL.crypto.TYPE_RSA, CERT_PKEY_BITS)
|
||||
pkey_pem = OpenSSL.crypto.dump_privatekey(OpenSSL.crypto.FILETYPE_PEM,
|
||||
pkey)
|
||||
pkey = rsa.generate_private_key(public_exponent=65537, key_size=CERT_PKEY_BITS)
|
||||
# Serialize the private key (RSAPrivateKey has private_bytes, not public_bytes).
pkey_pem = pkey.private_bytes(serialization.Encoding.PEM,
                              serialization.PrivateFormat.PKCS8,
                              serialization.NoEncryption())
|
||||
|
||||
csr_pem = crypto_util.make_csr(pkey_pem, [domain_name])
|
||||
return pkey_pem, csr_pem
|
||||
|
||||
@@ -201,8 +201,10 @@ def example_http():
|
||||
# Revoke certificate
|
||||
|
||||
fullchain_com = jose.ComparableX509(
|
||||
OpenSSL.crypto.load_certificate(
|
||||
OpenSSL.crypto.FILETYPE_PEM, fullchain_pem))
|
||||
OpenSSL.crypto.X509.from_cryptography(
|
||||
x509.load_pem_x509_certificate(fullchain_pem)
|
||||
)
|
||||
)
|
||||
|
||||
try:
|
||||
client_acme.revoke(fullchain_com, 0) # revocation reason = 0
|
||||
|
||||
@@ -3,17 +3,18 @@ import sys
|
||||
from setuptools import find_packages
|
||||
from setuptools import setup
|
||||
|
||||
version = '2.8.0.dev0'
|
||||
version = '3.2.0.dev0'
|
||||
|
||||
install_requires = [
|
||||
'cryptography>=3.2.1',
|
||||
'josepy>=1.13.0',
|
||||
# Josepy 2+ may introduce backward incompatible changes by dropping usage of
|
||||
# deprecated PyOpenSSL APIs.
|
||||
'josepy>=1.13.0, <2',
|
||||
# pyOpenSSL 23.1.0 is a bad release: https://github.com/pyca/pyopenssl/issues/1199
|
||||
'PyOpenSSL>=17.5.0,!=23.1.0',
|
||||
'pyrfc3339',
|
||||
'pytz>=2019.3',
|
||||
'requests>=2.20.0',
|
||||
'setuptools>=41.6.0',
|
||||
]
|
||||
|
||||
docs_extras = [
|
||||
@@ -22,15 +23,6 @@ docs_extras = [
|
||||
]
|
||||
|
||||
test_extras = [
|
||||
# In theory we could scope importlib_resources to env marker 'python_version<"3.9"'. But this
|
||||
# makes the pinning mechanism emit warnings when running `poetry lock` because in the corner
|
||||
# case of an extra dependency with env marker coming from a setup.py file, it generates the
|
||||
# invalid requirement 'importlib_resource>=1.3.1;python<=3.9;extra=="test"'.
|
||||
# To fix the issue, we do not pass the env marker. This is fine because:
|
||||
# - importlib_resources can be applied to any Python version,
|
||||
# - this is a "test" extra dependency for limited audience,
|
||||
# - it does not change anything at the end for the generated requirement files.
|
||||
'importlib_resources>=1.3.1',
|
||||
'pytest',
|
||||
'pytest-xdist',
|
||||
'typing-extensions',
|
||||
@@ -44,17 +36,17 @@ setup(
|
||||
author="Certbot Project",
|
||||
author_email='certbot-dev@eff.org',
|
||||
license='Apache License 2.0',
|
||||
python_requires='>=3.8',
|
||||
python_requires='>=3.9',
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Intended Audience :: Developers',
|
||||
'License :: OSI Approved :: Apache Software License',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.8',
|
||||
'Programming Language :: Python :: 3.9',
|
||||
'Programming Language :: Python :: 3.10',
|
||||
'Programming Language :: Python :: 3.11',
|
||||
'Programming Language :: Python :: 3.12',
|
||||
'Topic :: Internet :: WWW/HTTP',
|
||||
'Topic :: Security',
|
||||
],
|
||||
|
||||
@@ -2,10 +2,10 @@
|
||||
import atexit
|
||||
import binascii
|
||||
import fnmatch
|
||||
import importlib.resources
|
||||
import logging
|
||||
import re
|
||||
import subprocess
|
||||
import sys
|
||||
from contextlib import ExitStack
|
||||
from typing import Dict
|
||||
from typing import Iterable
|
||||
@@ -17,12 +17,6 @@ from certbot import errors
|
||||
from certbot import util
|
||||
from certbot.compat import os
|
||||
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@@ -257,6 +251,6 @@ def find_ssl_apache_conf(prefix: str) -> str:
|
||||
"""
|
||||
file_manager = ExitStack()
|
||||
atexit.register(file_manager.close)
|
||||
ref = importlib_resources.files("certbot_apache").joinpath(
|
||||
"_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix))
|
||||
return str(file_manager.enter_context(importlib_resources.as_file(ref)))
|
||||
ref = (importlib.resources.files("certbot_apache").joinpath("_internal")
|
||||
.joinpath("tls_configs").joinpath("{0}-options-ssl-apache.conf".format(prefix)))
|
||||
return str(file_manager.enter_context(importlib.resources.as_file(ref)))
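The importlib.resources pattern used here (and at the other call sites below) keeps the extracted file valid for the life of the process; roughly, and assuming the "current" prefix exists:

    # ExitStack + atexit keeps the as_file() extraction alive until interpreter
    # shutdown instead of only for the duration of a `with` block. The chained
    # single-argument joinpath() calls presumably keep compatibility with Python
    # versions whose Traversable.joinpath() accepts only one component.
    file_manager = ExitStack()
    atexit.register(file_manager.close)
    ref = (importlib.resources.files("certbot_apache").joinpath("_internal")
           .joinpath("tls_configs").joinpath("current-options-ssl-apache.conf"))
    conf_path = str(file_manager.enter_context(importlib.resources.as_file(ref)))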
|
||||
|
||||
@@ -295,7 +295,7 @@ class ApacheConfigurator(common.Configurator):
|
||||
try:
|
||||
with open(ssl_module_location, mode="rb") as f:
|
||||
contents = f.read()
|
||||
except IOError as error:
|
||||
except OSError as error:
|
||||
logger.debug(str(error), exc_info=True)
|
||||
return None
|
||||
return contents
|
||||
@@ -928,7 +928,7 @@ class ApacheConfigurator(common.Configurator):
|
||||
try:
|
||||
socket.inet_aton(addr.get_addr())
|
||||
return socket.gethostbyaddr(addr.get_addr())[0]
|
||||
except (socket.error, socket.herror, socket.timeout):
|
||||
except (OSError, socket.herror, socket.timeout):
|
||||
pass
|
||||
|
||||
return ""
|
||||
@@ -1523,7 +1523,7 @@ class ApacheConfigurator(common.Configurator):
|
||||
# activation (it's not included as default)
|
||||
if not self.parser.parsed_in_current(ssl_fp):
|
||||
self.parser.parse_file(ssl_fp)
|
||||
except IOError:
|
||||
except OSError:
|
||||
logger.critical("Error writing/reading to file in make_vhost_ssl", exc_info=True)
|
||||
raise errors.PluginError("Unable to write/read in make_vhost_ssl")
|
||||
|
||||
|
||||
@@ -1,14 +1,10 @@
|
||||
"""Apache plugin constants."""
|
||||
import atexit
|
||||
import sys
|
||||
import importlib.resources
|
||||
from contextlib import ExitStack
|
||||
from typing import Dict
|
||||
from typing import List
|
||||
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
|
||||
MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
|
||||
"""Name of the mod_ssl config file as saved
|
||||
@@ -46,8 +42,8 @@ def _generate_augeas_lens_dir_static() -> str:
|
||||
# Python process, and will be automatically cleaned up on exit.
|
||||
file_manager = ExitStack()
|
||||
atexit.register(file_manager.close)
|
||||
augeas_lens_dir_ref = importlib_resources.files("certbot_apache") / "_internal" / "augeas_lens"
|
||||
return str(file_manager.enter_context(importlib_resources.as_file(augeas_lens_dir_ref)))
|
||||
augeas_lens_dir_ref = importlib.resources.files("certbot_apache") / "_internal" / "augeas_lens"
|
||||
return str(file_manager.enter_context(importlib.resources.as_file(augeas_lens_dir_ref)))
|
||||
|
||||
AUGEAS_LENS_DIR = _generate_augeas_lens_dir_static()
|
||||
"""Path to the Augeas lens directory"""
|
||||
|
||||
@@ -4,6 +4,7 @@ from typing import Type
|
||||
|
||||
from certbot import util
|
||||
from certbot_apache._internal import configurator
|
||||
from certbot_apache._internal import override_alpine
|
||||
from certbot_apache._internal import override_arch
|
||||
from certbot_apache._internal import override_centos
|
||||
from certbot_apache._internal import override_darwin
|
||||
@@ -14,6 +15,7 @@ from certbot_apache._internal import override_suse
|
||||
from certbot_apache._internal import override_void
|
||||
|
||||
OVERRIDE_CLASSES: Dict[str, Type[configurator.ApacheConfigurator]] = {
|
||||
"alpine": override_alpine.AlpineConfigurator,
|
||||
"arch": override_arch.ArchConfigurator,
|
||||
"cloudlinux": override_centos.CentOSConfigurator,
|
||||
"darwin": override_darwin.DarwinConfigurator,
|
||||
|
||||
certbot-apache/certbot_apache/_internal/override_alpine.py (new file, 19 lines)
@@ -0,0 +1,19 @@
|
||||
""" Distribution specific override class for Alpine Linux """
|
||||
from certbot_apache._internal import configurator
|
||||
from certbot_apache._internal.configurator import OsOptions
|
||||
|
||||
|
||||
class AlpineConfigurator(configurator.ApacheConfigurator):
|
||||
"""Alpine Linux specific ApacheConfigurator override class"""
|
||||
|
||||
OS_DEFAULTS = OsOptions(
|
||||
server_root="/etc/apache2",
|
||||
vhost_root="/etc/apache2/conf.d",
|
||||
vhost_files="*.conf",
|
||||
logs_root="/var/log/apache2",
|
||||
ctl="apachectl",
|
||||
version_cmd=['apachectl', '-v'],
|
||||
restart_cmd=['apachectl', 'graceful'],
|
||||
conftest_cmd=['apachectl', 'configtest'],
|
||||
challenge_location="/etc/apache2/conf.d",
|
||||
)
|
||||
@@ -153,7 +153,7 @@ class ApacheParser:
|
||||
try:
|
||||
# This is a noop save
|
||||
self.aug.save()
|
||||
except (RuntimeError, IOError):
|
||||
except (OSError, RuntimeError):
|
||||
self._log_save_errors(ex_errs)
|
||||
# Erase Save Notes
|
||||
self.configurator.save_notes = ""
|
||||
@@ -198,7 +198,7 @@ class ApacheParser:
|
||||
ex_errs = self.aug.match("/augeas//error")
|
||||
try:
|
||||
self.aug.save()
|
||||
except IOError:
|
||||
except OSError:
|
||||
self._log_save_errors(ex_errs)
|
||||
raise
|
||||
|
||||
|
||||
@@ -14,7 +14,7 @@ SCRIPT_DIRNAME = os.path.dirname(__file__)
|
||||
|
||||
def main() -> int:
|
||||
args = sys.argv[1:]
|
||||
with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
|
||||
with acme_server.ACMEServer([], False) as acme_xdist:
|
||||
environ = os.environ.copy()
|
||||
environ['SERVER'] = acme_xdist['directory_url']
|
||||
command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
|
||||
|
||||
@@ -174,9 +174,9 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
assert "certbot.demo" in names
|
||||
|
||||
def test_get_bad_path(self):
|
||||
assert apache_util.get_file_path(None) == None
|
||||
assert apache_util.get_file_path("nonexistent") == None
|
||||
assert self.config._create_vhost("nonexistent") == None # pylint: disable=protected-access
|
||||
assert apache_util.get_file_path(None) is None
|
||||
assert apache_util.get_file_path("nonexistent") is None
|
||||
assert self.config._create_vhost("nonexistent") is None # pylint: disable=protected-access
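The switch to `is None` matters because `==` dispatches to a custom __eq__, which can make `== None` pass or fail misleadingly; a tiny illustration (not from the test suite):

    class AlwaysEqual:
        def __eq__(self, other):
            return True

    assert (AlwaysEqual() == None) is True   # comparison fooled by __eq__
    assert (AlwaysEqual() is None) is False  # identity check is unambiguous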
|
||||
|
||||
def test_get_aug_internal_path(self):
|
||||
from certbot_apache._internal.apache_util import get_internal_aug_path
|
||||
@@ -303,7 +303,7 @@ class MultipleVhostsTest(util.ApacheTest):
|
||||
# pylint: disable=protected-access
|
||||
assert self.vh_truth[3] == self.config._find_best_vhost("certbot.demo")
|
||||
assert self.vh_truth[0] == self.config._find_best_vhost("encryption-example.demo")
|
||||
assert self.config._find_best_vhost("does-not-exist.com") == None
|
||||
assert self.config._find_best_vhost("does-not-exist.com") is None
|
||||
|
||||
def test_find_best_vhost_variety(self):
|
||||
# pylint: disable=protected-access
|
||||
@@ -1417,7 +1417,7 @@ class AugeasVhostsTest(util.ApacheTest):
|
||||
self.config.parser.aug.match.side_effect = RuntimeError
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
|
||||
chosen_vhost = self.config._create_vhost(path)
|
||||
assert None == chosen_vhost
|
||||
assert chosen_vhost is None
|
||||
|
||||
def test_choosevhost_works(self):
|
||||
path = "debian_apache_2_4/augeas_vhosts/apache2/sites-available/old-and-default.conf"
|
||||
@@ -1518,16 +1518,11 @@ class MultiVhostsTest(util.ApacheTest):
|
||||
with_index_1 = ["/path[1]/section[1]"]
|
||||
without_index = ["/path/section"]
|
||||
with_index_2 = ["/path[2]/section[2]"]
|
||||
assert self.config._get_new_vh_path(without_index,
|
||||
with_index_1) == \
|
||||
None
|
||||
assert self.config._get_new_vh_path(without_index,
|
||||
with_index_2) == \
|
||||
with_index_2[0]
|
||||
assert self.config._get_new_vh_path(without_index, with_index_1) is None
|
||||
assert self.config._get_new_vh_path(without_index, with_index_2) == with_index_2[0]
|
||||
|
||||
both = with_index_1 + with_index_2
|
||||
assert self.config._get_new_vh_path(without_index, both) == \
|
||||
with_index_2[0]
|
||||
assert self.config._get_new_vh_path(without_index, both) == with_index_2[0]
|
||||
|
||||
@mock.patch("certbot_apache._internal.configurator.display_util.notify")
|
||||
def test_make_vhost_ssl_with_existing_rewrite_rule(self, mock_notify):
|
||||
@@ -1658,15 +1653,12 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
file has been manually edited by the user, and will refuse to update it.
|
||||
This test ensures that all necessary hashes are present.
|
||||
"""
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
import importlib.resources
|
||||
|
||||
from certbot_apache._internal.constants import ALL_SSL_OPTIONS_HASHES
|
||||
|
||||
ref = importlib_resources.files("certbot_apache") / "_internal" / "tls_configs"
|
||||
with importlib_resources.as_file(ref) as tls_configs_dir:
|
||||
ref = importlib.resources.files("certbot_apache") / "_internal" / "tls_configs"
|
||||
with importlib.resources.as_file(ref) as tls_configs_dir:
|
||||
all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
|
||||
if name.endswith('options-ssl-apache.conf')]
|
||||
assert len(all_files) >= 1
|
||||
@@ -1723,14 +1715,14 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
|
||||
self.config._openssl_version = None
|
||||
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
|
||||
assert self.config.openssl_version() == None
|
||||
assert self.config.openssl_version() is None
|
||||
assert "Could not find ssl_module" in mock_log.call_args[0][0]
|
||||
|
||||
# When no ssl_module is present at all
|
||||
self.config._openssl_version = None
|
||||
assert "ssl_module" not in self.config.parser.modules
|
||||
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
|
||||
assert self.config.openssl_version() == None
|
||||
assert self.config.openssl_version() is None
|
||||
assert "Could not find ssl_module" in mock_log.call_args[0][0]
|
||||
|
||||
# When ssl_module is statically linked but --apache-bin not provided
|
||||
@@ -1738,13 +1730,13 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
self.config.options.bin = None
|
||||
self.config.parser.modules['ssl_module'] = None
|
||||
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
|
||||
assert self.config.openssl_version() == None
|
||||
assert self.config.openssl_version() is None
|
||||
assert "ssl_module is statically linked but" in mock_log.call_args[0][0]
|
||||
|
||||
self.config.parser.modules['ssl_module'] = "/fake/path"
|
||||
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
|
||||
# Check that correct logger.warning was printed
|
||||
assert self.config.openssl_version() == None
|
||||
assert self.config.openssl_version() is None
|
||||
assert "Unable to read" in mock_log.call_args[0][0]
|
||||
|
||||
contents_missing_openssl = b"these contents won't match the regex"
|
||||
@@ -1753,7 +1745,7 @@ class InstallSslOptionsConfTest(util.ApacheTest):
|
||||
mock_omf.return_value = contents_missing_openssl
|
||||
with mock.patch("certbot_apache._internal.configurator.logger.warning") as mock_log:
|
||||
# Check that correct logger.warning was printed
|
||||
assert self.config.openssl_version() == None
|
||||
assert self.config.openssl_version() is None
|
||||
assert "Could not find OpenSSL" in mock_log.call_args[0][0]
|
||||
|
||||
def test_open_module_file(self):
|
||||
|
||||
@@ -1,7 +1,7 @@
|
||||
from setuptools import find_packages
|
||||
from setuptools import setup
|
||||
|
||||
version = '2.8.0.dev0'
|
||||
version = '3.2.0.dev0'
|
||||
|
||||
install_requires = [
|
||||
# We specify the minimum acme and certbot version as the current plugin
|
||||
@@ -9,9 +9,7 @@ install_requires = [
|
||||
# https://github.com/certbot/certbot/issues/8761 for more info.
|
||||
f'acme>={version}',
|
||||
f'certbot>={version}',
|
||||
'importlib_resources>=1.3.1; python_version < "3.9"',
|
||||
'python-augeas',
|
||||
'setuptools>=41.6.0',
|
||||
]
|
||||
|
||||
dev_extras = [
|
||||
@@ -30,7 +28,7 @@ setup(
|
||||
author="Certbot Project",
|
||||
author_email='certbot-dev@eff.org',
|
||||
license='Apache License 2.0',
|
||||
python_requires='>=3.8',
|
||||
python_requires='>=3.9',
|
||||
classifiers=[
|
||||
'Development Status :: 5 - Production/Stable',
|
||||
'Environment :: Plugins',
|
||||
@@ -39,10 +37,10 @@ setup(
|
||||
'Operating System :: POSIX :: Linux',
|
||||
'Programming Language :: Python',
|
||||
'Programming Language :: Python :: 3',
|
||||
'Programming Language :: Python :: 3.8',
|
||||
'Programming Language :: Python :: 3.9',
|
||||
'Programming Language :: Python :: 3.10',
|
||||
'Programming Language :: Python :: 3.11',
|
||||
'Programming Language :: Python :: 3.12',
|
||||
'Topic :: Internet :: WWW/HTTP',
|
||||
'Topic :: Security',
|
||||
'Topic :: System :: Installation/Setup',
|
||||
|
||||
@@ -1,4 +1,3 @@
|
||||
recursive-include certbot_integration_tests/assets *
|
||||
include certbot_integration_tests/py.typed
|
||||
include snap_integration_tests/py.typed
|
||||
include windows_installer_integration_tests/py.typed
|
||||
|
||||
@@ -1,5 +1,4 @@
|
||||
"""This module contains advanced assertions for the certbot integration tests."""
|
||||
import io
|
||||
import os
|
||||
from typing import Optional
|
||||
from typing import Type
|
||||
@@ -60,7 +59,7 @@ def assert_hook_execution(probe_path: str, probe_content: str) -> None:
|
||||
:param str probe_content: content expected when the hook is executed
|
||||
"""
|
||||
encoding = 'utf-8' if POSIX_MODE else 'utf-16'
|
||||
with io.open(probe_path, 'rt', encoding=encoding) as file:
|
||||
with open(probe_path, 'rt', encoding=encoding) as file:
|
||||
data = file.read()
|
||||
|
||||
lines = [line.strip() for line in data.splitlines()]
|
||||
@@ -76,7 +75,7 @@ def assert_saved_lineage_option(config_dir: str, lineage: str,
|
||||
:param str option: the option key
|
||||
:param value: if desired, the expected option value
|
||||
"""
|
||||
with open(os.path.join(config_dir, 'renewal', '{0}.conf'.format(lineage))) as file_h:
|
||||
with open(os.path.join(config_dir, 'renewal', f'{lineage}.conf')) as file_h:
|
||||
assert f"{option} = {value if value else ''}" in file_h.read()
|
||||
|
||||
|
||||
|
||||
@@ -23,7 +23,6 @@ class IntegrationTestsContext:
|
||||
self.worker_id = 'primary'
|
||||
acme_xdist = request.config.acme_xdist # type: ignore[attr-defined]
|
||||
|
||||
self.acme_server = acme_xdist['acme_server']
|
||||
self.directory_url = acme_xdist['directory_url']
|
||||
self.tls_alpn_01_port = acme_xdist['https_port'][self.worker_id]
|
||||
self.http_01_port = acme_xdist['http_port'][self.worker_id]
|
||||
|
||||
@@ -7,7 +7,6 @@ import shutil
|
||||
import subprocess
|
||||
import time
|
||||
from typing import Generator
|
||||
from typing import Iterable
|
||||
from typing import Tuple
|
||||
from typing import Type
|
||||
|
||||
@@ -82,11 +81,9 @@ def test_registration_override(context: IntegrationTestsContext) -> None:
|
||||
context.certbot(['update_account', '--email', 'ex1@domain.org,ex2@domain.org'])
|
||||
stdout2, _ = context.certbot(['show_account'])
|
||||
|
||||
# https://github.com/letsencrypt/boulder/issues/6144
|
||||
if context.acme_server != 'boulder-v2':
|
||||
assert 'example@domain.org' in stdout1, "New email should be present"
|
||||
assert 'example@domain.org' not in stdout2, "Old email should not be present"
|
||||
assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
|
||||
assert 'example@domain.org' in stdout1, "New email should be present"
|
||||
assert 'example@domain.org' not in stdout2, "Old email should not be present"
|
||||
assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
|
||||
|
||||
|
||||
def test_prepare_plugins(context: IntegrationTestsContext) -> None:
|
||||
@@ -566,19 +563,15 @@ def test_default_rsa_size(context: IntegrationTestsContext) -> None:
|
||||
assert_rsa_key(key1, 2048)
|
||||
|
||||
|
||||
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
|
||||
@pytest.mark.parametrize('curve,curve_cls', [
|
||||
# Curve name, Curve class, ACME servers to skip
|
||||
('secp256r1', SECP256R1, []),
|
||||
('secp384r1', SECP384R1, []),
|
||||
('secp521r1', SECP521R1, ['boulder-v2'])]
|
||||
('secp256r1', SECP256R1),
|
||||
('secp384r1', SECP384R1),
|
||||
('secp521r1', SECP521R1)]
|
||||
)
|
||||
def test_ecdsa_curves(context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
|
||||
skip_servers: Iterable[str]) -> None:
|
||||
def test_ecdsa_curves(context: IntegrationTestsContext, curve: str,
|
||||
curve_cls: Type[EllipticCurve]) -> None:
|
||||
"""Test issuance for each supported ECDSA curve"""
|
||||
if context.acme_server in skip_servers:
|
||||
pytest.skip('ACME server {} does not support ECDSA curve {}'
|
||||
.format(context.acme_server, curve))
|
||||
|
||||
domain = context.get_domain('curve')
|
||||
context.certbot([
|
||||
'certonly',
|
||||
@@ -640,9 +633,6 @@ def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:
|
||||
|
||||
def test_ocsp_must_staple(context: IntegrationTestsContext) -> None:
|
||||
"""Test that OCSP Must-Staple is correctly set in the generated certificate."""
|
||||
if context.acme_server == 'pebble':
|
||||
pytest.skip('Pebble does not support OCSP Must-Staple.')
|
||||
|
||||
certname = context.get_domain('must-staple')
|
||||
context.certbot(['auth', '--must-staple', '--domains', certname])
|
||||
|
||||
@@ -710,17 +700,14 @@ def test_revoke_and_unregister(context: IntegrationTestsContext) -> None:
|
||||
assert cert3 in stdout
|
||||
|
||||
|
||||
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
|
||||
('secp256r1', SECP256R1, []),
|
||||
('secp384r1', SECP384R1, []),
|
||||
('secp521r1', SECP521R1, ['boulder-v2'])]
|
||||
@pytest.mark.parametrize('curve,curve_cls', [
|
||||
('secp256r1', SECP256R1),
|
||||
('secp384r1', SECP384R1),
|
||||
('secp521r1', SECP521R1)]
|
||||
)
|
||||
def test_revoke_ecdsa_cert_key(
|
||||
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
|
||||
skip_servers: Iterable[str]) -> None:
|
||||
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
|
||||
"""Test revoking a certificate """
|
||||
if context.acme_server in skip_servers:
|
||||
pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
|
||||
cert: str = context.get_domain('curve')
|
||||
context.certbot([
|
||||
'certonly',
|
||||
@@ -738,17 +725,14 @@ def test_revoke_ecdsa_cert_key(
|
||||
assert stdout.count('INVALID: REVOKED') == 1, 'Expected {0} to be REVOKED'.format(cert)
|
||||
|
||||
|
||||
@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
|
||||
('secp256r1', SECP256R1, []),
|
||||
('secp384r1', SECP384R1, []),
|
||||
('secp521r1', SECP521R1, ['boulder-v2'])]
|
||||
@pytest.mark.parametrize('curve,curve_cls', [
|
||||
('secp256r1', SECP256R1),
|
||||
('secp384r1', SECP384R1),
|
||||
('secp521r1', SECP521R1)]
|
||||
)
|
||||
def test_revoke_ecdsa_cert_key_delete(
|
||||
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
|
||||
skip_servers: Iterable[str]) -> None:
|
||||
context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
|
||||
"""Test revoke and deletion for each supported curve type"""
|
||||
if context.acme_server in skip_servers:
|
||||
pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
|
||||
cert: str = context.get_domain('curve')
|
||||
context.certbot([
|
||||
'certonly',
|
||||
@@ -913,7 +897,7 @@ def test_dry_run_deactivate_authzs(context: IntegrationTestsContext) -> None:
|
||||
def test_preferred_chain(context: IntegrationTestsContext) -> None:
|
||||
"""Test that --preferred-chain results in the correct chain.pem being produced"""
|
||||
try:
|
||||
issuers = misc.get_acme_issuers(context)
|
||||
issuers = misc.get_acme_issuers()
|
||||
except NotImplementedError:
|
||||
pytest.skip('This ACME server does not support alternative issuers.')
|
||||
|
||||
|
||||
@@ -8,7 +8,6 @@ for a directory a specific configuration using built-in pytest hooks.
|
||||
See https://docs.pytest.org/en/latest/reference.html#hook-reference
|
||||
"""
|
||||
import contextlib
|
||||
import subprocess
|
||||
import sys
|
||||
|
||||
from certbot_integration_tests.utils import acme_server as acme_lib
|
||||
@@ -20,10 +19,6 @@ def pytest_addoption(parser):
|
||||
Standard pytest hook to add options to the pytest parser.
|
||||
:param parser: current pytest parser that will be used on the CLI
|
||||
"""
|
||||
parser.addoption('--acme-server', default='pebble',
|
||||
choices=['boulder-v2', 'pebble'],
|
||||
help='select the ACME server to use (boulder-v2, pebble), '
|
||||
'defaulting to pebble')
|
||||
parser.addoption('--dns-server', default='challtestsrv',
|
||||
choices=['bind', 'challtestsrv'],
|
||||
help='select the DNS server to use (bind, challtestsrv), '
|
||||
@@ -69,7 +64,7 @@ def _setup_primary_node(config):
|
||||
Setup the environment for integration tests.
|
||||
|
||||
This function will:
|
||||
- check runtime compatibility (Docker, docker-compose, Nginx)
|
||||
- check runtime compatibility (Docker, docker compose, Nginx)
|
||||
- create a temporary workspace and the persistent GIT repositories space
|
||||
- configure and start a DNS server using Docker, if configured
|
||||
- configure and start paralleled ACME CA servers using Docker
|
||||
@@ -80,22 +75,6 @@ def _setup_primary_node(config):
|
||||
|
||||
:param config: Configuration of the pytest primary node. Is modified by this function.
|
||||
"""
|
||||
# Check for runtime compatibility: some tools are required to be available in PATH
|
||||
if 'boulder' in config.option.acme_server:
|
||||
try:
|
||||
subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError('Error: docker is required in PATH to launch the integration tests on'
|
||||
'boulder, but is not installed or not available for current user.')
|
||||
|
||||
try:
|
||||
subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
|
||||
except (subprocess.CalledProcessError, OSError):
|
||||
raise ValueError(
|
||||
'Error: docker-compose is required in PATH to launch the integration tests, '
|
||||
'but is not installed or not available for current user.'
|
||||
)
|
||||
|
||||
# Parameter numprocesses is added to option by pytest-xdist
|
||||
workers = ['primary'] if not config.option.numprocesses\
|
||||
else ['gw{0}'.format(i) for i in range(config.option.numprocesses)]
|
||||
@@ -115,8 +94,7 @@ def _setup_primary_node(config):
|
||||
|
||||
# By calling setup_acme_server we ensure that all necessary acme server instances will be
|
||||
# fully started. This runtime is reflected by the acme_xdist returned.
|
||||
acme_server = acme_lib.ACMEServer(config.option.acme_server, workers,
|
||||
dns_server=acme_dns_server)
|
||||
acme_server = acme_lib.ACMEServer(workers, dns_server=acme_dns_server)
|
||||
config.add_cleanup(acme_server.stop)
|
||||
print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
|
||||
acme_server.start()
|
||||
|
||||
@@ -2,15 +2,10 @@
|
||||
"""General purpose nginx test configuration generator."""
|
||||
import atexit
|
||||
import getpass
|
||||
import sys
|
||||
import importlib.resources
|
||||
from contextlib import ExitStack
|
||||
from typing import Optional
|
||||
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
|
||||
|
||||
def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int, https_port: int,
|
||||
other_port: int, default_server: bool, key_path: Optional[str] = None,
|
||||
@@ -32,14 +27,16 @@ def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int,
|
||||
if not key_path:
|
||||
file_manager = ExitStack()
|
||||
atexit.register(file_manager.close)
|
||||
ref = importlib_resources.files('certbot_integration_tests').joinpath('assets', 'key.pem')
|
||||
key_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))
|
||||
ref = (importlib.resources.files('certbot_integration_tests').joinpath('assets')
|
||||
.joinpath('key.pem'))
|
||||
key_path = str(file_manager.enter_context(importlib.resources.as_file(ref)))
|
||||
|
||||
if not cert_path:
|
||||
file_manager = ExitStack()
|
||||
atexit.register(file_manager.close)
|
||||
ref = importlib_resources.files('certbot_integration_tests').joinpath('assets', 'cert.pem')
|
||||
cert_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))
|
||||
ref = (importlib.resources.files('certbot_integration_tests').joinpath('assets')
|
||||
.joinpath('cert.pem'))
|
||||
cert_path = str(file_manager.enter_context(importlib.resources.as_file(ref)))
|
||||
|
||||
return '''\
|
||||
# This error log will be written regardless of server scope error_log
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
"""Module to handle the context of RFC2136 integration tests."""
|
||||
from contextlib import contextmanager
|
||||
import sys
|
||||
import importlib.resources
|
||||
import tempfile
|
||||
from typing import Generator
|
||||
from typing import Iterable
|
||||
@@ -11,11 +11,6 @@ import pytest
|
||||
from certbot_integration_tests.certbot_tests import context as certbot_context
|
||||
from certbot_integration_tests.utils import certbot_call
|
||||
|
||||
if sys.version_info >= (3, 9): # pragma: no cover
|
||||
import importlib.resources as importlib_resources
|
||||
else: # pragma: no cover
|
||||
import importlib_resources
|
||||
|
||||
|
||||
class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
|
||||
"""Integration test context for certbot-dns-rfc2136"""
|
||||
@@ -48,10 +43,9 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
|
||||
:yields: Path to credentials file
|
||||
:rtype: str
|
||||
"""
|
||||
src_ref_file = importlib_resources.files('certbot_integration_tests').joinpath(
|
||||
'assets', 'bind-config', f'rfc2136-credentials-{label}.ini.tpl'
|
||||
)
|
||||
with importlib_resources.as_file(src_ref_file) as src_file:
|
||||
src_ref_file = (importlib.resources.files('certbot_integration_tests').joinpath('assets')
|
||||
.joinpath('bind-config').joinpath(f'rfc2136-credentials-{label}.ini.tpl'))
|
||||
with importlib.resources.as_file(src_ref_file) as src_file:
|
||||
with open(src_file, 'r') as f:
|
||||
contents = f.read().format(
|
||||
server_address=self._dns_xdist['address'],
|
||||
|
||||
@@ -5,7 +5,6 @@ import argparse
|
||||
import errno
|
||||
import json
|
||||
import os
|
||||
from os.path import join
|
||||
import shutil
|
||||
import subprocess
|
||||
import sys
|
||||
@@ -18,14 +17,12 @@ from typing import Dict
|
||||
from typing import List
|
||||
from typing import Mapping
|
||||
from typing import Optional
|
||||
from typing import Tuple
|
||||
from typing import Type
|
||||
|
||||
import requests
|
||||
|
||||
# pylint: disable=wildcard-import,unused-wildcard-import
|
||||
from certbot_integration_tests.utils import misc
|
||||
from certbot_integration_tests.utils import pebble_artifacts
|
||||
from certbot_integration_tests.utils import pebble_ocsp_server
|
||||
from certbot_integration_tests.utils import proxy
|
||||
from certbot_integration_tests.utils.constants import *
|
||||
|
||||
@@ -42,34 +39,30 @@ class ACMEServer:
|
||||
ACMEServer is also a context manager, and so can be used to ensure ACME server is
|
||||
started/stopped upon context enter/exit.
|
||||
"""
|
||||
def __init__(self, acme_server: str, nodes: List[str], http_proxy: bool = True,
|
||||
def __init__(self, nodes: List[str], http_proxy: bool = True,
|
||||
stdout: bool = False, dns_server: Optional[str] = None,
|
||||
http_01_port: Optional[int] = None) -> None:
|
||||
"""
|
||||
Create an ACMEServer instance.
|
||||
:param str acme_server: the type of acme server used (boulder-v2 or pebble)
|
||||
:param list nodes: list of node names that will be setup by pytest xdist
|
||||
:param bool http_proxy: if False do not start the HTTP proxy
|
||||
:param bool stdout: if True stream all subprocesses stdout to standard stdout
|
||||
:param str dns_server: if set, Pebble/Boulder will use it to resolve domains
|
||||
:param str dns_server: if set, Pebble will use it to resolve domains
|
||||
:param int http_01_port: port to use for http-01 validation; currently
|
||||
only supported for pebble without an HTTP proxy
|
||||
"""
|
||||
self._construct_acme_xdist(acme_server, nodes)
|
||||
self._construct_acme_xdist(nodes)
|
||||
|
||||
self._acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
|
||||
self._proxy = http_proxy
|
||||
self._workspace = tempfile.mkdtemp()
|
||||
self._processes: List[subprocess.Popen] = []
|
||||
self._stdout = sys.stdout if stdout else open(os.devnull, 'w') # pylint: disable=consider-using-with
|
||||
self._dns_server = dns_server
|
||||
self._preterminate_cmds_args: List[Tuple[Tuple[Any, ...], Dict[str, Any]]] = []
|
||||
self._http_01_port = BOULDER_HTTP_01_PORT if self._acme_type == 'boulder' \
|
||||
else DEFAULT_HTTP_01_PORT
|
||||
self._http_01_port = DEFAULT_HTTP_01_PORT
|
||||
if http_01_port:
|
||||
if (self._acme_type == 'pebble' and self._proxy) or self._acme_type == 'boulder':
|
||||
if self._proxy:
|
||||
raise ValueError('Setting http_01_port is not currently supported when '
|
||||
'using Boulder or the HTTP proxy')
|
||||
'using the HTTP proxy')
|
||||
self._http_01_port = http_01_port
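A minimal sketch of driving the simplified server, mirroring the apache-conf-test wrapper change earlier in this diff (ports and output are Pebble defaults):

    # Pebble only now: no acme_server type argument, HTTP proxy disabled.
    with ACMEServer([], http_proxy=False, stdout=True) as acme_xdist:
        print(acme_xdist['directory_url'])  # https://localhost:14000/dir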
|
||||
|
||||
def start(self) -> None:
|
||||
@@ -77,10 +70,7 @@ class ACMEServer:
|
||||
try:
|
||||
if self._proxy:
|
||||
self._prepare_http_proxy()
|
||||
if self._acme_type == 'pebble':
|
||||
self._prepare_pebble_server()
|
||||
if self._acme_type == 'boulder':
|
||||
self._prepare_boulder_server()
|
||||
self._prepare_pebble_server()
|
||||
except BaseException as e:
|
||||
self.stop()
|
||||
raise e
|
||||
@@ -89,7 +79,6 @@ class ACMEServer:
|
||||
"""Stop the test stack, and clean its resources"""
|
||||
print('=> Tear down the test infrastructure...')
|
||||
try:
|
||||
self._run_preterminate_cmds()
|
||||
for process in self._processes:
|
||||
try:
|
||||
process.terminate()
|
||||
@@ -115,19 +104,14 @@ class ACMEServer:
|
||||
traceback: Optional[TracebackType]) -> None:
|
||||
self.stop()
|
||||
|
||||
def _construct_acme_xdist(self, acme_server: str, nodes: List[str]) -> None:
|
||||
def _construct_acme_xdist(self, nodes: List[str]) -> None:
|
||||
"""Generate and return the acme_xdist dict"""
|
||||
acme_xdist: Dict[str, Any] = {'acme_server': acme_server}
|
||||
acme_xdist: Dict[str, Any] = {}
|
||||
|
||||
# Directory and ACME port are set implicitly in the docker-compose.yml
|
||||
# files of Boulder/Pebble.
|
||||
if acme_server == 'pebble':
|
||||
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
|
||||
acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
|
||||
else: # boulder
|
||||
acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL
|
||||
acme_xdist['challtestsrv_url'] = BOULDER_V2_CHALLTESTSRV_URL
|
||||
|
||||
# files of Pebble.
|
||||
acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
|
||||
acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
|
||||
acme_xdist['http_port'] = dict(zip(nodes, range(5200, 5200 + len(nodes))))
|
||||
acme_xdist['https_port'] = dict(zip(nodes, range(5100, 5100 + len(nodes))))
|
||||
acme_xdist['other_port'] = dict(zip(nodes, range(5300, 5300 + len(nodes))))
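Worked example of the resulting mapping for two xdist workers (values follow directly from the ranges above):

    nodes = ['gw0', 'gw1']
    # http_port  -> {'gw0': 5200, 'gw1': 5201}
    # https_port -> {'gw0': 5100, 'gw1': 5101}
    # other_port -> {'gw0': 5300, 'gw1': 5301}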
|
||||
@@ -161,11 +145,6 @@ class ACMEServer:
|
||||
[pebble_path, '-config', pebble_config_path, '-dnsserver', dns_server, '-strict'],
|
||||
env=environ)
|
||||
|
||||
# pebble_ocsp_server is imported here and not at the top of module in order to avoid a
|
||||
# useless ImportError, in the case where cryptography dependency is too old to support
|
||||
# ocsp, but Boulder is used instead of Pebble, so pebble_ocsp_server is not used. This is
|
||||
# the typical situation of integration-certbot-oldest tox testenv.
|
||||
from certbot_integration_tests.utils import pebble_ocsp_server
|
||||
self._launch_process([sys.executable, pebble_ocsp_server.__file__])
|
||||
|
||||
# Wait for the ACME CA server to be up.
|
||||
@@ -174,68 +153,6 @@ class ACMEServer:
|
||||
|
||||
print('=> Finished pebble instance deployment.')
|
||||
|
||||
def _prepare_boulder_server(self) -> None:
|
||||
"""Configure and launch the Boulder server"""
|
||||
print('=> Starting boulder instance deployment...')
|
||||
instance_path = join(self._workspace, 'boulder')
|
||||
|
||||
# Load Boulder from git, that includes a docker-compose.yml ready for production.
|
||||
process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
|
||||
'--single-branch', '--depth=1', instance_path])
|
||||
process.wait(MAX_SUBPROCESS_WAIT)
|
||||
|
||||
# Allow Boulder to ignore usual limit rate policies, useful for tests.
|
||||
os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
|
||||
join(instance_path, 'test/rate-limit-policies.yml'))
|
||||
|
||||
if self._dns_server:
|
||||
# Change Boulder config to use the provided DNS server
|
||||
for suffix in ["", "-remote-a", "-remote-b"]:
|
||||
with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'r') as f:
|
||||
config = json.loads(f.read())
|
||||
config['va']['dnsResolvers'] = [self._dns_server]
|
||||
with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'w') as f:
|
||||
f.write(json.dumps(config, indent=2, separators=(',', ': ')))
|
||||
|
||||
# This command needs to be run before we try and terminate running processes because
|
||||
# docker-compose up doesn't always respond to SIGTERM. See
|
||||
# https://github.com/certbot/certbot/pull/9435.
|
||||
self._register_preterminate_cmd(['docker-compose', 'down'], cwd=instance_path)
|
||||
# Boulder docker generates build artifacts owned by root with 0o744 permissions.
|
||||
# If we started the acme server from a normal user that has access to the Docker
|
||||
# daemon, this user will not be able to delete these artifacts from the host.
|
||||
# We need to do it through a docker.
|
||||
self._register_preterminate_cmd(['docker', 'run', '--rm', '-v',
|
||||
'{0}:/workspace'.format(self._workspace), 'alpine', 'rm',
|
||||
'-rf', '/workspace/boulder'])
|
||||
try:
|
||||
# Launch the Boulder server
|
||||
self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
|
||||
|
||||
# Wait for the ACME CA server to be up.
|
||||
print('=> Waiting for boulder instance to respond...')
|
||||
misc.check_until_timeout(
|
||||
self.acme_xdist['directory_url'], attempts=300)
|
||||
|
||||
if not self._dns_server:
|
||||
# Configure challtestsrv to answer any A record request with ip of the docker host.
|
||||
response = requests.post(
|
||||
f'{BOULDER_V2_CHALLTESTSRV_URL}/set-default-ipv4',
|
||||
json={'ip': '10.77.77.1'},
|
||||
timeout=10
|
||||
)
|
||||
response.raise_for_status()
|
||||
except BaseException:
|
||||
# If we failed to set up boulder, print its logs.
|
||||
print('=> Boulder setup failed. Boulder logs are:')
|
||||
process = self._launch_process([
|
||||
'docker-compose', 'logs'], cwd=instance_path, force_stderr=True
|
||||
)
|
||||
process.wait(MAX_SUBPROCESS_WAIT)
|
||||
raise
|
||||
|
||||
print('=> Finished boulder instance deployment.')
|
||||
|
||||
def _prepare_http_proxy(self) -> None:
|
||||
"""Configure and launch an HTTP proxy"""
|
||||
print(f'=> Configuring the HTTP proxy on port {self._http_01_port}...')
|
||||
@@ -260,26 +177,11 @@ class ACMEServer:
|
||||
self._processes.append(process)
|
||||
return process
|
||||
|
||||
def _register_preterminate_cmd(self, *args: Any, **kwargs: Any) -> None:
|
||||
self._preterminate_cmds_args.append((args, kwargs))
|
||||
|
||||
def _run_preterminate_cmds(self) -> None:
|
||||
for args, kwargs in self._preterminate_cmds_args:
|
||||
process = self._launch_process(*args, **kwargs)
|
||||
process.wait(MAX_SUBPROCESS_WAIT)
|
||||
# It's unlikely to matter, but let's clear the list of cleanup commands
|
||||
# once they've been run.
|
||||
self._preterminate_cmds_args.clear()
|
||||
|
||||
|
||||
def main() -> None:
|
||||
# pylint: disable=missing-function-docstring
|
||||
parser = argparse.ArgumentParser(
|
||||
description='CLI tool to start a local instance of Pebble or Boulder CA server.')
|
||||
parser.add_argument('--server-type', '-s',
|
||||
choices=['pebble', 'boulder-v2'], default='pebble',
|
||||
help='type of CA server to start: can be Pebble or Boulder. '
|
||||
'Pebble is used if not set.')
|
||||
description='CLI tool to start a local instance of Pebble CA server.')
|
||||
parser.add_argument('--dns-server', '-d',
|
||||
help='specify the DNS server as `IP:PORT` to use by '
|
||||
'Pebble; if not specified, a local mock DNS server will be used to '
|
||||
@@ -290,8 +192,8 @@ def main() -> None:
|
||||
args = parser.parse_args()
|
||||
|
||||
acme_server = ACMEServer(
|
||||
args.server_type, [], http_proxy=False, stdout=True,
|
||||
dns_server=args.dns_server, http_01_port=args.http_01_port,
|
||||
[], http_proxy=False, stdout=True, dns_server=args.dns_server,
|
||||
http_01_port=args.http_01_port,
|
||||
)
|
||||
|
||||
try:
|
||||
|
||||
@@ -51,7 +51,7 @@ def _prepare_environ(workspace: str) -> Dict[str, str]:
|
||||
|
||||
# So, pytest is nice, and a little too nice for our usage.
|
||||
# In order to help user to call seamlessly any piece of python code without requiring to
|
||||
# install it as a full-fledged setuptools distribution for instance, it may inject the path
|
||||
# install it as a full-fledged Python package for instance, it may inject the path
|
||||
# to the test files into the PYTHONPATH. This allows the python interpreter to import
|
||||
# as modules any python file available at this path.
|
||||
# See https://docs.pytest.org/en/3.2.5/pythonpath.html for the explanation and description.
|
||||
@@ -96,7 +96,6 @@ def _prepare_args_env(certbot_args: List[str], directory_url: str, http_01_port:
|
||||
'--no-verify-ssl',
|
||||
'--http-01-port', str(http_01_port),
|
||||
'--https-port', str(tls_alpn_01_port),
|
||||
'--manual-public-ip-logging-ok',
|
||||
'--config-dir', config_dir,
|
||||
'--work-dir', os.path.join(workspace, 'work'),
|
||||
'--logs-dir', os.path.join(workspace, 'logs'),
|
||||
|
||||
@@ -1,10 +1,7 @@
|
||||
"""Some useful constants to use throughout certbot-ci integration tests"""
|
||||
DEFAULT_HTTP_01_PORT = 5002
|
||||
BOULDER_HTTP_01_PORT = 80
|
||||
TLS_ALPN_01_PORT = 5001
|
||||
CHALLTESTSRV_PORT = 8055
|
||||
BOULDER_V2_CHALLTESTSRV_URL = f'http://10.77.77.77:{CHALLTESTSRV_PORT}'
|
||||
BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
|
||||
PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
|
||||
PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'
|
||||
PEBBLE_CHALLTESTSRV_URL = f'http://localhost:{CHALLTESTSRV_PORT}'
|
||||
|
||||
@@ -1,5 +1,6 @@
#!/usr/bin/env python
"""Module to setup an RFC2136-capable DNS server"""
import importlib.resources
import os
import os.path
import shutil
@@ -17,12 +18,7 @@ from typing import Type

from certbot_integration_tests.utils import constants

if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources

BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.16"
BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.20"
BIND_BIND_ADDRESS = ("127.0.0.1", 45953)

# A TCP DNS message which is a query for '. CH A' transaction ID 0xcb37. This is used
@@ -83,8 +79,8 @@ class DNSServer:

def _configure_bind(self) -> None:
"""Configure the BIND9 server based on the prebaked configuration"""
ref = importlib_resources.files("certbot_integration_tests") / "assets" / "bind-config"
with importlib_resources.as_file(ref) as path:
ref = importlib.resources.files("certbot_integration_tests") / "assets" / "bind-config"
with importlib.resources.as_file(ref) as path:
for directory in ("conf", "zones"):
shutil.copytree(
os.path.join(path, directory), os.path.join(self.bind_root, directory)
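The hunks above swap the conditional `importlib_resources` backport for the standard library module, which is available once Python 3.9 is the floor. A minimal, self-contained sketch of the pattern now used (the destination path below is hypothetical and only illustrates the call shape):

```python
# Sketch of the stdlib importlib.resources pattern adopted above; the
# destination directory is hypothetical.
import importlib.resources
import shutil

ref = importlib.resources.files("certbot_integration_tests") / "assets" / "bind-config"
with importlib.resources.as_file(ref) as path:
    # as_file() guarantees a real on-disk path for the duration of the block,
    # even if the package data ships inside a zip or wheel.
    shutil.copytree(path / "conf", "/tmp/bind-conf")
```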
@@ -7,6 +7,7 @@ import contextlib
import errno
import functools
import http.server as SimpleHTTPServer
import importlib.resources
import os
import re
import shutil
@@ -21,10 +22,13 @@ from typing import Iterable
from typing import List
from typing import Optional
from typing import Tuple
import warnings
from typing import Union

from cryptography import x509
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import Encoding
from cryptography.hazmat.primitives.serialization import NoEncryption
from cryptography.hazmat.primitives.serialization import PrivateFormat
@@ -33,15 +37,9 @@ from cryptography.x509 import load_pem_x509_certificate
from OpenSSL import crypto
import requests

from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext
from certbot_integration_tests.utils.constants import PEBBLE_ALTERNATE_ROOTS
from certbot_integration_tests.utils.constants import PEBBLE_MANAGEMENT_URL

if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources

RSA_KEY_TYPE = 'rsa'
ECDSA_KEY_TYPE = 'ecdsa'

@@ -125,9 +123,9 @@ def generate_test_file_hooks(config_dir: str, hook_probe: str) -> None:
"""
file_manager = contextlib.ExitStack()
atexit.register(file_manager.close)
hook_path_ref = importlib_resources.files('certbot_integration_tests').joinpath(
'assets', 'hook.py')
hook_path = str(file_manager.enter_context(importlib_resources.as_file(hook_path_ref)))
hook_path_ref = (importlib.resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('hook.py'))
hook_path = str(file_manager.enter_context(importlib.resources.as_file(hook_path_ref)))

for hook_dir in list_renewal_hooks_dirs(config_dir):
# We want an equivalent of bash `chmod -p $HOOK_DIR, that does not fail if one folder of
@@ -200,8 +198,9 @@ shutil.rmtree(well_known)
shutil.rmtree(tempdir)


def generate_csr(domains: Iterable[str], key_path: str, csr_path: str,
key_type: str = RSA_KEY_TYPE) -> None:
def generate_csr(
domains: Iterable[str], key_path: str, csr_path: str, key_type: str = RSA_KEY_TYPE
) -> None:
"""
Generate a private key, and a CSR for the given domains using this key.
:param domains: the domain names to include in the CSR
@@ -210,35 +209,33 @@ def generate_csr(domains: Iterable[str], key_path: str, csr_path: str,
:param str csr_path: path to the CSR that will be generated
:param str key_type: type of the key (misc.RSA_KEY_TYPE or misc.ECDSA_KEY_TYPE)
"""
key: Union[rsa.RSAPrivateKey, ec.EllipticCurvePrivateKey]
if key_type == RSA_KEY_TYPE:
key = crypto.PKey()
key.generate_key(crypto.TYPE_RSA, 2048)
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
elif key_type == ECDSA_KEY_TYPE:
with warnings.catch_warnings():
# Ignore a warning on some old versions of cryptography
warnings.simplefilter('ignore', category=PendingDeprecationWarning)
_key = ec.generate_private_key(ec.SECP384R1(), default_backend())
_bytes = _key.private_bytes(encoding=Encoding.PEM,
format=PrivateFormat.TraditionalOpenSSL,
encryption_algorithm=NoEncryption())
key = crypto.load_privatekey(crypto.FILETYPE_PEM, _bytes)
key = ec.generate_private_key(ec.SECP384R1())
else:
raise ValueError('Invalid key type: {0}'.format(key_type))
raise ValueError("Invalid key type: {0}".format(key_type))

with open(key_path, 'wb') as file_h:
file_h.write(crypto.dump_privatekey(crypto.FILETYPE_PEM, key))
with open(key_path, "wb") as file_h:
file_h.write(
key.private_bytes(
Encoding.PEM, PrivateFormat.TraditionalOpenSSL, NoEncryption()
)
)

req = crypto.X509Req()
san = ', '.join('DNS:{0}'.format(item) for item in domains)
san_constraint = crypto.X509Extension(b'subjectAltName', False, san.encode('utf-8'))
req.add_extensions([san_constraint])
csr = (
x509.CertificateSigningRequestBuilder()
.subject_name(x509.Name([]))
.add_extension(
x509.SubjectAlternativeName([x509.DNSName(d) for d in domains]),
critical=False,
)
.sign(key, hashes.SHA256())
)

req.set_pubkey(key)
req.set_version(0)
req.sign(key, 'sha256')

with open(csr_path, 'wb') as file_h:
file_h.write(crypto.dump_certificate_request(crypto.FILETYPE_ASN1, req))
with open(csr_path, "wb") as file_h:
file_h.write(csr.public_bytes(Encoding.DER))


def read_certificate(cert_path: str) -> str:
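For readers less familiar with the `cryptography` API that replaces pyOpenSSL in the hunk above, a condensed, standalone version of the new SAN-only CSR flow looks like this (the domain name and key size are illustrative, not taken from the diff):

```python
# Standalone sketch of the cryptography-based CSR generation shown above.
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.hazmat.primitives.serialization import Encoding

key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
csr = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(x509.Name([]))  # empty subject; identities go in the SAN extension
    .add_extension(
        x509.SubjectAlternativeName([x509.DNSName("example.org")]),
        critical=False,
    )
    .sign(key, hashes.SHA256())
)
der = csr.public_bytes(Encoding.DER)  # same DER output the integration tests write to disk
```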
@@ -248,11 +245,13 @@ def read_certificate(cert_path: str) -> str:
:param str cert_path: the path to the certificate
:returns: the TEXT version of the certificate, as it would be displayed by openssl binary
"""
with open(cert_path, 'rb') as file:
with open(cert_path, "rb") as file:
data = file.read()

cert = crypto.load_certificate(crypto.FILETYPE_PEM, data)
return crypto.dump_certificate(crypto.FILETYPE_TEXT, cert).decode('utf-8')
cert = x509.load_pem_x509_certificate(data)
return crypto.dump_certificate(
crypto.FILETYPE_TEXT, crypto.X509.from_cryptography(cert)
).decode("utf-8")


def load_sample_data_path(workspace: str) -> str:
@@ -262,10 +261,9 @@ def load_sample_data_path(workspace: str) -> str:
:returns: the path to the loaded sample data directory
:rtype: str
"""
original_ref = importlib_resources.files('certbot_integration_tests').joinpath(
'assets', 'sample-config'
)
with importlib_resources.as_file(original_ref) as original:
original_ref = (importlib.resources.files('certbot_integration_tests').joinpath('assets')
.joinpath('sample-config'))
with importlib.resources.as_file(original_ref) as original:
copied = os.path.join(workspace, 'sample-config')
shutil.copytree(original, copied, symlinks=True)

@@ -304,16 +302,12 @@ def echo(keyword: str, path: Optional[str] = None) -> str:
os.path.basename(sys.executable), keyword, ' >> "{0}"'.format(path) if path else '')


def get_acme_issuers(context: IntegrationTestsContext) -> List[Certificate]:
def get_acme_issuers() -> List[Certificate]:
"""Gets the list of one or more issuer certificates from the ACME server used by the
context.
:param context: the testing context.
:return: the `list of x509.Certificate` representing the list of issuers.
"""
# TODO: in fact, Boulder has alternate chains in config-next/, just not yet in config/.
if context.acme_server != "pebble":
raise NotImplementedError()

_suppress_x509_verification_warnings()

issuers = []
@@ -1,55 +1,62 @@
# pylint: disable=missing-module-docstring
import atexit
import importlib.resources
import io
import json
import os
import stat
import sys
import zipfile
from contextlib import ExitStack
from typing import Tuple
from typing import Optional, Tuple

import requests

from certbot_integration_tests.utils.constants import DEFAULT_HTTP_01_PORT
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT

if sys.version_info >= (3, 9): # pragma: no cover
import importlib.resources as importlib_resources
else: # pragma: no cover
import importlib_resources

PEBBLE_VERSION = 'v2.3.1'
PEBBLE_VERSION = 'v2.5.1'


def fetch(workspace: str, http_01_port: int = DEFAULT_HTTP_01_PORT) -> Tuple[str, str, str]:
# pylint: disable=missing-function-docstring
suffix = 'linux-amd64' if os.name != 'nt' else 'windows-amd64.exe'

file_manager = ExitStack()
atexit.register(file_manager.close)
pebble_path_ref = importlib_resources.files('certbot_integration_tests') / 'assets'
assets_path = str(file_manager.enter_context(importlib_resources.as_file(pebble_path_ref)))
pebble_path_ref = importlib.resources.files('certbot_integration_tests') / 'assets'
assets_path = str(file_manager.enter_context(importlib.resources.as_file(pebble_path_ref)))

pebble_path = _fetch_asset('pebble', suffix, assets_path)
challtestsrv_path = _fetch_asset('pebble-challtestsrv', suffix, assets_path)
pebble_path = _fetch_asset('pebble', assets_path)
challtestsrv_path = _fetch_asset('pebble-challtestsrv', assets_path)
pebble_config_path = _build_pebble_config(workspace, http_01_port, assets_path)

return pebble_path, challtestsrv_path, pebble_config_path


def _fetch_asset(asset: str, suffix: str, assets_path: str) -> str:
asset_path = os.path.join(assets_path, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
def _fetch_asset(asset: str, assets_path: str) -> str:
platform = 'linux-amd64'
base_url = 'https://github.com/letsencrypt/pebble/releases/download'
asset_path = os.path.join(assets_path, f'{asset}_{PEBBLE_VERSION}_{platform}')
if not os.path.exists(asset_path):
asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
.format(PEBBLE_VERSION, asset, suffix))
asset_url = f'{base_url}/{PEBBLE_VERSION}/{asset}-{platform}.zip'
response = requests.get(asset_url, timeout=30)
response.raise_for_status()
asset_data = _unzip_asset(response.content, asset)
if asset_data is None:
raise ValueError(f"zipfile {asset_url} didn't contain file {asset}")
with open(asset_path, 'wb') as file_h:
file_h.write(response.content)
file_h.write(asset_data)
os.chmod(asset_path, os.stat(asset_path).st_mode | stat.S_IEXEC)

return asset_path


def _unzip_asset(zipped_data: bytes, asset_name: str) -> Optional[bytes]:
with zipfile.ZipFile(io.BytesIO(zipped_data)) as zip_file:
for entry in zip_file.filelist:
if not entry.is_dir() and entry.filename.endswith(asset_name):
return zip_file.read(entry)
return None


def _build_pebble_config(workspace: str, http_01_port: int, assets_path: str) -> str:
config_path = os.path.join(workspace, 'pebble-config.json')
with open(config_path, 'w') as file_h:
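As a usage illustration only (not part of the diff, and assuming the certbot-ci module layout), the reworked fetch() helper above, which now downloads the zipped Pebble release assets and unpacks just the binary it needs, can be driven like this:

```python
# Hypothetical driver for the fetch() helper above: download the pebble and
# pebble-challtestsrv binaries into a scratch workspace and print their paths.
import tempfile

from certbot_integration_tests.utils import pebble_artifacts

with tempfile.TemporaryDirectory() as workspace:
    pebble_path, challtestsrv_path, config_path = pebble_artifacts.fetch(workspace)
    print(pebble_path, challtestsrv_path, config_path)
```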
@@ -6,7 +6,6 @@ version = '0.32.0.dev0'
install_requires = [
'coverage',
'cryptography',
'importlib_resources>=1.3.1; python_version < "3.9"',
'pyopenssl',
'pytest',
'pytest-cov',
@@ -22,7 +21,6 @@ install_requires = [
# requests unvendored its dependencies in version 2.16.0 and this code relies on that for
# calling `urllib3.disable_warnings`.
'requests>=2.16.0',
'setuptools',
'types-python-dateutil',
]

@@ -34,17 +32,17 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
@@ -1,38 +0,0 @@
# type: ignore
"""
General conftest for pytest execution of all integration tests lying
in the window_installer_integration tests package.
As stated by pytest documentation, conftest module is used to set on
for a directory a specific configuration using built-in pytest hooks.

See https://docs.pytest.org/en/latest/reference.html#hook-reference
"""

import os

ROOT_PATH = os.path.dirname(os.path.dirname(os.path.dirname(__file__)))


def pytest_addoption(parser):
"""
Standard pytest hook to add options to the pytest parser.
:param parser: current pytest parser that will be used on the CLI
"""
parser.addoption('--installer-path',
default=os.path.join(ROOT_PATH, 'windows-installer', 'build',
'nsis', 'certbot-beta-installer-win_amd64.exe'),
help='set the path of the windows installer to use, default to '
'CERTBOT_ROOT_PATH\\windows-installer\\build\\nsis\\certbot-beta-installer-win_amd64.exe') # pylint: disable=line-too-long
parser.addoption('--allow-persistent-changes', action='store_true',
help='needs to be set, and confirm that the test will make persistent changes on this machine') # pylint: disable=line-too-long


def pytest_configure(config):
"""
Standard pytest hook used to add a configuration logic for each node of a pytest run.
:param config: the current pytest configuration
"""
if not config.option.allow_persistent_changes:
raise RuntimeError('This integration test would install Certbot on your machine. '
'Please run it again with the `--allow-persistent-changes` '
'flag set to acknowledge.')
@@ -1,69 +0,0 @@
"""Module executing integration tests for the windows installer."""
import os
import re
import subprocess
import time
from typing import Any

import pytest


@pytest.mark.skipif(os.name != 'nt', reason='Windows installer tests must be run on Windows.')
def test_it(request: pytest.FixtureRequest) -> None:
try:
subprocess.check_call(['certbot', '--version'])
except (subprocess.CalledProcessError, OSError):
pass
else:
raise AssertionError('Expect certbot to not be available in the PATH.')

try:
# Install certbot
subprocess.check_call([request.config.option.installer_path, '/S'])

# Assert certbot is installed and runnable
output = subprocess.check_output(['certbot', '--version'], universal_newlines=True)
assert re.match(r'^certbot \d+\.\d+\.\d+.*$',
output), 'Flag --version does not output a version.'

# Assert renew task is installed and ready
output = _ps('(Get-ScheduledTask -TaskName "Certbot Renew Task").State',
capture_stdout=True)
assert output.strip() == 'Ready'

# Assert renew task is working
now = time.time()
_ps('Start-ScheduledTask -TaskName "Certbot Renew Task"')

status = 'Running'
while status != 'Ready':
status = _ps('(Get-ScheduledTask -TaskName "Certbot Renew Task").State',
capture_stdout=True).strip()
time.sleep(1)

log_path = os.path.join('C:\\', 'Certbot', 'log', 'letsencrypt.log')

modification_time = os.path.getmtime(log_path)
assert now < modification_time, 'Certbot log file has not been modified by the renew task.'

with open(log_path) as file_h:
data = file_h.read()
assert 'no renewal failures' in data, 'Renew task did not execute properly.'

finally:
# Sadly this command cannot work in non interactive mode: uninstaller will
# ask explicitly permission in an UAC prompt
# print('Uninstalling Certbot ...')
# uninstall_path = _ps('(gci "HKLM:\\SOFTWARE\\Wow6432Node\\Microsoft\\Windows\\CurrentVersion\\Uninstall"' # pylint: disable=line-too-long
# ' | foreach { gp $_.PSPath }'
# ' | ? { $_ -match "Certbot" }'
# ' | select UninstallString)'
# '.UninstallString', capture_stdout=True)
# subprocess.check_call([uninstall_path, '/S'])
pass


def _ps(powershell_str: str, capture_stdout: bool = False) -> Any:
fn = subprocess.check_output if capture_stdout else subprocess.check_call
return fn(['powershell.exe', '-c', powershell_str], # type: ignore[operator]
universal_newlines=True)
@@ -1,4 +1,4 @@
FROM docker.io/python:3.8-buster
FROM docker.io/python:3.11-buster
LABEL maintainer="Brad Warren <bmw@eff.org>"

# This does not include the dependencies needed to build cryptography. See
@@ -75,7 +75,7 @@ def _get_server_root(config: str) -> str:
if os.path.isdir(os.path.join(config, name))]

if len(subdirs) != 1:
errors.Error("Malformed configuration directory {0}".format(config))
raise errors.Error("Malformed configuration directory {0}".format(config))

return os.path.join(config, subdirs[0].rstrip())


@@ -18,7 +18,7 @@ from typing import Optional
from typing import Tuple
from typing import Type

from OpenSSL import crypto
from cryptography.hazmat.primitives import serialization
from urllib3.util import connection

from acme import challenges
@@ -147,10 +147,10 @@ def test_installer(args: argparse.Namespace, plugin: common.Proxy, config: str,

def test_deploy_cert(plugin: common.Proxy, temp_dir: str, domains: List[str]) -> bool:
"""Tests deploy_cert returning True if the tests are successful"""
cert = crypto_util.gen_ss_cert(util.KEY, domains)
cert = crypto_util.gen_ss_cert(util.KEY, domains).to_cryptography()
cert_path = os.path.join(temp_dir, "cert.pem")
with open(cert_path, "wb") as f:
f.write(crypto.dump_certificate(crypto.FILETYPE_PEM, cert))
f.write(cert.public_bytes(serialization.Encoding.PEM))

for domain in domains:
try:
@@ -390,7 +390,7 @@ def _fake_dns_resolution(resolved_ip: str) -> Generator[None, None, None]:
"""Monkey patch urllib3 to make any hostname be resolved to the provided IP"""
_original_create_connection = connection.create_connection

def _patched_create_connection(address: Tuple[str, str],
def _patched_create_connection(address: Tuple[str, int],
*args: Any, **kwargs: Any) -> socket.socket:
_, port = address
return _original_create_connection((resolved_ip, port), *args, **kwargs)
Binary file not shown.
@@ -6,7 +7,7 @@ from typing import Mapping
from typing import Optional
from typing import Union

from OpenSSL import crypto
from cryptography import x509
import requests

from acme import crypto_util
@@ -21,7 +21,7 @@ _VALIDATION_TIMEOUT = 10
class Validator:
"""Collection of functions to test a live webserver's configuration"""

def certificate(self, cert: crypto.X509, name: Union[str, bytes],
def certificate(self, cert: x509.Certificate, name: Union[str, bytes],
alt_host: Optional[str] = None, port: int = 443) -> bool:
"""Verifies the certificate presented at name is cert"""
if alt_host is None:
@@ -39,7 +39,7 @@ class Validator:
logger.exception(str(error))
return False

return presented_cert.digest("sha256") == cert.digest("sha256")
return presented_cert.to_cryptography() == cert

def redirect(self, name: str, port: int = 80,
headers: Optional[Mapping[str, str]] = None) -> bool:

@@ -19,7 +19,6 @@

server {
listen 80 default_server;
listen [::]:80 default_server ipv6only=on;

root /usr/share/nginx/html;
index index.html index.htm;
@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'certbot',
@@ -18,17 +18,17 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 3 - Alpha',
'Intended Audience :: Developers',
'License :: OSI Approved :: Apache Software License',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
],
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-cloudflare/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-cloudflare/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,12 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'cloudflare>=1.5.1',
'setuptools>=41.6.0',
# for now, do not upgrade to cloudflare>=2.20 to avoid deprecation warnings and the breaking
# changes in version 3.0. see https://github.com/certbot/certbot/issues/9938
'cloudflare>=1.5.1, <2.20',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +40,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +49,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-digitalocean/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-digitalocean/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'python-digitalocean>=1.11', # 1.15.0 or newer is recommended for TTL support
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-dnsimple/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-dnsimple/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,13 +4,12 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
# This version of lexicon is required to address the problem described in
# https://github.com/AnalogJ/lexicon/issues/387.
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -41,7 +40,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -50,10 +49,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-dnsmadeeasy/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-dnsmadeeasy/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-gehirn/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-gehirn/readthedocs.org.requirements.txt

@@ -1,7 +1,6 @@
"""Tests for certbot_dns_gehirn._internal.dns_gehirn."""

import sys
import unittest
from unittest import mock

import pytest

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-google/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-google/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,12 +4,11 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'google-api-python-client>=1.6.5',
'google-auth>=2.16.0',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -40,7 +39,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -49,10 +48,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-linode/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-linode/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-luadns/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-luadns/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-nsone/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-nsone/readthedocs.org.requirements.txt

@@ -170,6 +170,6 @@ texinfo_documents = [
# Example configuration for intersphinx: refer to the Python standard library.
intersphinx_mapping = {
'python': ('https://docs.python.org/', None),
'acme': ('https://acme-python.readthedocs.org/en/latest/', None),
'acme': ('https://acme-python.readthedocs.io/en/latest/', None),
'certbot': ('https://eff-certbot.readthedocs.io/en/stable/', None),
}

@@ -4,11 +4,10 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.8.0.dev0'
version = '3.2.0.dev0'

install_requires = [
'dns-lexicon>=3.14.1',
'setuptools>=41.6.0',
]

if os.environ.get('SNAP_BUILD'):
@@ -39,7 +38,7 @@ setup(
author="Certbot Project",
author_email='certbot-dev@eff.org',
license='Apache License 2.0',
python_requires='>=3.8',
python_requires='>=3.9',
classifiers=[
'Development Status :: 5 - Production/Stable',
'Environment :: Plugins',
@@ -48,10 +47,10 @@ setup(
'Operating System :: POSIX :: Linux',
'Programming Language :: Python',
'Programming Language :: Python :: 3',
'Programming Language :: Python :: 3.8',
'Programming Language :: Python :: 3.9',
'Programming Language :: Python :: 3.10',
'Programming Language :: Python :: 3.11',
'Programming Language :: Python :: 3.12',
'Topic :: Internet :: WWW/HTTP',
'Topic :: Security',
'Topic :: System :: Installation/Setup',
@@ -14,14 +14,14 @@ build:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
configuration: docs/conf.py
configuration: certbot-dns-ovh/docs/conf.py
# You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
# builder: "dirhtml"
# Fail on all warnings to avoid broken references
fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
# formats:
formats:
- pdf
- epub

@@ -30,4 +30,4 @@ sphinx:
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
install:
- requirements: ../tools/requirements.txt
- requirements: certbot-dns-ovh/readthedocs.org.requirements.txt
Some files were not shown because too many files have changed in this diff.