Compare commits: test-rewri...master (212 commits)
| SHA1 |
|---|
| a2f86b5514 |
| 4b51e3004c |
| 018800c5cc |
| 2eb4154169 |
| becc2c3fee |
| cb5382d4d5 |
| 6975e32998 |
| 62962357c5 |
| 343b540970 |
| 089b7efacd |
| 1584b0b58c |
| 141b15077c |
| ee2c4844b9 |
| 181813b9b2 |
| 43d0652b0d |
| 80e68bec26 |
| 7b2b2b1685 |
| c3c587001f |
| 281b724996 |
| 3d5714f499 |
| ba9f1939ab |
| 481c8c0600 |
| 35b177a1a0 |
| 95976762ac |
| bf64e7f4e4 |
| 9213154e44 |
| 810d50eb3d |
| 99a4129cd4 |
| 8db8fcf26c |
| 6d8fec7760 |
| 4f3af45f5c |
| 8ebd8ea9fb |
| 83d8fbbd75 |
| 0c49ab462f |
| 35091d878f |
| c31f53a225 |
| d2a13c55f2 |
| de1ce7340f |
| 929f9e944f |
| 6c422774d5 |
| 443ec2200f |
| 38cbeb560c |
| 873f979a25 |
| 2a41402f2a |
| 6ecf3782ac |
| d1347fce9a |
| 9412ce9f05 |
| fabe7bbc78 |
| 1e34fb8b51 |
| 4d7d0d6d04 |
| cf77b3c3fa |
| a7674bd45a |
| cdeac7a745 |
| 50b2097d38 |
| 30e7f23360 |
| 248455a92b |
| cca30ace31 |
| 90348bde4e |
| 54dd12cd57 |
| 4e6934a4b6 |
| 57bb4e40b7 |
| 7f885292f9 |
| 8978e4dbff |
| 920b717c45 |
| 54b7b1883e |
| 87ab76fc7d |
| 4925f71933 |
| 39fda1d44d |
| c8a1e30981 |
| 7abf143394 |
| f4e031f505 |
| 2844fdd74a |
| 3b183961a9 |
| 76411ecca7 |
| 725c64d581 |
| 99ae4ac5ef |
| b8b759f1d2 |
| 8b5a017b05 |
| b7ef536ec3 |
| 282df74ee9 |
| 0a565815f9 |
| d33bbf35c2 |
| 714a0b348d |
| 7ca1b8f286 |
| be40e377d9 |
| 01cf4bae75 |
| ef949f9149 |
| 926d0c7e0f |
| 9d8eb6ccfd |
| 585f70e700 |
| 21e24264f4 |
| cf78ad3a3d |
| dccb92d57f |
| f9d31faadc |
| e9225d1cc2 |
| 3dd1f0eea9 |
| 917e3aba6b |
| 3833255980 |
| 619654f317 |
| 76f9a33e45 |
| 5f67bb99a8 |
| d8392bf394 |
| 6a89fcbc56 |
| 2adaacab82 |
| 2ae810c45a |
| b62133e3e1 |
| a92bb44ff9 |
| 9650c25968 |
| c3c29afdca |
| dca4ddd3d8 |
| 7bb85f8440 |
| cf4f07d17e |
| 36c78b3717 |
| bf5475fa74 |
| 9bfc9dda5c |
| e904bd4e29 |
| d140a7df52 |
| bd550c09c2 |
| 01405a8fa6 |
| 5bf833fe28 |
| d1577280ad |
| 3ae9d7e03a |
| 5594ac20e0 |
| 7f6000f1d4 |
| 1863c66179 |
| 185c20c71b |
| a1b773cbdc |
| 937eaef621 |
| e40741955f |
| 6f7b5ab1cd |
| 5cf5f36f19 |
| a96fb4b6ce |
| 11e17ef77b |
| 8a95c030e6 |
| d9d825ac50 |
| 07b1b0d8b2 |
| beec975379 |
| 01d129dfca |
| 8bf21cad25 |
| dcac5ed8f0 |
| 228e3f2a8d |
| 6624e0b65c |
| 21113d17c7 |
| 61773be971 |
| 5849ff73fb |
| 4e60a0d03a |
| 44046c70c3 |
| 02efc8c5ca |
| 0862e05754 |
| 08d1979bcb |
| 6c66764f25 |
| c4642c2dfe |
| bcb7f371e3 |
| 732a3ac962 |
| 694c758db7 |
| f5cb0a156b |
| 4178e8ffc4 |
| a3353b5c42 |
| 24c8825d22 |
| 23f9dfc655 |
| cc359dab46 |
| 89902e26bf |
| b1978ff188 |
| 579b39dce1 |
| 9b4b99f3e8 |
| 3e84b94308 |
| 2cb2cb0575 |
| ddd4b31b1c |
| 68d812e6dd |
| 6effedc2f4 |
| c31d3a2cfd |
| e6572e695b |
| a7674548ab |
| 436b7fbe28 |
| d0e11c81b1 |
| 4fc4d536c1 |
| b1e5efac3c |
| 539d48d438 |
| ae6268ea3c |
| 2d8a274eb5 |
| ff8afe827b |
| 468f4749b8 |
| a5d223d1e5 |
| b5661e84e8 |
| aa270b37a2 |
| 35209d921d |
| 0ac8e10c85 |
| 36bfddbf4e |
| 721c4665e6 |
| 013621d04e |
| e0e2bfe13a |
| d2e2a92cdd |
| 6e52695faa |
| 5b5a2efdc9 |
| 8a0b0f63de |
| 10fba2ee3f |
| 67f14f177b |
| f378ec4a0f |
| b0d0a83277 |
| 399b932a86 |
| b9ec3155f7 |
| ef5f4cae04 |
| 31094bc547 |
| f41673982d |
| 996cc20cd7 |
| 20ccf8c9c9 |
| 5503d12395 |
| 4740e20725 |
| dc05b4da7a |
| 5149dfd96e |
| 9ee1eee219 |
| 7a68b29140 |
@@ -1,8 +1,8 @@
 # Configuring Azure Pipelines with Certbot
 
 Let's begin. All pipelines are defined in `.azure-pipelines`. Currently there are two:
 
-* `.azure-pipelines/main.yml` is the main one, executed on PRs for master, and pushes to master,
-* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for master.
+* `.azure-pipelines/main.yml` is the main one, executed on PRs for main, and pushes to main,
+* `.azure-pipelines/advanced.yml` add installer testing on top of the main pipeline, and is executed for `test-*` branches, release branches, and nightly run for main.
 
 Several templates are defined in `.azure-pipelines/templates`. These YAML files aggregate common jobs configuration that can be reused in several pipelines.
@@ -64,7 +64,7 @@ Azure Pipeline needs RW on code, RO on metadata, RW on checks, commit statuses,
 RW access here is required to allow update of the pipelines YAML files from Azure DevOps interface, and to
 update the status of builds and PRs on GitHub side when Azure Pipelines are triggered.
 Note however that no admin access is defined here: this means that Azure Pipelines cannot do anything with
-protected branches, like master, and cannot modify the security context around this on GitHub.
+protected branches, like main, and cannot modify the security context around this on GitHub.
 Access can be defined for all or only selected repositories, which is nice.
 ```
@@ -91,11 +91,11 @@ grant permissions from Azure Pipelines to GitHub in order to setup a GitHub OAut
 then are way too large (admin level on almost everything), while the classic approach does not add any more
 permissions, and works perfectly well.__
 
-- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, master in default branch.
+- Select GitHub in "Select your repository section", choose certbot/certbot in Repository, main in default branch.
 - Click on YAML option for "Select a template"
 - Choose a name for the pipeline (eg. test-pipeline), and browse to the actual pipeline YAML definition in the
   "YAML file path" input (eg. `.azure-pipelines/test-pipeline.yml`)
-- Click "Save & queue", choose the master branch to build the first pipeline, and click "Save and run" button.
+- Click "Save & queue", choose the main branch to build the first pipeline, and click "Save and run" button.
 
 _Done. Pipeline is operational. Repeat to add more pipelines from existing YAML files in `.azure-pipelines`._
@@ -1,9 +1,9 @@
-# We run the test suite on commits to master so codecov gets coverage data
-# about the master branch and can use it to track coverage changes.
+# We run the test suite on commits to main so codecov gets coverage data
+# about the main branch and can use it to track coverage changes.
 trigger:
-- master
+- main
 pr:
-- master
+- main
 - '*.x'
 
 variables:
@@ -1,4 +1,4 @@
-# Nightly pipeline running each day for master.
+# Nightly pipeline running each day for main.
 trigger: none
 pr: none
 schedules:
@@ -6,7 +6,7 @@ schedules:
   displayName: Nightly build
   branches:
     include:
-    - master
+    - main
   always: true
 
 variables:
@@ -15,5 +15,5 @@ variables:
 
 stages:
 - template: templates/stages/test-and-package-stage.yml
-- template: templates/stages/deploy-stage.yml
+- template: templates/stages/nightly-deploy-stage.yml
- template: templates/stages/notify-failure-stage.yml
@@ -13,7 +13,5 @@ variables:
 stages:
 - template: templates/stages/test-and-package-stage.yml
 - template: templates/stages/changelog-stage.yml
-- template: templates/stages/deploy-stage.yml
-  parameters:
-    snapReleaseChannel: beta
+- template: templates/stages/release-deploy-stage.yml
 - template: templates/stages/notify-failure-stage.yml
@@ -72,3 +72,57 @@ jobs:
       tools/retry.sh eval snapcraft upload --release=${{ parameters.snapReleaseChannel }} "${SNAP_FILE}"
       done
     displayName: Publish to Snap store
+# The credentials used in the following jobs are for the shared
+# certbotbot account on Docker Hub. The credentials are stored
+# in a service account which was created by following the
+# instructions at
+# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
+# The name given to this service account must match the value
+# given to containerRegistry below. The authentication used when
+# creating this service account was a personal access token
+# rather than a password to bypass 2FA. When Brad set this up,
+# Azure Pipelines failed to verify the credentials with an error
+# like "access is forbidden with a JWT issued from a personal
+# access token", but after saving them without verification, the
+# access token worked when the pipeline actually ran. "Grant
+# access to all pipelines" should also be checked on the service
+# account. The access token can be deleted on Docker Hub if
+# these credentials need to be revoked.
+- job: publish_docker_by_arch
+  pool:
+    vmImage: ubuntu-22.04
+  strategy:
+    matrix:
+      arm32v6:
+        DOCKER_ARCH: arm32v6
+      arm64v8:
+        DOCKER_ARCH: arm64v8
+      amd64:
+        DOCKER_ARCH: amd64
+  steps:
+  - task: DownloadPipelineArtifact@2
+    inputs:
+      artifact: docker_$(DOCKER_ARCH)
+      path: $(Build.SourcesDirectory)
+    displayName: Retrieve Docker images
+  - bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
+    displayName: Load Docker images
+  - task: Docker@2
+    inputs:
+      command: login
+      containerRegistry: docker-hub
+    displayName: Login to Docker Hub
+  - bash: set -e && tools/docker/deploy_images.sh $(dockerTag) $DOCKER_ARCH
+    displayName: Deploy the Docker images by architecture
+- job: publish_docker_multiarch
+  dependsOn: publish_docker_by_arch
+  pool:
+    vmImage: ubuntu-22.04
+  steps:
+  - task: Docker@2
+    inputs:
+      command: login
+      containerRegistry: docker-hub
+    displayName: Login to Docker Hub
+  - bash: set -e && tools/docker/deploy_manifests.sh $(dockerTag) all
+    displayName: Deploy the Docker multiarch manifests
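The snap upload step above goes through `tools/retry.sh`; its exact flags and backoff are not shown in this diff, so the following is only a sketch of such a wrapper under an assumed `retry <command...>` interface:

```shell
#!/bin/sh
# Hedged sketch of a retry wrapper in the spirit of tools/retry.sh (the real
# script's interface and backoff policy are not shown in this diff, so this
# is an assumption): re-run a flaky command a few times before giving up.
retry() {
    attempts=3
    while [ "$attempts" -gt 0 ]; do
        "$@" && return 0                    # success: stop retrying
        attempts=$((attempts - 1))
        if [ "$attempts" -gt 0 ]; then
            sleep 1                         # brief pause between attempts
        fi
    done
    return 1                                # all attempts failed
}

retry true  && echo "succeeded"
retry false || echo "gave up after retries"
```

A wrapper like this keeps transient registry or store outages from failing the whole deploy job.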
@@ -4,55 +4,47 @@ jobs:
  - name: IMAGE_NAME
    value: ubuntu-22.04
  - name: PYTHON_VERSION
    value: 3.11
    value: 3.12
  - group: certbot-common
  strategy:
    matrix:
      linux-py38:
        PYTHON_VERSION: 3.8
        TOXENV: py38
      linux-py39:
        PYTHON_VERSION: 3.9
        TOXENV: py39
      linux-py310:
        PYTHON_VERSION: 3.10
        TOXENV: py310
      linux-boulder-v2-integration-certbot-oldest:
        PYTHON_VERSION: 3.7
        TOXENV: integration-certbot-oldest
        ACME_SERVER: boulder-v2
      linux-boulder-v2-integration-nginx-oldest:
        PYTHON_VERSION: 3.7
        TOXENV: integration-nginx-oldest
        ACME_SERVER: boulder-v2
      linux-boulder-v2-py37-integration:
        PYTHON_VERSION: 3.7
        TOXENV: integration
        ACME_SERVER: boulder-v2
      linux-boulder-v2-py38-integration:
      linux-py311:
        PYTHON_VERSION: 3.11
        TOXENV: py311
      linux-isolated:
        TOXENV: 'isolated-acme,isolated-certbot,isolated-apache,isolated-cloudflare,isolated-digitalocean,isolated-dnsimple,isolated-dnsmadeeasy,isolated-gehirn,isolated-google,isolated-linode,isolated-luadns,isolated-nsone,isolated-ovh,isolated-rfc2136,isolated-route53,isolated-sakuracloud,isolated-nginx'
      linux-integration-certbot-oldest:
        PYTHON_VERSION: 3.8
        TOXENV: integration
        ACME_SERVER: boulder-v2
      linux-boulder-v2-py39-integration:
        TOXENV: integration-certbot-oldest
      linux-integration-nginx-oldest:
        PYTHON_VERSION: 3.8
        TOXENV: integration-nginx-oldest
      # python 3.8 integration tests are not run here because they're run as
      # part of the standard test suite
      linux-py39-integration:
        PYTHON_VERSION: 3.9
        TOXENV: integration
        ACME_SERVER: boulder-v2
      linux-boulder-v2-py310-integration:
      linux-py310-integration:
        PYTHON_VERSION: 3.10
        TOXENV: integration
        ACME_SERVER: boulder-v2
      linux-boulder-v2-py311-integration:
      linux-py311-integration:
        PYTHON_VERSION: 3.11
        TOXENV: integration
        ACME_SERVER: boulder-v2
      linux-py312-integration:
        PYTHON_VERSION: 3.12
        TOXENV: integration
      nginx-compat:
        TOXENV: nginx_compat
      linux-integration-rfc2136:
        IMAGE_NAME: ubuntu-22.04
        PYTHON_VERSION: 3.8
        TOXENV: integration-dns-rfc2136
      docker-dev:
        TOXENV: docker_dev
      le-modification:
        IMAGE_NAME: ubuntu-22.04
        TOXENV: modification
@@ -4,12 +4,12 @@ jobs:
     vmImage: ubuntu-22.04
   strategy:
     matrix:
-      amd64:
-        DOCKER_ARCH: amd64
       arm32v6:
         DOCKER_ARCH: arm32v6
       arm64v8:
         DOCKER_ARCH: arm64v8
+      amd64:
+        DOCKER_ARCH: amd64
   # The default timeout of 60 minutes is a little low for compiling
   # cryptography on ARM architectures.
   timeoutInMinutes: 180
@@ -32,84 +32,29 @@ jobs:
       path: $(Build.ArtifactStagingDirectory)
       artifact: docker_$(DOCKER_ARCH)
     displayName: Store Docker artifact
-- job: docker_run
+- job: docker_test
   dependsOn: docker_build
   pool:
     vmImage: ubuntu-22.04
+  strategy:
+    matrix:
+      arm32v6:
+        DOCKER_ARCH: arm32v6
+      arm64v8:
+        DOCKER_ARCH: arm64v8
+      amd64:
+        DOCKER_ARCH: amd64
   steps:
   - task: DownloadPipelineArtifact@2
     inputs:
-      artifact: docker_amd64
+      artifact: docker_$(DOCKER_ARCH)
       path: $(Build.SourcesDirectory)
     displayName: Retrieve Docker images
   - bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
     displayName: Load Docker images
-  - bash: |
-      set -ex
-      DOCKER_IMAGES=$(docker images --filter reference='*/certbot' --filter reference='*/dns-*' --format '{{.Repository}}:{{.Tag}}')
-      for DOCKER_IMAGE in ${DOCKER_IMAGES}
-      do docker run --rm "${DOCKER_IMAGE}" plugins --prepare
-      done
+  - bash: set -e && tools/docker/test.sh $(dockerTag) $DOCKER_ARCH
     displayName: Run integration tests for Docker images
-- job: installer_build
-  pool:
-    vmImage: windows-2019
-  steps:
-  - task: UsePythonVersion@0
-    inputs:
-      versionSpec: 3.9
-      architecture: x64
-      addToPath: true
-  - script: |
-      python -m venv venv
-      venv\Scripts\python tools\pip_install.py -e windows-installer
-    displayName: Prepare Windows installer build environment
-  - script: |
-      venv\Scripts\construct-windows-installer
-    displayName: Build Certbot installer
-  - task: CopyFiles@2
-    inputs:
-      sourceFolder: $(System.DefaultWorkingDirectory)/windows-installer/build/nsis
-      contents: '*.exe'
-      targetFolder: $(Build.ArtifactStagingDirectory)
-  - task: PublishPipelineArtifact@1
-    inputs:
-      path: $(Build.ArtifactStagingDirectory)
-      # If we change the artifact's name, it should also be changed in tools/create_github_release.py
-      artifact: windows-installer
-    displayName: Publish Windows installer
-- job: installer_run
-  dependsOn: installer_build
-  strategy:
-    matrix:
-      win2019:
-        imageName: windows-2019
-  pool:
-    vmImage: $(imageName)
-  steps:
-  - task: UsePythonVersion@0
-    inputs:
-      versionSpec: 3.9
-      addToPath: true
-  - task: DownloadPipelineArtifact@2
-    inputs:
-      artifact: windows-installer
-      path: $(Build.SourcesDirectory)/bin
-    displayName: Retrieve Windows installer
-  - script: |
-      python -m venv venv
-      venv\Scripts\python tools\pip_install.py -e certbot-ci
-    env:
-      PIP_NO_BUILD_ISOLATION: no
-    displayName: Prepare Certbot-CI
-  - script: |
-      set PATH=%ProgramFiles%\Certbot\bin;%PATH%
-      venv\Scripts\python -m pytest certbot-ci\windows_installer_integration_tests --allow-persistent-changes --installer-path $(Build.SourcesDirectory)\bin\certbot-beta-installer-win_amd64.exe
-    displayName: Run windows installer integration tests
-  - script: |
-      set PATH=%ProgramFiles%\Certbot\bin;%PATH%
-      venv\Scripts\python -m pytest certbot-ci\certbot_integration_tests\certbot_tests -n 4
-    displayName: Run certbot integration tests
 - job: snaps_build
   pool:
     vmImage: ubuntu-22.04
@@ -131,7 +76,7 @@ jobs:
     displayName: Install dependencies
   - task: UsePythonVersion@0
     inputs:
-      versionSpec: 3.8
+      versionSpec: 3.12
       addToPath: true
   - task: DownloadSecureFile@1
     name: credentials
@@ -162,7 +107,7 @@ jobs:
   steps:
   - task: UsePythonVersion@0
     inputs:
-      versionSpec: 3.8
+      versionSpec: 3.12
       addToPath: true
   - script: |
       set -e
@@ -182,7 +127,7 @@ jobs:
     displayName: Install Certbot snap
   - script: |
       set -e
-      venv/bin/python -m tox -e integration-external,apacheconftest-external-with-pebble
+      venv/bin/python -m tox run -e integration-external,apacheconftest-external-with-pebble
     displayName: Run tox
 - job: snap_dns_run
   dependsOn: snaps_build
@@ -196,7 +141,7 @@ jobs:
     displayName: Install dependencies
   - task: UsePythonVersion@0
     inputs:
-      versionSpec: 3.8
+      versionSpec: 3.12
       addToPath: true
   - task: DownloadPipelineArtifact@2
     inputs:
@@ -1,40 +1,29 @@
 jobs:
 - job: test
   variables:
-    PYTHON_VERSION: 3.11
+    PYTHON_VERSION: 3.12
   strategy:
     matrix:
-      macos-py37-cover:
+      macos-py38-cover:
         IMAGE_NAME: macOS-12
-        PYTHON_VERSION: 3.7
+        PYTHON_VERSION: 3.8
         TOXENV: cover
+        # As of pip 23.1.0, builds started failing on macOS unless this flag was set.
+        # See https://github.com/certbot/certbot/pull/9717#issuecomment-1610861794.
+        PIP_USE_PEP517: "true"
       macos-cover:
-        IMAGE_NAME: macOS-12
+        IMAGE_NAME: macOS-13
         TOXENV: cover
-      windows-py37:
-        IMAGE_NAME: windows-2019
-        PYTHON_VERSION: 3.7
-        TOXENV: py37-win
       windows-py39-cover:
         IMAGE_NAME: windows-2019
         PYTHON_VERSION: 3.9
         TOXENV: cover-win
       windows-integration-certbot:
         IMAGE_NAME: windows-2019
         PYTHON_VERSION: 3.9
         TOXENV: integration-certbot
-      linux-oldest-tests-1:
+      linux-oldest:
         IMAGE_NAME: ubuntu-22.04
-        PYTHON_VERSION: 3.7
-        TOXENV: '{acme,apache,apache-v2,certbot}-oldest'
-      linux-oldest-tests-2:
-        IMAGE_NAME: ubuntu-22.04
-        PYTHON_VERSION: 3.7
-        TOXENV: '{dns,nginx}-oldest'
-      linux-py37:
+        PYTHON_VERSION: 3.8
+        TOXENV: oldest
+        # See explanation under macos-py38-cover.
+        PIP_USE_PEP517: "true"
+      linux-py38:
         IMAGE_NAME: ubuntu-22.04
-        PYTHON_VERSION: 3.7
-        TOXENV: py37
+        PYTHON_VERSION: 3.8
+        TOXENV: py38
       linux-cover:
         IMAGE_NAME: ubuntu-22.04
         TOXENV: cover
@@ -43,12 +32,11 @@ jobs:
         TOXENV: lint-posix
       linux-mypy:
         IMAGE_NAME: ubuntu-22.04
-        TOXENV: mypy-posix
+        TOXENV: mypy
       linux-integration:
         IMAGE_NAME: ubuntu-22.04
-        PYTHON_VERSION: 3.8
         TOXENV: integration
         ACME_SERVER: pebble
       apache-compat:
         IMAGE_NAME: ubuntu-22.04
         TOXENV: apache_compat
@@ -1,67 +0,0 @@
|
||||
parameters:
|
||||
# We do not define acceptable values for this parameter here as it is passed
|
||||
# through to ../jobs/snap-deploy-job.yml which does its own sanity checking.
|
||||
- name: snapReleaseChannel
|
||||
type: string
|
||||
default: edge
|
||||
|
||||
stages:
|
||||
- stage: Deploy
|
||||
jobs:
|
||||
- template: ../jobs/snap-deploy-job.yml
|
||||
parameters:
|
||||
snapReleaseChannel: ${{ parameters.snapReleaseChannel }}
|
||||
# The credentials used in the following jobs are for the shared
|
||||
# certbotbot account on Docker Hub. The credentials are stored
|
||||
# in a service account which was created by following the
|
||||
# instructions at
|
||||
# https://docs.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#sep-docreg.
|
||||
# The name given to this service account must match the value
|
||||
# given to containerRegistry below. The authentication used when
|
||||
# creating this service account was a personal access token
|
||||
# rather than a password to bypass 2FA. When Brad set this up,
|
||||
# Azure Pipelines failed to verify the credentials with an error
|
||||
# like "access is forbidden with a JWT issued from a personal
|
||||
# access token", but after saving them without verification, the
|
||||
# access token worked when the pipeline actually ran. "Grant
|
||||
# access to all pipelines" should also be checked on the service
|
||||
# account. The access token can be deleted on Docker Hub if
|
||||
# these credentials need to be revoked.
|
||||
- job: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
strategy:
|
||||
matrix:
|
||||
amd64:
|
||||
DOCKER_ARCH: amd64
|
||||
arm32v6:
|
||||
DOCKER_ARCH: arm32v6
|
||||
arm64v8:
|
||||
DOCKER_ARCH: arm64v8
|
||||
steps:
|
||||
- task: DownloadPipelineArtifact@2
|
||||
inputs:
|
||||
artifact: docker_$(DOCKER_ARCH)
|
||||
path: $(Build.SourcesDirectory)
|
||||
displayName: Retrieve Docker images
|
||||
- bash: set -e && docker load --input $(Build.SourcesDirectory)/images.tar
|
||||
displayName: Load Docker images
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_by_arch.sh $(dockerTag) $DOCKER_ARCH
|
||||
displayName: Deploy the Docker images by architecture
|
||||
- job: publish_docker_multiarch
|
||||
dependsOn: publish_docker_by_arch
|
||||
pool:
|
||||
vmImage: ubuntu-22.04
|
||||
steps:
|
||||
- task: Docker@2
|
||||
inputs:
|
||||
command: login
|
||||
containerRegistry: docker-hub
|
||||
displayName: Login to Docker Hub
|
||||
- bash: set -e && tools/docker/deploy_multiarch.sh $(dockerTag)
|
||||
displayName: Deploy the Docker multiarch manifests
|
||||
@@ -0,0 +1,6 @@
+stages:
+- stage: Deploy
+  jobs:
+  - template: ../jobs/common-deploy-jobs.yml
+    parameters:
+      snapReleaseChannel: edge
38  .azure-pipelines/templates/stages/release-deploy-stage.yml  Normal file
@@ -0,0 +1,38 @@
+stages:
+- stage: Deploy
+  jobs:
+  - template: ../jobs/common-deploy-jobs.yml
+    parameters:
+      snapReleaseChannel: beta
+  - job: create_github_release
+    pool:
+      vmImage: ubuntu-22.04
+    steps:
+    - task: DownloadPipelineArtifact@2
+      inputs:
+        artifact: changelog
+        path: '$(Pipeline.Workspace)'
+    - task: GitHubRelease@1
+      inputs:
+        # this "github-releases" credential is what azure pipelines calls a
+        # "service connection". it was created using the instructions at
+        # https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#github-service-connection
+        # with a fine-grained personal access token from github to limit
+        # the permissions given to azure pipelines. the connection on azure
+        # needs permissions for the "release" pipeline (and maybe the
+        # "full-test-suite" pipeline to simplify testing it). information
+        # on how to set up these permissions can be found at
+        # https://learn.microsoft.com/en-us/azure/devops/pipelines/library/service-endpoints?view=azure-devops&tabs=yaml#secure-a-service-connection.
+        # the github token that is used needs "contents:write" and
+        # "workflows:write" permissions for the certbot repo
+        #
+        # as of writing this, the current token will expire on 3/15/2025.
+        # when recreating it, you may also want to create it using the
+        # shared "certbotbot" github account so the credentials aren't tied
+        # to any one dev's github account and their access to the certbot
+        # repo
+        gitHubConnection: github-releases
+        title: ${{ format('Certbot {0}', replace(variables['Build.SourceBranchName'], 'v', '')) }}
+        releaseNotesFilePath: '$(Pipeline.Workspace)/release_notes.md'
+        assets: '$(Build.SourcesDirectory)/packages/{*.tar.gz,SHA256SUMS*}'
+        addChangeLog: false
@@ -1,9 +1,17 @@
 # This does not include the dependencies needed to build cryptography. See
 # https://cryptography.io/en/latest/installation/
 steps:
 # We run brew update because we've seen attempts to install an older version
 # of a package fail. See
 # https://github.com/actions/virtual-environments/issues/3165.
+#
+# We untap homebrew/core and homebrew/cask and unset HOMEBREW_NO_INSTALL_FROM_API (which
+# is set by the CI macOS env) because GitHub has been having issues, making these jobs
+# fail on git clones: https://github.com/orgs/Homebrew/discussions/4612.
+- bash: |
+    set -e
+    unset HOMEBREW_NO_INSTALL_FROM_API
+    brew untap homebrew/core homebrew/cask
     brew update
     brew install augeas
   condition: startswith(variables['IMAGE_NAME'], 'macOS')
@@ -12,14 +20,8 @@ steps:
     set -e
     sudo apt-get update
     sudo apt-get install -y --no-install-recommends \
-      python3-dev \
-      gcc \
       libaugeas0 \
-      libssl-dev \
-      libffi-dev \
-      ca-certificates \
-      nginx-light \
-      openssl
+      nginx-light
     sudo systemctl stop nginx
     sudo sysctl net.ipv4.ip_unprivileged_port_start=0
   condition: startswith(variables['IMAGE_NAME'], 'ubuntu')
@@ -42,7 +44,7 @@ steps:
     export TARGET_BRANCH="`echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g'`"
     [ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || export TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
     env
-    python3 -m tox
+    python3 -m tox run
   env:
     AWS_ACCESS_KEY_ID: $(AWS_ACCESS_KEY_ID)
     AWS_SECRET_ACCESS_KEY: $(AWS_SECRET_ACCESS_KEY)
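The `TARGET_BRANCH` derivation in the last hunk can be replayed outside the pipeline; the Azure-provided variables are stand-ins here:

```shell
#!/bin/sh
# Replays the TARGET_BRANCH logic from the hunk above with sample values.
# BUILD_SOURCEBRANCH and SYSTEM_PULLREQUEST_TARGETBRANCH are normally set by
# Azure Pipelines; these values are illustrative.
BUILD_SOURCEBRANCH="refs/heads/main"
SYSTEM_PULLREQUEST_TARGETBRANCH=""

# Strip the refs/heads/ or refs/tags/ prefix to get a plain branch/tag name.
TARGET_BRANCH="$(echo "${BUILD_SOURCEBRANCH}" | sed -E 's!refs/(heads|tags)/!!g')"
# On PR builds, the PR's target branch wins over the push branch.
[ -z "${SYSTEM_PULLREQUEST_TARGETBRANCH}" ] || TARGET_BRANCH="${SYSTEM_PULLREQUEST_TARGETBRANCH}"
echo "${TARGET_BRANCH}"    # prints "main"
```

The same expression also handles tag builds, since `refs/tags/v2.0.0` reduces to `v2.0.0`.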
19  .coveragerc
@@ -1,5 +1,24 @@
|
||||
[run]
|
||||
omit = */setup.py
|
||||
source =
|
||||
acme
|
||||
certbot
|
||||
certbot-apache
|
||||
certbot-dns-cloudflare
|
||||
certbot-dns-digitalocean
|
||||
certbot-dns-dnsimple
|
||||
certbot-dns-dnsmadeeasy
|
||||
certbot-dns-gehirn
|
||||
certbot-dns-google
|
||||
certbot-dns-linode
|
||||
certbot-dns-luadns
|
||||
certbot-dns-nsone
|
||||
certbot-dns-ovh
|
||||
certbot-dns-rfc2136
|
||||
certbot-dns-route53
|
||||
certbot-dns-sakuracloud
|
||||
certbot-nginx
|
||||
|
||||
[report]
|
||||
omit = */setup.py
|
||||
show_missing = True
|
||||
|
||||
12  .envrc
@@ -1,12 +0,0 @@
|
||||
# This file is just a nicety for developers who use direnv. When you cd under
|
||||
# the Certbot repo, Certbot's virtual environment will be automatically
|
||||
# activated and then deactivated when you cd elsewhere. Developers have to have
|
||||
# direnv set up and run `direnv allow` to allow this file to execute on their
|
||||
# system. You can find more information at https://direnv.net/.
|
||||
. venv/bin/activate
|
||||
# direnv doesn't support modifying PS1 so we unset it to squelch the error
|
||||
# it'll otherwise print about this being done in the activate script. See
|
||||
# https://github.com/direnv/direnv/wiki/PS1. If you would like your shell
|
||||
# prompt to change like it normally does, see
|
||||
# https://github.com/direnv/direnv/wiki/Python#restoring-the-ps1.
|
||||
unset PS1
|
||||
2  .github/pull_request_template.md  vendored
@@ -1,6 +1,6 @@
|
||||
## Pull Request Checklist
|
||||
|
||||
- [ ] The Certbot team has recently expressed interest in reviewing a PR for this. If not, this PR may be closed due our limited resources and need to prioritize how we spend them.
|
||||
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `master` section of `certbot/CHANGELOG.md` to include a description of the change being made.
|
||||
- [ ] If the change being made is to a [distributed component](https://certbot.eff.org/docs/contributing.html#code-components-and-layout), edit the `main` section of `certbot/CHANGELOG.md` to include a description of the change being made.
|
||||
- [ ] Add or update any documentation as needed to support the changes in this PR.
|
||||
- [ ] Include your name in `AUTHORS.md` if you like.
|
||||
|
||||
28  .github/workflows/merged.yaml  vendored
@@ -7,25 +7,15 @@ on:
|
||||
|
||||
jobs:
|
||||
if_merged:
|
||||
if: github.event.pull_request.merged == true
|
||||
# Forked repos can not access Mattermost secret.
|
||||
if: github.event.pull_request.merged == true && !github.event.pull_request.head.repo.fork
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Create Mattermost Message
|
||||
# https://docs.github.com/en/actions/security-guides/security-hardening-for-github-actions#example-of-a-script-injection-attack
|
||||
env:
|
||||
NUMBER: ${{ github.event.number }}
|
||||
PR_URL: https://github.com/${{ github.repository }}/pull/${{ github.event.number }}
|
||||
REPO: ${{ github.repository }}
|
||||
USER: ${{ github.actor }}
|
||||
TITLE: ${{ github.event.pull_request.title }}
|
||||
run: |
|
||||
jq --null-input \
|
||||
--arg number "$NUMBER" \
|
||||
--arg pr_url "$PR_URL" \
|
||||
--arg repo "$REPO" \
|
||||
--arg user "$USER" \
|
||||
--arg title "$TITLE" \
|
||||
'{ "text": "[\($repo)] | [\($title) #\($number)](\($pr_url)) was merged into master by \($user)" }' > mattermost.json
|
||||
- uses: mattermost/action-mattermost-notify@master
|
||||
env:
|
||||
- uses: mattermost/action-mattermost-notify@main
|
||||
with:
|
||||
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_MERGE_WEBHOOK }}
|
||||
TEXT: >
|
||||
[${{ github.repository }}] |
|
||||
[${{ github.event.pull_request.title }}
|
||||
#${{ github.event.number }}](https://github.com/${{ github.repository }}/pull/${{ github.event.number }})
|
||||
was merged into main by ${{ github.actor }}
|
||||
|
||||
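The jq invocation in the diff above passes untrusted values (PR title, actor) in through environment variables and `--arg` parameters instead of splicing `${{ ... }}` expressions into the script text, which keeps shell and JSON metacharacters inert. A minimal sketch of that pattern, with made-up values:

```shell
#!/bin/sh
# Sketch of the injection-safe pattern from the workflow diff above: the
# untrusted value only ever travels as data (an environment variable, then a
# jq --arg), never as shell program text. TITLE and NUMBER are made up.
TITLE='malicious `touch /tmp/pwned` title'
NUMBER=1234

# jq JSON-escapes --arg values, so the backtick payload stays a plain string.
jq --null-input \
  --arg title "$TITLE" \
  --arg number "$NUMBER" \
  '{ "text": "[\($title) #\($number)] was merged" }' > mattermost.json
cat mattermost.json
```

Interpolating the title directly into the `run:` script with `${{ github.event.pull_request.title }}` would instead execute the backtick payload.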
24  .github/workflows/notify_weekly.yaml  vendored
@@ -4,6 +4,7 @@ on:
|
||||
schedule:
|
||||
# Every week on Thursday @ 13:00
|
||||
- cron: "0 13 * * 4"
|
||||
workflow_dispatch:
|
||||
jobs:
|
||||
send-mattermost-message:
|
||||
runs-on: ubuntu-latest
|
||||
@@ -11,15 +12,16 @@ jobs:
|
||||
steps:
|
||||
- name: Create Mattermost Message
|
||||
run: |
|
||||
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
|
||||
MERGED_URL="https://github.com/pulls?q=merged%3A%3E${DATE}+org%3Acertbot"
|
||||
UPDATED_URL="https://github.com/pulls?q=updated%3A%3E${DATE}+org%3Acertbot"
|
||||
echo "{\"text\":\"## Updates Across Certbot Repos\n\n
|
||||
- Certbot team members SHOULD look at: [link]($MERGED_URL)\n\n
|
||||
- Certbot team members MAY also want to look at: [link]($UPDATED_URL)\n\n
|
||||
- Want to Discuss something today? Place it [here](https://docs.google.com/document/d/17YMUbtC1yg6MfiTMwT8zVm9LmO-cuGVBom0qFn8XJBM/edit?usp=sharing) and we can meet today on Zoom.\n\n
|
||||
- The key words SHOULD and MAY in this message are to be interpreted as described in [RFC 8147](https://www.rfc-editor.org/rfc/rfc8174). \"
|
||||
}" > mattermost.json
|
||||
- uses: mattermost/action-mattermost-notify@master
|
||||
env:
|
||||
DATE=$(date --date="7 days ago" +"%Y-%m-%d")
|
||||
echo "MERGED_URL=https://github.com/pulls?q=merged%3A%3E${DATE}+org%3Acertbot" >> $GITHUB_ENV
|
||||
echo "UPDATED_URL=https://github.com/pulls?q=updated%3A%3E${DATE}+org%3Acertbot" >> $GITHUB_ENV
|
||||
- uses: mattermost/action-mattermost-notify@main
|
||||
with:
|
||||
MATTERMOST_WEBHOOK_URL: ${{ secrets.MATTERMOST_WEBHOOK_URL }}
|
||||
MATTERMOST_CHANNEL: private-certbot
|
||||
TEXT: |
|
||||
## Updates Across Certbot Repos
|
||||
- Certbot team members SHOULD look at: [link](${{ env.MERGED_URL }})
|
||||
- Certbot team members MAY also want to look at: [link](${{ env.UPDATED_URL }})
|
||||
- Want to Discuss something today? Place it [here](https://docs.google.com/document/d/17YMUbtC1yg6MfiTMwT8zVm9LmO-cuGVBom0qFn8XJBM/edit?usp=sharing) and we can meet today on Zoom.
|
||||
- The key words SHOULD and MAY in this message are to be interpreted as described in [RFC 8147](https://www.rfc-editor.org/rfc/rfc8174).
|
||||
|
||||
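The run step above computes last week's date with `date --date="7 days ago"` and exports the two search URLs via `$GITHUB_ENV` for the later step to interpolate. A hedged Python equivalent of the URL construction (the function name is hypothetical):

```python
import datetime

def weekly_search_urls(today: datetime.date) -> dict:
    # Same arithmetic as `date --date="7 days ago" +"%Y-%m-%d"`.
    date = (today - datetime.timedelta(days=7)).isoformat()
    return {
        "MERGED_URL": f"https://github.com/pulls?q=merged%3A%3E{date}+org%3Acertbot",
        "UPDATED_URL": f"https://github.com/pulls?q=updated%3A%3E{date}+org%3Acertbot",
    }

urls = weekly_search_urls(datetime.date(2023, 6, 8))
print(urls["MERGED_URL"])
```

`%3A%3E` is simply the URL-encoded `:>`, so the queries mean "merged after DATE" and "updated after DATE" within the certbot org.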
.github/workflows/stale.yml (11 lines changed)

@@ -1,8 +1,9 @@
 name: Update Stale Issues
 on:
   schedule:
-    # Run 24 minutes past the hour 4 times a day.
-    - cron: '24 */6 * * *'
+    # Run 1:24AM every night
+    - cron: '24 1 * * *'
+  workflow_dispatch:
 permissions:
   issues: write
 jobs:
@@ -41,5 +42,7 @@ jobs:
             should be reopened, please open a new issue with a link to this one and we'll
             take a look.

-          # Limit the number of actions per hour, from 1-30. Default is 30
-          operations-per-run: 30
+          # Limit the number of actions per run. As of writing this, GitHub's
+          # rate limit is 1000 requests per hour so we're still a ways off. See
+          # https://docs.github.com/en/rest/overview/resources-in-the-rest-api?apiVersion=2022-11-28#rate-limits-for-requests-from-github-actions.
+          operations-per-run: 180
@@ -69,7 +69,7 @@ ignored-modules=
 # CERTBOT COMMENT
 # This is needed for pylint to import linter_plugin.py since
 # https://github.com/PyCQA/pylint/pull/3396.
-init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(pylint.config.PYLINTRC))"
+init-hook="import pylint.config, os, sys; sys.path.append(os.path.dirname(next(pylint.config.find_default_config_files())))"

 # Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
 # number of processors available to use, and will cap the count on Windows to
@@ -266,8 +266,8 @@ valid-metaclass-classmethod-first-arg=cls
 [EXCEPTIONS]

 # Exceptions that will emit a warning when caught.
-overgeneral-exceptions=BaseException,
-                       Exception
+overgeneral-exceptions=builtins.BaseException,
+                       builtins.Exception


 [FORMAT]
@@ -524,7 +524,7 @@ ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace,
 # List of module names for which member attributes should not be checked
 # (useful for modules/projects where namespaces are manipulated during runtime
 # and thus existing member attributes cannot be deduced by static analysis
-ignored-modules=pkg_resources,confargparse,argparse
+ignored-modules=confargparse,argparse

 # Show a hint with possible names when a member name was not found. The aspect
 # of finding the hint is based on edit distance.
@@ -94,6 +94,7 @@ Authors
 * [Felix Yan](https://github.com/felixonmars)
 * [Filip Ochnik](https://github.com/filipochnik)
 * [Florian Klink](https://github.com/flokli)
+* [Francesco Colista](https://github.com/fcolista)
 * [Francois Marier](https://github.com/fmarier)
 * [Frank](https://github.com/Frankkkkk)
 * [Frederic BLANC](https://github.com/fblanc)
@@ -123,6 +124,7 @@ Authors
 * [James Balazs](https://github.com/jamesbalazs)
 * [James Kasten](https://github.com/jdkasten)
 * [Jason Grinblat](https://github.com/ptychomancer)
+* [Jawshua](https://github.com/jawshua)
 * [Jay Faulkner](https://github.com/jayofdoom)
 * [J.C. Jones](https://github.com/jcjones)
 * [Jeff Hodges](https://github.com/jmhodges)
@@ -153,6 +155,7 @@ Authors
 * [LeCoyote](https://github.com/LeCoyote)
 * [Lee Watson](https://github.com/TheReverend403)
 * [Leo Famulari](https://github.com/lfam)
+* [Leon G](https://github.com/LeonGr)
 * [lf](https://github.com/lf-)
 * [Liam Marshall](https://github.com/liamim)
 * [Lior Sabag](https://github.com/liorsbg)
@@ -163,6 +166,7 @@ Authors
 * [Luca Ebach](https://github.com/lucebac)
 * [Luca Olivetti](https://github.com/olivluca)
 * [Luke Rogers](https://github.com/lukeroge)
+* [Lukhnos Liu](https://github.com/lukhnos)
 * [Maarten](https://github.com/mrtndwrd)
 * [Mads Jensen](https://github.com/atombrella)
 * [Maikel Martens](https://github.com/krukas)
@@ -207,6 +211,7 @@ Authors
 * [Patrick Heppler](https://github.com/PatrickHeppler)
 * [Paul Buonopane](https://github.com/Zenexer)
 * [Paul Feitzinger](https://github.com/pfeyz)
+* [Paulo Dias](https://github.com/paulojmdias)
 * [Pavan Gupta](https://github.com/pavgup)
 * [Pavel Pavlov](https://github.com/ghost355)
 * [Peter Conrad](https://github.com/pconrad-fb)
@@ -220,12 +225,14 @@ Authors
 * [Piotr Kasprzyk](https://github.com/kwadrat)
 * [Prayag Verma](https://github.com/pra85)
 * [Preston Locke](https://github.com/Preston12321)
+* [Q Misell](https://magicalcodewit.ch)
+* [Rasesh Patel](https://github.com/raspat1)
 * [Reinaldo de Souza Jr](https://github.com/juniorz)
 * [Remi Rampin](https://github.com/remram44)
 * [Rémy HUBSCHER](https://github.com/Natim)
 * [Rémy Léone](https://github.com/sieben)
 * [Richard Barnes](https://github.com/r-barnes)
 * [Richard Harman](https://github.com/warewolf)
 * [Richard Panek](https://github.com/kernelpanek)
 * [Robert Buchholz](https://github.com/rbu)
 * [Robert Dailey](https://github.com/pahrohfit)
@@ -1,21 +0,0 @@
-# This Dockerfile builds an image for development.
-FROM ubuntu:focal
-
-# Note: this only exposes the port to other docker containers.
-EXPOSE 80 443
-
-WORKDIR /opt/certbot/src
-
-COPY . .
-RUN apt-get update && \
-    DEBIAN_FRONTEND=noninteractive apt-get install apache2 git python3-dev \
-        python3-venv gcc libaugeas0 libssl-dev libffi-dev ca-certificates \
-        openssl nginx-light -y --no-install-recommends && \
-    apt-get clean && \
-    rm -rf /var/lib/apt/lists/* \
-           /tmp/* \
-           /var/tmp/*
-
-RUN VENV_NAME="../venv" python3 tools/venv.py
-
-ENV PATH /opt/certbot/venv/bin:$PATH
acme/.readthedocs.yaml (new file, 33 lines)

@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: acme/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: acme/readthedocs.org.requirements.txt
@@ -29,11 +29,9 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
         from acme.crypto_util import SSLSocket

         class _TestServer(socketserver.TCPServer):

-            def server_bind(self):  # pylint: disable=missing-docstring
-                self.socket = SSLSocket(socket.socket(),
-                                        certs)
-                socketserver.TCPServer.server_bind(self)
+            def __init__(self, *args, **kwargs):
+                super().__init__(*args, **kwargs)
+                self.socket = SSLSocket(self.socket, certs)

         self.server = _TestServer(('', 0), socketserver.BaseRequestHandler)
         self.port = self.server.socket.getsockname()[1]
@@ -44,6 +42,7 @@ class SSLSocketAndProbeSNITest(unittest.TestCase):
         if self.server_thread.is_alive():
             # The thread may have already terminated.
             self.server_thread.join()  # pragma: no cover
+        self.server.server_close()

     def _probe(self, name):
         from acme.crypto_util import probe_sni
@@ -34,7 +34,7 @@ class RFC3339FieldTest(unittest.TestCase):
     """Tests for acme.fields.RFC3339Field."""

     def setUp(self):
-        self.decoded = datetime.datetime(2015, 3, 27, tzinfo=pytz.utc)
+        self.decoded = datetime.datetime(2015, 3, 27, tzinfo=pytz.UTC)
         self.encoded = '2015-03-27T00:00:00Z'

     def test_default_encoder(self):
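`pytz.UTC` and `pytz.utc` are two names for the same singleton, so the change above is purely stylistic. With the standard library alone, the equivalent aware datetime from this test would be:

```python
import datetime

# Stdlib equivalent of datetime.datetime(2015, 3, 27, tzinfo=pytz.UTC).
decoded = datetime.datetime(2015, 3, 27, tzinfo=datetime.timezone.utc)
print(decoded.isoformat())
```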
@@ -55,6 +55,7 @@ class HTTP01ServerTest(unittest.TestCase):
     def tearDown(self):
         self.server.shutdown()
         self.thread.join()
+        self.server.server_close()

     def test_index(self):
         response = requests.get(
@@ -88,25 +89,25 @@ class HTTP01ServerTest(unittest.TestCase):

     def test_timely_shutdown(self):
         from acme.standalone import HTTP01Server
-        server = HTTP01Server(('', 0), resources=set(), timeout=0.05)
-        server_thread = threading.Thread(target=server.serve_forever)
-        server_thread.start()
+        with HTTP01Server(('', 0), resources=set(), timeout=0.05) as server:
+            server_thread = threading.Thread(target=server.serve_forever)
+            server_thread.start()

-        client = socket.socket()
-        client.connect(('localhost', server.socket.getsockname()[1]))
+            with socket.socket() as client:
+                client.connect(('localhost', server.socket.getsockname()[1]))

-        stop_thread = threading.Thread(target=server.shutdown)
-        stop_thread.start()
-        server_thread.join(5.)
+                stop_thread = threading.Thread(target=server.shutdown)
+                stop_thread.start()
+                server_thread.join(5.)

-        is_hung = server_thread.is_alive()
-        try:
-            client.shutdown(socket.SHUT_RDWR)
-        except:  # pragma: no cover, pylint: disable=bare-except
-            # may raise error because socket could already be closed
-            pass
+                is_hung = server_thread.is_alive()
+                try:
+                    client.shutdown(socket.SHUT_RDWR)
+                except:  # pragma: no cover, pylint: disable=bare-except
+                    # may raise error because socket could already be closed
+                    pass

-        assert not is_hung, 'Server shutdown should not be hung'
+                assert not is_hung, 'Server shutdown should not be hung'


 @unittest.skipIf(not challenges.TLSALPN01.is_supported(), "pyOpenSSL too old")
@@ -133,6 +134,7 @@ class TLSALPN01ServerTest(unittest.TestCase):
     def tearDown(self):
         self.server.shutdown()  # pylint: disable=no-member
         self.thread.join()
+        self.server.server_close()

     # TODO: This is not implemented yet, see comments in standalone.py
     # def test_certs(self):
@@ -214,6 +216,8 @@ class BaseDualNetworkedServersTest(unittest.TestCase):
         if prev_port:
             assert prev_port == port
         prev_port = port
+        for server in servers.servers:
+            server.server_close()


 class HTTP01DualNetworkedServersTest(unittest.TestCase):
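The `server_close()` calls added in the tearDown methods above matter because `shutdown()` only stops the `serve_forever()` loop; the listening socket stays open until `server_close()` runs (or the server is used as a context manager, as `test_timely_shutdown` now does). A minimal stdlib illustration:

```python
import socketserver

# Using the server as a context manager calls server_close() on exit,
# releasing the listening socket; shutdown() alone would not do this.
with socketserver.TCPServer(("127.0.0.1", 0),
                            socketserver.BaseRequestHandler) as server:
    port = server.server_address[1]
print("closed, was listening on port", port)
```

After the `with` block, the server's socket is closed, so the port can be rebound immediately — exactly the leak these test fixes are closing.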
@@ -4,20 +4,25 @@

 """
 import os
+import sys

 from cryptography.hazmat.backends import default_backend
 from cryptography.hazmat.primitives import serialization
 import josepy as jose
 from josepy.util import ComparableECKey
 from OpenSSL import crypto
-import pkg_resources
+
+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources


 def load_vector(*names):
     """Load contents of a test vector."""
-    # luckily, resource_string opens file in binary mode
-    return pkg_resources.resource_string(
-        __name__, os.path.join('testdata', *names))
+    vector_ref = importlib_resources.files(__package__).joinpath('testdata', *names)
+    return vector_ref.read_bytes()


 def _guess_loader(filename, loader_pem, loader_der):
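The new `load_vector` uses the `importlib.resources.files()` traversable API, the modern replacement for `pkg_resources.resource_string`. A hedged sketch of the same calls against a stdlib package, since acme's test package and its `testdata` directory are not available here:

```python
import importlib.resources

# files() returns a Traversable; joinpath()/read_bytes() mirror what
# load_vector does with the package's testdata directory.
ref = importlib.resources.files("email").joinpath("__init__.py")
data = ref.read_bytes()
print(type(data).__name__)
```

Like `resource_string`, `read_bytes()` returns raw bytes, so callers that fed the result to PEM/DER loaders need no change.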
@@ -12,7 +12,6 @@ from typing import List
 from typing import Mapping
 from typing import Optional
 from typing import Set
-from typing import Text
 from typing import Tuple
 from typing import Union

@@ -517,7 +516,7 @@ class ClientNetwork:
         self.account = account
         self.alg = alg
         self.verify_ssl = verify_ssl
-        self._nonces: Set[Text] = set()
+        self._nonces: Set[str] = set()
         self.user_agent = user_agent
         self.session = requests.Session()
         self._default_timeout = timeout
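`typing.Text` is just an alias of `str` (it existed for Python 2/3 compatibility), so the annotation change above does not alter runtime behavior:

```python
from typing import Text

# Text is literally the same object as str, so Set[Text] and Set[str]
# describe the same type.
print(Text is str)
```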
@@ -136,27 +136,33 @@ class SSLSocket:  # pylint: disable=too-few-public-methods
     def accept(self) -> Tuple[FakeConnection, Any]:  # pylint: disable=missing-function-docstring
         sock, addr = self.sock.accept()

-        context = SSL.Context(self.method)
-        context.set_options(SSL.OP_NO_SSLv2)
-        context.set_options(SSL.OP_NO_SSLv3)
-        context.set_tlsext_servername_callback(self._pick_certificate_cb)
-        if self.alpn_selection is not None:
-            context.set_alpn_select_callback(self.alpn_selection)
-
-        ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
-        ssl_sock.set_accept_state()
-
-        # This log line is especially desirable because without it requests to
-        # our standalone TLSALPN server would not be logged.
-        logger.debug("Performing handshake with %s", addr)
-        try:
-            ssl_sock.do_handshake()
-        except SSL.Error as error:
-            # _pick_certificate_cb might have returned without
-            # creating SSL context (wrong server name)
-            raise socket.error(error)
-
-        return ssl_sock, addr
+        try:
+            context = SSL.Context(self.method)
+            context.set_options(SSL.OP_NO_SSLv2)
+            context.set_options(SSL.OP_NO_SSLv3)
+            context.set_tlsext_servername_callback(self._pick_certificate_cb)
+            if self.alpn_selection is not None:
+                context.set_alpn_select_callback(self.alpn_selection)
+
+            ssl_sock = self.FakeConnection(SSL.Connection(context, sock))
+            ssl_sock.set_accept_state()
+
+            # This log line is especially desirable because without it requests to
+            # our standalone TLSALPN server would not be logged.
+            logger.debug("Performing handshake with %s", addr)
+            try:
+                ssl_sock.do_handshake()
+            except SSL.Error as error:
+                # _pick_certificate_cb might have returned without
+                # creating SSL context (wrong server name)
+                raise socket.error(error)
+
+            return ssl_sock, addr
+        except:
+            # If we encounter any error, close the new socket before reraising
+            # the exception.
+            sock.close()
+            raise


 def probe_sni(name: bytes, host: bytes, port: int = 443, timeout: int = 300,  # pylint: disable=too-many-arguments
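The `accept()` change wraps everything after the raw `accept()` in a try/except so the freshly accepted socket is closed before the exception propagates, preventing a file-descriptor leak. The pattern in isolation — the helper and the failing "handshake" below are made up for the demo:

```python
import socket

def wrap_or_close(sock: socket.socket, wrapper):
    # If wrapping/handshaking fails, close the accepted socket first,
    # then re-raise -- the same shape as the accept() fix above.
    try:
        return wrapper(sock)
    except BaseException:
        sock.close()
        raise

def failing_handshake(_sock):
    raise ValueError("handshake failed")

s = socket.socket()
try:
    wrap_or_close(s, failing_handshake)
except ValueError:
    pass
print(s.fileno())  # a closed socket reports -1
```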
@@ -34,7 +34,7 @@ class RFC3339Field(jose.Field):

     Handles decoding/encoding between RFC3339 strings and aware (not
     naive) `datetime.datetime` objects
-    (e.g. ``datetime.datetime.now(pytz.utc)``).
+    (e.g. ``datetime.datetime.now(pytz.UTC)``).

     """
@@ -29,7 +29,7 @@ class Header(jose.Header):

 class Signature(jose.Signature):
     """ACME-specific Signature. Uses ACME-specific Header for customer fields."""
-    __slots__ = jose.Signature._orig_slots  # type: ignore[attr-defined] # pylint: disable=protected-access,no-member
+    __slots__ = jose.Signature._orig_slots  # pylint: disable=protected-access,no-member

     # TODO: decoder/encoder should accept cls? Otherwise, subclassing
     # JSONObjectWithFields is tricky...
@@ -44,7 +44,7 @@ class Signature(jose.Signature):
 class JWS(jose.JWS):
     """ACME-specific JWS. Includes none, url, and kid in protected header."""
     signature_cls = Signature
-    __slots__ = jose.JWS._orig_slots  # type: ignore[attr-defined] # pylint: disable=protected-access
+    __slots__ = jose.JWS._orig_slots  # pylint: disable=protected-access

     @classmethod
     # type: ignore[override] # pylint: disable=arguments-differ
@@ -37,6 +37,7 @@ extensions = [
     'sphinx.ext.todo',
     'sphinx.ext.coverage',
     'sphinx.ext.viewcode',
+    'sphinx_rtd_theme',
 ]

 autodoc_member_order = 'bysource'
@@ -122,14 +123,7 @@ todo_include_todos = False
 # The theme to use for HTML and HTML Help pages. See the documentation for
 # a list of builtin themes.

-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'

 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the
@@ -3,6 +3,6 @@ usage: jws [-h] [--compact] {sign,verify} ...
 positional arguments:
   {sign,verify}

-optional arguments:
+options:
   -h, --help  show this help message and exit
   --compact
@@ -3,11 +3,13 @@ import sys
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
-    'cryptography>=2.5.0',
-    'josepy>=1.13.0',
+    'cryptography>=3.2.1',
+    # Josepy 2+ may introduce backward incompatible changes by droping usage of
+    # deprecated PyOpenSSL APIs.
+    'josepy>=1.13.0, <2',
     # pyOpenSSL 23.1.0 is a bad release: https://github.com/pyca/pyopenssl/issues/1199
     'PyOpenSSL>=17.5.0,!=23.1.0',
     'pyrfc3339',
@@ -22,6 +24,15 @@ docs_extras = [
 ]

 test_extras = [
+    # In theory we could scope importlib_resources to env marker 'python_version<"3.9"'. But this
+    # makes the pinning mechanism emit warnings when running `poetry lock` because in the corner
+    # case of an extra dependency with env marker coming from a setup.py file, it generate the
+    # invalid requirement 'importlib_resource>=1.3.1;python<=3.9;extra=="test"'.
+    # To fix the issue, we do not pass the env marker. This is fine because:
+    # - importlib_resources can be applied to any Python version,
+    # - this is a "test" extra dependency for limited audience,
+    # - it does not change anything at the end for the generated requirement files.
+    'importlib_resources>=1.3.1',
     'pytest',
     'pytest-xdist',
     'typing-extensions',
@@ -31,22 +42,22 @@ setup(
     name='acme',
     version=version,
     description='ACME protocol implementation in Python',
-    url='https://github.com/letsencrypt/letsencrypt',
+    url='https://github.com/certbot/certbot',
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Intended Audience :: Developers',
         'License :: OSI Approved :: Apache Software License',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
     ],
@@ -1,21 +1,28 @@
 """ Utility functions for certbot-apache plugin """
+import atexit
 import binascii
 import fnmatch
 import logging
 import re
 import subprocess
+import sys
+from contextlib import ExitStack
 from typing import Dict
 from typing import Iterable
 from typing import List
 from typing import Optional
 from typing import Tuple

-import pkg_resources
-
 from certbot import errors
 from certbot import util
 from certbot.compat import os

+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources
+

 logger = logging.getLogger(__name__)

@@ -248,6 +255,8 @@ def find_ssl_apache_conf(prefix: str) -> str:
     :return: the path the TLS Apache config file
     :rtype: str
     """
-    return pkg_resources.resource_filename(
-        "certbot_apache",
-        os.path.join("_internal", "tls_configs", "{0}-options-ssl-apache.conf".format(prefix)))
+    file_manager = ExitStack()
+    atexit.register(file_manager.close)
+    ref = (importlib_resources.files("certbot_apache").joinpath("_internal")
+           .joinpath("tls_configs").joinpath("{0}-options-ssl-apache.conf".format(prefix)))
+    return str(file_manager.enter_context(importlib_resources.as_file(ref)))
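The `ExitStack` + `atexit` pattern above materializes a packaged resource as a real filesystem path for the whole process lifetime; this is needed because `importlib_resources.as_file()` is a context manager whose temporary copy (when the package is zipped) disappears once the context closes. The pattern demonstrated against a stdlib package rather than `certbot_apache`:

```python
import atexit
from contextlib import ExitStack
import importlib.resources

# Keep the as_file() context open for the life of the process; atexit
# closes the stack (removing any temporary copy) at interpreter exit.
_file_manager = ExitStack()
atexit.register(_file_manager.close)

ref = importlib.resources.files("email").joinpath("__init__.py")
path = _file_manager.enter_context(importlib.resources.as_file(ref))
print(path.name)
```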
@@ -1,10 +1,14 @@
 """Apache plugin constants."""
+import atexit
+import sys
+from contextlib import ExitStack
 from typing import Dict
 from typing import List

-import pkg_resources
-
 from certbot.compat import os

+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources

 MOD_SSL_CONF_DEST = "options-ssl-apache.conf"
 """Name of the mod_ssl config file as saved

@@ -37,8 +41,15 @@ ALL_SSL_OPTIONS_HASHES: List[str] = [
 ]
 """SHA256 hashes of the contents of previous versions of all versions of MOD_SSL_CONF_SRC"""

-AUGEAS_LENS_DIR = pkg_resources.resource_filename(
-    "certbot_apache", os.path.join("_internal", "augeas_lens"))
+def _generate_augeas_lens_dir_static() -> str:
+    # This code ensures that the resource is accessible as file for the lifetime of current
+    # Python process, and will be automatically cleaned up on exit.
+    file_manager = ExitStack()
+    atexit.register(file_manager.close)
+    augeas_lens_dir_ref = importlib_resources.files("certbot_apache") / "_internal" / "augeas_lens"
+    return str(file_manager.enter_context(importlib_resources.as_file(augeas_lens_dir_ref)))
+
+AUGEAS_LENS_DIR = _generate_augeas_lens_dir_static()
 """Path to the Augeas lens directory"""

 REWRITE_HTTPS_ARGS: List[str] = [
@@ -4,6 +4,7 @@ from typing import Type

 from certbot import util
 from certbot_apache._internal import configurator
+from certbot_apache._internal import override_alpine
 from certbot_apache._internal import override_arch
 from certbot_apache._internal import override_centos
 from certbot_apache._internal import override_darwin
@@ -14,6 +15,7 @@ from certbot_apache._internal import override_suse
 from certbot_apache._internal import override_void

 OVERRIDE_CLASSES: Dict[str, Type[configurator.ApacheConfigurator]] = {
+    "alpine": override_alpine.AlpineConfigurator,
     "arch": override_arch.ArchConfigurator,
     "cloudlinux": override_centos.CentOSConfigurator,
     "darwin": override_darwin.DarwinConfigurator,
certbot-apache/certbot_apache/_internal/override_alpine.py (new file, 19 lines)

@@ -0,0 +1,19 @@
+""" Distribution specific override class for Alpine Linux """
+from certbot_apache._internal import configurator
+from certbot_apache._internal.configurator import OsOptions
+
+
+class AlpineConfigurator(configurator.ApacheConfigurator):
+    """Alpine Linux specific ApacheConfigurator override class"""
+
+    OS_DEFAULTS = OsOptions(
+        server_root="/etc/apache2",
+        vhost_root="/etc/apache2/conf.d",
+        vhost_files="*.conf",
+        logs_root="/var/log/apache2",
+        ctl="apachectl",
+        version_cmd=['apachectl', '-v'],
+        restart_cmd=['apachectl', 'graceful'],
+        conftest_cmd=['apachectl', 'configtest'],
+        challenge_location="/etc/apache2/conf.d",
+    )
@@ -14,7 +14,7 @@ SCRIPT_DIRNAME = os.path.dirname(__file__)

 def main() -> int:
     args = sys.argv[1:]
-    with acme_server.ACMEServer('pebble', [], False) as acme_xdist:
+    with acme_server.ACMEServer([], False) as acme_xdist:
         environ = os.environ.copy()
         environ['SERVER'] = acme_xdist['directory_url']
         command = [os.path.join(SCRIPT_DIRNAME, 'apache-conf-test')]
@@ -128,11 +128,11 @@ class AutoHSTSTest(util.ApacheTest):
             max_val

     def test_autohsts_update_noop(self):
-        with mock.patch("time.time") as mock_time:
+        with mock.patch("certbot_apache._internal.configurator.time") as mock_time_module:
             # Time mock is used to make sure that the execution does not
             # continue when no autohsts entries exist in pluginstorage
             self.config.update_autohsts(mock.MagicMock())
-            assert mock_time.called is False
+            assert not mock_time_module.time.called

     def test_autohsts_make_permanent_noop(self):
         self.config.storage.put = mock.MagicMock()
@@ -5,7 +5,6 @@ import shutil
 import socket
 import sys
 import tempfile
-import unittest
 from unittest import mock

 import pytest
@@ -1659,20 +1658,23 @@ class InstallSslOptionsConfTest(util.ApacheTest):
         file has been manually edited by the user, and will refuse to update it.
         This test ensures that all necessary hashes are present.
         """
-        import pkg_resources
+        if sys.version_info >= (3, 9):  # pragma: no cover
+            import importlib.resources as importlib_resources
+        else:  # pragma: no cover
+            import importlib_resources

         from certbot_apache._internal.constants import ALL_SSL_OPTIONS_HASHES

-        tls_configs_dir = pkg_resources.resource_filename(
-            "certbot_apache", os.path.join("_internal", "tls_configs"))
-        all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
-                     if name.endswith('options-ssl-apache.conf')]
-        assert len(all_files) >= 1
-        for one_file in all_files:
-            file_hash = crypto_util.sha256sum(one_file)
-            assert file_hash in ALL_SSL_OPTIONS_HASHES, \
-                f"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 " \
-                f"hash of {one_file} when it is updated."
+        ref = importlib_resources.files("certbot_apache") / "_internal" / "tls_configs"
+        with importlib_resources.as_file(ref) as tls_configs_dir:
+            all_files = [os.path.join(tls_configs_dir, name) for name in os.listdir(tls_configs_dir)
+                         if name.endswith('options-ssl-apache.conf')]
+            assert len(all_files) >= 1
+            for one_file in all_files:
+                file_hash = crypto_util.sha256sum(one_file)
+                assert file_hash in ALL_SSL_OPTIONS_HASHES, \
+                    f"Constants.ALL_SSL_OPTIONS_HASHES must be appended with the sha256 " \
+                    f"hash of {one_file} when it is updated."

     def test_openssl_version(self):
         self.config._openssl_version = None
@@ -22,7 +22,7 @@ class ApacheTest(unittest.TestCase):
         # pylint: disable=arguments-differ
         self.temp_dir, self.config_dir, self.work_dir = common.dir_setup(
             test_dir=test_dir,
-            pkg=__name__)
+            pkg=__package__)

         self.config_path = os.path.join(self.temp_dir, config_root)
         self.vhost_path = os.path.join(self.temp_dir, vhost_root)
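The switch from `pkg=__name__` to `pkg=__package__` matters because, inside a module, `__name__` is the full dotted module name while `__package__` names only the containing package, which is what resource lookups expect:

```python
import email.message

# For a module inside a package, __name__ includes the module itself,
# while __package__ names only the package.
print(email.message.__name__)
print(email.message.__package__)
```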
@@ -1,7 +1,7 @@
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
     # We specify the minimum acme and certbot version as the current plugin
@@ -9,6 +9,7 @@ install_requires = [
     # https://github.com/certbot/certbot/issues/8761 for more info.
     f'acme>={version}',
     f'certbot>={version}',
+    'importlib_resources>=1.3.1; python_version < "3.9"',
     'python-augeas',
     'setuptools>=41.6.0',
 ]
@@ -25,11 +26,11 @@ setup(
     name='certbot-apache',
     version=version,
     description="Apache plugin for Certbot",
-    url='https://github.com/letsencrypt/letsencrypt',
+    url='https://github.com/certbot/certbot',
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -38,11 +39,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',

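The new `importlib_resources` requirement uses a PEP 508 environment marker, so the backport is only installed on Python < 3.9. A minimal sketch of the version-gated import this enables (the same pattern the later hunks add to each module):

```python
import sys

# On Python >= 3.9 the stdlib importlib.resources implements the files() API;
# older interpreters fall back to the importlib_resources backport that the
# environment marker in setup.py pulls in.
if sys.version_info >= (3, 9):
    import importlib.resources as importlib_resources
else:
    import importlib_resources

# Either module exposes the same files()/as_file() interface.
print(hasattr(importlib_resources, 'files'))
```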
@@ -23,7 +23,6 @@ class IntegrationTestsContext:
             self.worker_id = 'primary'
             acme_xdist = request.config.acme_xdist  # type: ignore[attr-defined]

-        self.acme_server = acme_xdist['acme_server']
         self.directory_url = acme_xdist['directory_url']
         self.tls_alpn_01_port = acme_xdist['https_port'][self.worker_id]
         self.http_01_port = acme_xdist['http_port'][self.worker_id]

@@ -7,7 +7,6 @@ import shutil
 import subprocess
 import time
 from typing import Generator
-from typing import Iterable
 from typing import Tuple
 from typing import Type

@@ -82,11 +81,9 @@ def test_registration_override(context: IntegrationTestsContext) -> None:
     context.certbot(['update_account', '--email', 'ex1@domain.org,ex2@domain.org'])
     stdout2, _ = context.certbot(['show_account'])

-    # https://github.com/letsencrypt/boulder/issues/6144
-    if context.acme_server != 'boulder-v2':
-        assert 'example@domain.org' in stdout1, "New email should be present"
-        assert 'example@domain.org' not in stdout2, "Old email should not be present"
-        assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"
+    assert 'example@domain.org' in stdout1, "New email should be present"
+    assert 'example@domain.org' not in stdout2, "Old email should not be present"
+    assert 'ex1@domain.org, ex2@domain.org' in stdout2, "New emails should be present"


 def test_prepare_plugins(context: IntegrationTestsContext) -> None:

@@ -566,19 +563,15 @@ def test_default_rsa_size(context: IntegrationTestsContext) -> None:
     assert_rsa_key(key1, 2048)


-@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
-    # Curve name, Curve class, ACME servers to skip
-    ('secp256r1', SECP256R1, []),
-    ('secp384r1', SECP384R1, []),
-    ('secp521r1', SECP521R1, ['boulder-v2'])]
+@pytest.mark.parametrize('curve,curve_cls', [
+    ('secp256r1', SECP256R1),
+    ('secp384r1', SECP384R1),
+    ('secp521r1', SECP521R1)]
 )
-def test_ecdsa_curves(context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
-                      skip_servers: Iterable[str]) -> None:
+def test_ecdsa_curves(context: IntegrationTestsContext, curve: str,
+                      curve_cls: Type[EllipticCurve]) -> None:
     """Test issuance for each supported ECDSA curve"""
-    if context.acme_server in skip_servers:
-        pytest.skip('ACME server {} does not support ECDSA curve {}'
-                    .format(context.acme_server, curve))
-
     domain = context.get_domain('curve')
     context.certbot([
         'certonly',

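With `skip_servers` gone, every parametrized curve now runs unconditionally against Pebble. A hand-rolled sketch of what `@pytest.mark.parametrize` does with such a table (the issuance body here is a hypothetical stand-in, not the real test):

```python
# One (name, bits) tuple per curve, mirroring the parametrize table above.
CURVES = [('secp256r1', 256), ('secp384r1', 384), ('secp521r1', 521)]

def issue_for_curve(curve: str, bits: int) -> str:
    # stand-in for the real certonly run plus key assertion
    return f'{curve}:{bits}'

# parametrize expands the single test function into one call per tuple
results = [issue_for_curve(name, bits) for name, bits in CURVES]
print(results)
```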
@@ -640,9 +633,6 @@ def test_renew_with_ec_keys(context: IntegrationTestsContext) -> None:


 def test_ocsp_must_staple(context: IntegrationTestsContext) -> None:
     """Test that OCSP Must-Staple is correctly set in the generated certificate."""
-    if context.acme_server == 'pebble':
-        pytest.skip('Pebble does not support OCSP Must-Staple.')
-
     certname = context.get_domain('must-staple')
     context.certbot(['auth', '--must-staple', '--domains', certname])

@@ -710,17 +700,14 @@ def test_revoke_and_unregister(context: IntegrationTestsContext) -> None:
     assert cert3 in stdout


-@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
-    ('secp256r1', SECP256R1, []),
-    ('secp384r1', SECP384R1, []),
-    ('secp521r1', SECP521R1, ['boulder-v2'])]
+@pytest.mark.parametrize('curve,curve_cls', [
+    ('secp256r1', SECP256R1),
+    ('secp384r1', SECP384R1),
+    ('secp521r1', SECP521R1)]
 )
 def test_revoke_ecdsa_cert_key(
-        context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
-        skip_servers: Iterable[str]) -> None:
+        context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
     """Test revoking a certificate"""
-    if context.acme_server in skip_servers:
-        pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
     cert: str = context.get_domain('curve')
     context.certbot([
         'certonly',

@@ -738,17 +725,14 @@ def test_revoke_ecdsa_cert_key(
     assert stdout.count('INVALID: REVOKED') == 1, 'Expected {0} to be REVOKED'.format(cert)


-@pytest.mark.parametrize('curve,curve_cls,skip_servers', [
-    ('secp256r1', SECP256R1, []),
-    ('secp384r1', SECP384R1, []),
-    ('secp521r1', SECP521R1, ['boulder-v2'])]
+@pytest.mark.parametrize('curve,curve_cls', [
+    ('secp256r1', SECP256R1),
+    ('secp384r1', SECP384R1),
+    ('secp521r1', SECP521R1)]
 )
 def test_revoke_ecdsa_cert_key_delete(
-        context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve],
-        skip_servers: Iterable[str]) -> None:
+        context: IntegrationTestsContext, curve: str, curve_cls: Type[EllipticCurve]) -> None:
     """Test revoke and deletion for each supported curve type"""
-    if context.acme_server in skip_servers:
-        pytest.skip(f'ACME server {context.acme_server} does not support ECDSA curve {curve}')
     cert: str = context.get_domain('curve')
     context.certbot([
         'certonly',

@@ -913,7 +897,7 @@ def test_dry_run_deactivate_authzs(context: IntegrationTestsContext) -> None:
 def test_preferred_chain(context: IntegrationTestsContext) -> None:
     """Test that --preferred-chain results in the correct chain.pem being produced"""
     try:
-        issuers = misc.get_acme_issuers(context)
+        issuers = misc.get_acme_issuers()
     except NotImplementedError:
         pytest.skip('This ACME server does not support alternative issuers.')

@@ -8,7 +8,6 @@ for a directory a specific configuration using built-in pytest hooks.
 See https://docs.pytest.org/en/latest/reference.html#hook-reference
 """
 import contextlib
-import subprocess
 import sys

 from certbot_integration_tests.utils import acme_server as acme_lib

@@ -20,10 +19,6 @@ def pytest_addoption(parser):
     Standard pytest hook to add options to the pytest parser.
     :param parser: current pytest parser that will be used on the CLI
     """
-    parser.addoption('--acme-server', default='pebble',
-                     choices=['boulder-v2', 'pebble'],
-                     help='select the ACME server to use (boulder-v2, pebble), '
-                          'defaulting to pebble')
     parser.addoption('--dns-server', default='challtestsrv',
                      choices=['bind', 'challtestsrv'],
                      help='select the DNS server to use (bind, challtestsrv), '

@@ -69,7 +64,7 @@ def _setup_primary_node(config):
     Setup the environment for integration tests.

     This function will:
-    - check runtime compatibility (Docker, docker-compose, Nginx)
+    - check runtime compatibility (Docker, docker compose, Nginx)
     - create a temporary workspace and the persistent GIT repositories space
     - configure and start a DNS server using Docker, if configured
     - configure and start paralleled ACME CA servers using Docker

@@ -80,22 +75,6 @@ def _setup_primary_node(config):

     :param config: Configuration of the pytest primary node. Is modified by this function.
     """
-    # Check for runtime compatibility: some tools are required to be available in PATH
-    if 'boulder' in config.option.acme_server:
-        try:
-            subprocess.check_output(['docker', '-v'], stderr=subprocess.STDOUT)
-        except (subprocess.CalledProcessError, OSError):
-            raise ValueError('Error: docker is required in PATH to launch the integration tests on'
-                             'boulder, but is not installed or not available for current user.')
-
-    try:
-        subprocess.check_output(['docker-compose', '-v'], stderr=subprocess.STDOUT)
-    except (subprocess.CalledProcessError, OSError):
-        raise ValueError(
-            'Error: docker-compose is required in PATH to launch the integration tests, '
-            'but is not installed or not available for current user.'
-        )
-
     # Parameter numprocesses is added to option by pytest-xdist
     workers = ['primary'] if not config.option.numprocesses\
         else ['gw{0}'.format(i) for i in range(config.option.numprocesses)]

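The worker list computed above mirrors pytest-xdist's naming: `gw0`..`gwN-1` when `-n N` is passed, or a single `primary` node otherwise. As a standalone sketch:

```python
def worker_names(numprocesses: int) -> list:
    # pytest-xdist names its workers gw0..gwN-1; without -n, everything
    # runs on the single 'primary' node.
    return ['primary'] if not numprocesses \
        else ['gw{0}'.format(i) for i in range(numprocesses)]

print(worker_names(0))
print(worker_names(3))
```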
@@ -115,8 +94,7 @@ def _setup_primary_node(config):

     # By calling setup_acme_server we ensure that all necessary acme server instances will be
     # fully started. This runtime is reflected by the acme_xdist returned.
-    acme_server = acme_lib.ACMEServer(config.option.acme_server, workers,
-                                      dns_server=acme_dns_server)
+    acme_server = acme_lib.ACMEServer(workers, dns_server=acme_dns_server)
     config.add_cleanup(acme_server.stop)
     print('ACME xdist config:\n{0}'.format(acme_server.acme_xdist))
     acme_server.start()

@@ -1,9 +1,15 @@
 # -*- coding: utf-8 -*-
 """General purpose nginx test configuration generator."""
+import atexit
 import getpass
+import sys
+from contextlib import ExitStack
 from typing import Optional

-import pkg_resources
+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources


 def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int, https_port: int,

@@ -23,10 +29,20 @@ def construct_nginx_config(nginx_root: str, nginx_webroot: str, http_port: int,
     :return: a string containing the full nginx configuration
     :rtype: str
     """
-    key_path = key_path if key_path \
-        else pkg_resources.resource_filename('certbot_integration_tests', 'assets/key.pem')
-    cert_path = cert_path if cert_path \
-        else pkg_resources.resource_filename('certbot_integration_tests', 'assets/cert.pem')
+    if not key_path:
+        file_manager = ExitStack()
+        atexit.register(file_manager.close)
+        ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
+               .joinpath('key.pem'))
+        key_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))
+
+    if not cert_path:
+        file_manager = ExitStack()
+        atexit.register(file_manager.close)
+        ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
+               .joinpath('cert.pem'))
+        cert_path = str(file_manager.enter_context(importlib_resources.as_file(ref)))

     return '''\
 # This error log will be written regardless of server scope error_log
 # definitions, so we have to set this here in the main scope.

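The key/cert fallback above leans on `importlib.resources.as_file()`, which may materialize a packaged resource to a temporary file; entering it on an `ExitStack` whose `close` is deferred to `atexit` keeps that path valid for the rest of the process, which nginx needs since it reads the files later. A stdlib-only sketch of the same pattern, using the `email` package as a stand-in resource:

```python
import atexit
from contextlib import ExitStack
from importlib import resources  # stdlib form of the API on Python >= 3.9

file_manager = ExitStack()
atexit.register(file_manager.close)  # defer cleanup to interpreter exit

# as_file() yields a real filesystem path even when the package is zipped;
# keeping the context open on the ExitStack keeps any temporary copy alive.
ref = resources.files('email').joinpath('__init__.py')
path = str(file_manager.enter_context(resources.as_file(ref)))
print(path.endswith('__init__.py'))
```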
@@ -1,17 +1,21 @@
 """Module to handle the context of RFC2136 integration tests."""

 from contextlib import contextmanager
+import sys
 import tempfile
 from typing import Generator
 from typing import Iterable
 from typing import Tuple

-from pkg_resources import resource_filename
 import pytest

 from certbot_integration_tests.certbot_tests import context as certbot_context
 from certbot_integration_tests.utils import certbot_call

+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources


 class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
     """Integration test context for certbot-dns-rfc2136"""

@@ -44,15 +48,14 @@ class IntegrationTestsContext(certbot_context.IntegrationTestsContext):
         :yields: Path to credentials file
         :rtype: str
         """
-        src_file = resource_filename('certbot_integration_tests',
-                                     'assets/bind-config/rfc2136-credentials-{}.ini.tpl'
-                                     .format(label))
-
-        with open(src_file, 'r') as f:
-            contents = f.read().format(
-                server_address=self._dns_xdist['address'],
-                server_port=self._dns_xdist['port']
-            )
+        src_ref_file = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
+                        .joinpath('bind-config').joinpath(f'rfc2136-credentials-{label}.ini.tpl'))
+        with importlib_resources.as_file(src_ref_file) as src_file:
+            with open(src_file, 'r') as f:
+                contents = f.read().format(
+                    server_address=self._dns_xdist['address'],
+                    server_port=self._dns_xdist['port']
+                )

         with tempfile.NamedTemporaryFile('w+', prefix='rfc2136-creds-{}'.format(label),
                                          suffix='.ini', dir=self.workspace) as fp:

@@ -5,7 +5,6 @@ import argparse
 import errno
 import json
 import os
-from os.path import join
 import shutil
 import subprocess
 import sys

@@ -18,14 +17,12 @@ from typing import Dict
 from typing import List
 from typing import Mapping
 from typing import Optional
 from typing import Tuple
 from typing import Type

-import requests

 # pylint: disable=wildcard-import,unused-wildcard-import
 from certbot_integration_tests.utils import misc
 from certbot_integration_tests.utils import pebble_artifacts
+from certbot_integration_tests.utils import pebble_ocsp_server
 from certbot_integration_tests.utils import proxy
 from certbot_integration_tests.utils.constants import *

@@ -42,34 +39,30 @@ class ACMEServer:
     ACMEServer is also a context manager, and so can be used to ensure ACME server is
     started/stopped upon context enter/exit.
     """
-    def __init__(self, acme_server: str, nodes: List[str], http_proxy: bool = True,
+    def __init__(self, nodes: List[str], http_proxy: bool = True,
                  stdout: bool = False, dns_server: Optional[str] = None,
                  http_01_port: Optional[int] = None) -> None:
         """
         Create an ACMEServer instance.
-        :param str acme_server: the type of acme server used (boulder-v2 or pebble)
         :param list nodes: list of node names that will be setup by pytest xdist
         :param bool http_proxy: if False do not start the HTTP proxy
         :param bool stdout: if True stream all subprocesses stdout to standard stdout
-        :param str dns_server: if set, Pebble/Boulder will use it to resolve domains
+        :param str dns_server: if set, Pebble will use it to resolve domains
         :param int http_01_port: port to use for http-01 validation; currently
             only supported for pebble without an HTTP proxy
         """
-        self._construct_acme_xdist(acme_server, nodes)
+        self._construct_acme_xdist(nodes)

-        self._acme_type = 'pebble' if acme_server == 'pebble' else 'boulder'
         self._proxy = http_proxy
         self._workspace = tempfile.mkdtemp()
         self._processes: List[subprocess.Popen] = []
         self._stdout = sys.stdout if stdout else open(os.devnull, 'w')  # pylint: disable=consider-using-with
         self._dns_server = dns_server
         self._preterminate_cmds_args: List[Tuple[Tuple[Any, ...], Dict[str, Any]]] = []
-        self._http_01_port = BOULDER_HTTP_01_PORT if self._acme_type == 'boulder' \
-            else DEFAULT_HTTP_01_PORT
+        self._http_01_port = DEFAULT_HTTP_01_PORT
         if http_01_port:
-            if (self._acme_type == 'pebble' and self._proxy) or self._acme_type == 'boulder':
+            if self._proxy:
                 raise ValueError('Setting http_01_port is not currently supported when '
-                                 'using Boulder or the HTTP proxy')
+                                 'using the HTTP proxy')
             self._http_01_port = http_01_port

     def start(self) -> None:

@@ -77,10 +70,7 @@ class ACMEServer:
         try:
             if self._proxy:
                 self._prepare_http_proxy()
-            if self._acme_type == 'pebble':
-                self._prepare_pebble_server()
-            if self._acme_type == 'boulder':
-                self._prepare_boulder_server()
+            self._prepare_pebble_server()
         except BaseException as e:
             self.stop()
             raise e

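The `start()` hunk keeps its cleanup guard: if any preparation step raises, `stop()` runs before the exception propagates, so no half-started processes are orphaned. A minimal stand-in showing the pattern:

```python
events = []

def prepare_pebble_server():
    # simulate a startup failure partway through
    raise RuntimeError('pebble failed to start')

def stop():
    events.append('stop')

def start():
    try:
        events.append('proxy')          # a step that already succeeded
        prepare_pebble_server()
    except BaseException:
        stop()                          # tear down what was launched
        raise                           # then let the error propagate

try:
    start()
except RuntimeError:
    events.append('caught')
print(events)
```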
@@ -89,7 +79,6 @@ class ACMEServer:
         """Stop the test stack, and clean its resources"""
         print('=> Tear down the test infrastructure...')
         try:
-            self._run_preterminate_cmds()
             for process in self._processes:
                 try:
                     process.terminate()

@@ -115,19 +104,14 @@ class ACMEServer:
                  traceback: Optional[TracebackType]) -> None:
         self.stop()

-    def _construct_acme_xdist(self, acme_server: str, nodes: List[str]) -> None:
+    def _construct_acme_xdist(self, nodes: List[str]) -> None:
         """Generate and return the acme_xdist dict"""
-        acme_xdist: Dict[str, Any] = {'acme_server': acme_server}
+        acme_xdist: Dict[str, Any] = {}

         # Directory and ACME port are set implicitly in the docker-compose.yml
-        # files of Boulder/Pebble.
-        if acme_server == 'pebble':
-            acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
-            acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
-        else:  # boulder
-            acme_xdist['directory_url'] = BOULDER_V2_DIRECTORY_URL
-            acme_xdist['challtestsrv_url'] = BOULDER_V2_CHALLTESTSRV_URL
-
+        # files of Pebble.
+        acme_xdist['directory_url'] = PEBBLE_DIRECTORY_URL
+        acme_xdist['challtestsrv_url'] = PEBBLE_CHALLTESTSRV_URL
         acme_xdist['http_port'] = dict(zip(nodes, range(5200, 5200 + len(nodes))))
         acme_xdist['https_port'] = dict(zip(nodes, range(5100, 5100 + len(nodes))))
         acme_xdist['other_port'] = dict(zip(nodes, range(5300, 5300 + len(nodes))))

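The port maps built above give each xdist worker a disjoint set of ports by zipping the worker names with consecutive ranges. Standalone:

```python
nodes = ['gw0', 'gw1', 'gw2']

# each worker gets its own http/https port, offset by its index in the list
http_port = dict(zip(nodes, range(5200, 5200 + len(nodes))))
https_port = dict(zip(nodes, range(5100, 5100 + len(nodes))))

print(http_port)
print(https_port['gw2'])
```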
@@ -161,11 +145,6 @@ class ACMEServer:
             [pebble_path, '-config', pebble_config_path, '-dnsserver', dns_server, '-strict'],
             env=environ)

-        # pebble_ocsp_server is imported here and not at the top of module in order to avoid a
-        # useless ImportError, in the case where cryptography dependency is too old to support
-        # ocsp, but Boulder is used instead of Pebble, so pebble_ocsp_server is not used. This is
-        # the typical situation of integration-certbot-oldest tox testenv.
-        from certbot_integration_tests.utils import pebble_ocsp_server
         self._launch_process([sys.executable, pebble_ocsp_server.__file__])

         # Wait for the ACME CA server to be up.

@@ -174,68 +153,6 @@ class ACMEServer:

         print('=> Finished pebble instance deployment.')

-    def _prepare_boulder_server(self) -> None:
-        """Configure and launch the Boulder server"""
-        print('=> Starting boulder instance deployment...')
-        instance_path = join(self._workspace, 'boulder')
-
-        # Load Boulder from git, that includes a docker-compose.yml ready for production.
-        process = self._launch_process(['git', 'clone', 'https://github.com/letsencrypt/boulder',
-                                        '--single-branch', '--depth=1', instance_path])
-        process.wait(MAX_SUBPROCESS_WAIT)
-
-        # Allow Boulder to ignore usual limit rate policies, useful for tests.
-        os.rename(join(instance_path, 'test/rate-limit-policies-b.yml'),
-                  join(instance_path, 'test/rate-limit-policies.yml'))
-
-        if self._dns_server:
-            # Change Boulder config to use the provided DNS server
-            for suffix in ["", "-remote-a", "-remote-b"]:
-                with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'r') as f:
-                    config = json.loads(f.read())
-                config['va']['dnsResolvers'] = [self._dns_server]
-                with open(join(instance_path, 'test/config/va{}.json'.format(suffix)), 'w') as f:
-                    f.write(json.dumps(config, indent=2, separators=(',', ': ')))
-
-        # This command needs to be run before we try and terminate running processes because
-        # docker-compose up doesn't always respond to SIGTERM. See
-        # https://github.com/certbot/certbot/pull/9435.
-        self._register_preterminate_cmd(['docker-compose', 'down'], cwd=instance_path)
-        # Boulder docker generates build artifacts owned by root with 0o744 permissions.
-        # If we started the acme server from a normal user that has access to the Docker
-        # daemon, this user will not be able to delete these artifacts from the host.
-        # We need to do it through a docker.
-        self._register_preterminate_cmd(['docker', 'run', '--rm', '-v',
-                                         '{0}:/workspace'.format(self._workspace), 'alpine', 'rm',
-                                         '-rf', '/workspace/boulder'])
-        try:
-            # Launch the Boulder server
-            self._launch_process(['docker-compose', 'up', '--force-recreate'], cwd=instance_path)
-
-            # Wait for the ACME CA server to be up.
-            print('=> Waiting for boulder instance to respond...')
-            misc.check_until_timeout(
-                self.acme_xdist['directory_url'], attempts=300)
-
-            if not self._dns_server:
-                # Configure challtestsrv to answer any A record request with ip of the docker host.
-                response = requests.post(
-                    f'{BOULDER_V2_CHALLTESTSRV_URL}/set-default-ipv4',
-                    json={'ip': '10.77.77.1'},
-                    timeout=10
-                )
-                response.raise_for_status()
-        except BaseException:
-            # If we failed to set up boulder, print its logs.
-            print('=> Boulder setup failed. Boulder logs are:')
-            process = self._launch_process([
-                'docker-compose', 'logs'], cwd=instance_path, force_stderr=True
-            )
-            process.wait(MAX_SUBPROCESS_WAIT)
-            raise
-
-        print('=> Finished boulder instance deployment.')
-
     def _prepare_http_proxy(self) -> None:
         """Configure and launch an HTTP proxy"""
         print(f'=> Configuring the HTTP proxy on port {self._http_01_port}...')

@@ -260,26 +177,11 @@ class ACMEServer:
         self._processes.append(process)
         return process

-    def _register_preterminate_cmd(self, *args: Any, **kwargs: Any) -> None:
-        self._preterminate_cmds_args.append((args, kwargs))
-
-    def _run_preterminate_cmds(self) -> None:
-        for args, kwargs in self._preterminate_cmds_args:
-            process = self._launch_process(*args, **kwargs)
-            process.wait(MAX_SUBPROCESS_WAIT)
-        # It's unlikely to matter, but let's clear the list of cleanup commands
-        # once they've been run.
-        self._preterminate_cmds_args.clear()
-

 def main() -> None:
     # pylint: disable=missing-function-docstring
     parser = argparse.ArgumentParser(
-        description='CLI tool to start a local instance of Pebble or Boulder CA server.')
-    parser.add_argument('--server-type', '-s',
-                        choices=['pebble', 'boulder-v2'], default='pebble',
-                        help='type of CA server to start: can be Pebble or Boulder. '
-                             'Pebble is used if not set.')
+        description='CLI tool to start a local instance of Pebble CA server.')
     parser.add_argument('--dns-server', '-d',
                         help='specify the DNS server as `IP:PORT` to use by '
                              'Pebble; if not specified, a local mock DNS server will be used to '

@@ -290,8 +192,8 @@ def main() -> None:
     args = parser.parse_args()

     acme_server = ACMEServer(
-        args.server_type, [], http_proxy=False, stdout=True,
-        dns_server=args.dns_server, http_01_port=args.http_01_port,
+        [], http_proxy=False, stdout=True, dns_server=args.dns_server,
+        http_01_port=args.http_01_port,
     )

     try:

@@ -6,11 +6,8 @@ import subprocess
 import sys
 from typing import Dict
 from typing import List
-from typing import Mapping
 from typing import Tuple

-import pkg_resources
-
 import certbot_integration_tests
 # pylint: disable=wildcard-import,unused-wildcard-import
 from certbot_integration_tests.utils.constants import *

@@ -84,29 +81,14 @@ def _prepare_environ(workspace: str) -> Dict[str, str]:
     return new_environ


-def _compute_additional_args(workspace: str, environ: Mapping[str, str],
-                             force_renew: bool) -> List[str]:
-    additional_args = []
-    output = subprocess.check_output(['certbot', '--version'],
-                                     universal_newlines=True, stderr=subprocess.STDOUT,
-                                     cwd=workspace, env=environ)
-    # Typical response is: output = 'certbot 0.31.0.dev0'
-    version_str = output.split(' ')[1].strip()
-    if pkg_resources.parse_version(version_str) >= pkg_resources.parse_version('0.30.0'):
-        additional_args.append('--no-random-sleep-on-renew')
-
-    if force_renew:
-        additional_args.append('--renew-by-default')
-
-    return additional_args
-
-
 def _prepare_args_env(certbot_args: List[str], directory_url: str, http_01_port: int,
                       tls_alpn_01_port: int, config_dir: str, workspace: str,
                       force_renew: bool) -> Tuple[List[str], Dict[str, str]]:

     new_environ = _prepare_environ(workspace)
-    additional_args = _compute_additional_args(workspace, new_environ, force_renew)
+    additional_args = ['--no-random-sleep-on-renew']
+    if force_renew:
+        additional_args.append('--renew-by-default')

     command = [
         'certbot',

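The deleted helper shelled out to `certbot --version` and gated `--no-random-sleep-on-renew` on a `pkg_resources` version comparison; since every supported certbot is far newer than 0.30.0, the flag is now unconditional. For reference, a stdlib-only sketch of the old check (real code should prefer `packaging.version` over hand parsing):

```python
def at_least(version_str: str, minimum: tuple) -> bool:
    # crude numeric compare; drops pre-release suffixes such as '.dev0'
    parts = tuple(int(p) for p in version_str.split('.') if p.isdigit())
    return parts >= minimum

output = 'certbot 0.31.0.dev0'  # typical `certbot --version` output
version_str = output.split(' ')[1].strip()
print(at_least(version_str, (0, 30, 0)))
```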
@@ -114,7 +96,6 @@ def _prepare_args_env(certbot_args: List[str], directory_url: str, http_01_port:
         '--no-verify-ssl',
         '--http-01-port', str(http_01_port),
         '--https-port', str(tls_alpn_01_port),
-        '--manual-public-ip-logging-ok',
         '--config-dir', config_dir,
         '--work-dir', os.path.join(workspace, 'work'),
         '--logs-dir', os.path.join(workspace, 'logs'),

@@ -1,10 +1,7 @@
 """Some useful constants to use throughout certbot-ci integration tests"""
 DEFAULT_HTTP_01_PORT = 5002
-BOULDER_HTTP_01_PORT = 80
 TLS_ALPN_01_PORT = 5001
 CHALLTESTSRV_PORT = 8055
-BOULDER_V2_CHALLTESTSRV_URL = f'http://10.77.77.77:{CHALLTESTSRV_PORT}'
-BOULDER_V2_DIRECTORY_URL = 'http://localhost:4001/directory'
 PEBBLE_DIRECTORY_URL = 'https://localhost:14000/dir'
 PEBBLE_MANAGEMENT_URL = 'https://localhost:15000'
 PEBBLE_CHALLTESTSRV_URL = f'http://localhost:{CHALLTESTSRV_PORT}'

@@ -15,11 +15,14 @@ from typing import List
 from typing import Optional
 from typing import Type

-from pkg_resources import resource_filename
-
 from certbot_integration_tests.utils import constants

-BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.16"
+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources
+
+BIND_DOCKER_IMAGE = "internetsystemsconsortium/bind9:9.20"
 BIND_BIND_ADDRESS = ("127.0.0.1", 45953)

 # A TCP DNS message which is a query for '. CH A' transaction ID 0xcb37. This is used

@@ -80,13 +83,12 @@ class DNSServer:

     def _configure_bind(self) -> None:
         """Configure the BIND9 server based on the prebaked configuration"""
-        bind_conf_src = resource_filename(
-            "certbot_integration_tests", "assets/bind-config"
-        )
-        for directory in ("conf", "zones"):
-            shutil.copytree(
-                os.path.join(bind_conf_src, directory), os.path.join(self.bind_root, directory)
-            )
+        ref = importlib_resources.files("certbot_integration_tests") / "assets" / "bind-config"
+        with importlib_resources.as_file(ref) as path:
+            for directory in ("conf", "zones"):
+                shutil.copytree(
+                    os.path.join(path, directory), os.path.join(self.bind_root, directory)
+                )

     def _start_bind(self) -> None:
         """Launch the BIND9 server as a Docker container"""

@@ -2,6 +2,7 @@
 Misc module contains stateless functions that could be used during pytest execution,
 or outside during setup/teardown of the integration tests environment.
 """
+import atexit
 import contextlib
 import errno
 import functools

@@ -30,27 +31,23 @@ from cryptography.hazmat.primitives.serialization import PrivateFormat
 from cryptography.x509 import Certificate
 from cryptography.x509 import load_pem_x509_certificate
 from OpenSSL import crypto
-import pkg_resources
 import requests

-from certbot_integration_tests.certbot_tests.context import IntegrationTestsContext
 from certbot_integration_tests.utils.constants import PEBBLE_ALTERNATE_ROOTS
 from certbot_integration_tests.utils.constants import PEBBLE_MANAGEMENT_URL

+if sys.version_info >= (3, 9):  # pragma: no cover
+    import importlib.resources as importlib_resources
+else:  # pragma: no cover
+    import importlib_resources
+
 RSA_KEY_TYPE = 'rsa'
 ECDSA_KEY_TYPE = 'ecdsa'


 def _suppress_x509_verification_warnings() -> None:
-    try:
-        import urllib3
-        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
-    except ImportError:
-        # Handle old versions of requests with vendorized urllib3
-        # pylint: disable=no-member
-        from requests.packages.urllib3.exceptions import InsecureRequestWarning
-        requests.packages.urllib3.disable_warnings(  # type: ignore[attr-defined]
-            InsecureRequestWarning)
+    import urllib3
+    urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)


 def check_until_timeout(url: str, attempts: int = 30) -> None:

@@ -125,7 +122,11 @@ def generate_test_file_hooks(config_dir: str, hook_probe: str) -> None:
     :param str config_dir: current certbot config directory
     :param str hook_probe: path to the hook probe to test hook scripts execution
     """
-    hook_path = pkg_resources.resource_filename('certbot_integration_tests', 'assets/hook.py')
+    file_manager = contextlib.ExitStack()
+    atexit.register(file_manager.close)
+    hook_path_ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
+                     .joinpath('hook.py'))
+    hook_path = str(file_manager.enter_context(importlib_resources.as_file(hook_path_ref)))

     for hook_dir in list_renewal_hooks_dirs(config_dir):
         # We want an equivalent of bash `mkdir -p $HOOK_DIR`, that does not fail if one folder of

@@ -232,7 +233,7 @@ def generate_csr(domains: Iterable[str], key_path: str, csr_path: str,
     req.add_extensions([san_constraint])

     req.set_pubkey(key)
-    req.set_version(2)
+    req.set_version(0)
     req.sign(key, 'sha256')

     with open(csr_path, 'wb') as file_h:

@@ -260,9 +261,11 @@ def load_sample_data_path(workspace: str) -> str:
     :returns: the path to the loaded sample data directory
     :rtype: str
     """
-    original = pkg_resources.resource_filename('certbot_integration_tests', 'assets/sample-config')
-    copied = os.path.join(workspace, 'sample-config')
-    shutil.copytree(original, copied, symlinks=True)
+    original_ref = (importlib_resources.files('certbot_integration_tests').joinpath('assets')
+                    .joinpath('sample-config'))
+    with importlib_resources.as_file(original_ref) as original:
+        copied = os.path.join(workspace, 'sample-config')
+        shutil.copytree(original, copied, symlinks=True)

     if os.name == 'nt':
         # Fix the symlinks on Windows if GIT is not configured to create them upon checkout

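`load_sample_data_path` switches to the scoped form of `as_file()`: when the materialized path is only needed for the duration of the copy, a `with` block cleans up any temporary file automatically, unlike the atexit-managed variant used elsewhere for long-lived paths. A stdlib-only sketch with a stand-in resource:

```python
import os
import shutil
import tempfile
from importlib import resources  # stdlib form of the API on Python >= 3.9

dest = tempfile.mkdtemp()
ref = resources.files('json').joinpath('__init__.py')  # stand-in resource

# the path yielded by as_file() is only guaranteed valid inside the block,
# which is fine because the copy finishes before the block exits
with resources.as_file(ref) as src:
    shutil.copy(src, os.path.join(dest, 'copied.py'))

print(os.path.exists(os.path.join(dest, 'copied.py')))
```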
@@ -299,16 +302,12 @@ def echo(keyword: str, path: Optional[str] = None) -> str:
         os.path.basename(sys.executable), keyword, ' >> "{0}"'.format(path) if path else '')


-def get_acme_issuers(context: IntegrationTestsContext) -> List[Certificate]:
+def get_acme_issuers() -> List[Certificate]:
     """Gets the list of one or more issuer certificates from the ACME server used by the
     context.
-    :param context: the testing context.
     :return: the `list of x509.Certificate` representing the list of issuers.
     """
-    # TODO: in fact, Boulder has alternate chains in config-next/, just not yet in config/.
-    if context.acme_server != "pebble":
-        raise NotImplementedError()
-
     _suppress_x509_verification_warnings()

     issuers = []

@@ -1,54 +1,76 @@
# pylint: disable=missing-module-docstring

import atexit
import io
import json
import os
import stat
from typing import Tuple
import sys
import zipfile
from contextlib import ExitStack
from typing import Optional, Tuple

import pkg_resources
import requests

from certbot_integration_tests.utils.constants import DEFAULT_HTTP_01_PORT
from certbot_integration_tests.utils.constants import MOCK_OCSP_SERVER_PORT

PEBBLE_VERSION = 'v2.3.1'
ASSETS_PATH = pkg_resources.resource_filename('certbot_integration_tests', 'assets')
if sys.version_info >= (3, 9):  # pragma: no cover
    import importlib.resources as importlib_resources
else:  # pragma: no cover
    import importlib_resources

PEBBLE_VERSION = 'v2.5.1'


def fetch(workspace: str, http_01_port: int = DEFAULT_HTTP_01_PORT) -> Tuple[str, str, str]:
    # pylint: disable=missing-function-docstring
    suffix = 'linux-amd64' if os.name != 'nt' else 'windows-amd64.exe'
    file_manager = ExitStack()
    atexit.register(file_manager.close)
    pebble_path_ref = importlib_resources.files('certbot_integration_tests') / 'assets'
    assets_path = str(file_manager.enter_context(importlib_resources.as_file(pebble_path_ref)))

    pebble_path = _fetch_asset('pebble', suffix)
    challtestsrv_path = _fetch_asset('pebble-challtestsrv', suffix)
    pebble_config_path = _build_pebble_config(workspace, http_01_port)
    pebble_path = _fetch_asset('pebble', assets_path)
    challtestsrv_path = _fetch_asset('pebble-challtestsrv', assets_path)
    pebble_config_path = _build_pebble_config(workspace, http_01_port, assets_path)

    return pebble_path, challtestsrv_path, pebble_config_path


def _fetch_asset(asset: str, suffix: str) -> str:
    asset_path = os.path.join(ASSETS_PATH, '{0}_{1}_{2}'.format(asset, PEBBLE_VERSION, suffix))
def _fetch_asset(asset: str, assets_path: str) -> str:
    platform = 'linux-amd64'
    base_url = 'https://github.com/letsencrypt/pebble/releases/download'
    asset_path = os.path.join(assets_path, f'{asset}_{PEBBLE_VERSION}_{platform}')
    if not os.path.exists(asset_path):
        asset_url = ('https://github.com/letsencrypt/pebble/releases/download/{0}/{1}_{2}'
                     .format(PEBBLE_VERSION, asset, suffix))
        asset_url = f'{base_url}/{PEBBLE_VERSION}/{asset}-{platform}.zip'
        response = requests.get(asset_url, timeout=30)
        response.raise_for_status()
        asset_data = _unzip_asset(response.content, asset)
        if asset_data is None:
            raise ValueError(f"zipfile {asset_url} didn't contain file {asset}")
        with open(asset_path, 'wb') as file_h:
            file_h.write(response.content)
            file_h.write(asset_data)
        os.chmod(asset_path, os.stat(asset_path).st_mode | stat.S_IEXEC)

    return asset_path

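The `os.chmod(... | stat.S_IEXEC)` line above ORs the owner-execute bit into the file's existing mode rather than overwriting it, so the downloaded binary becomes runnable without disturbing the other permission bits. A standalone sketch on a temporary file (POSIX mode-bit semantics assumed):

```python
import os
import stat
import tempfile

# Write a placeholder "binary" to disk.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'#!/bin/sh\necho ok\n')
path = tmp.name

# Add the owner-execute bit while preserving every other mode bit.
os.chmod(path, os.stat(path).st_mode | stat.S_IEXEC)
is_exec = bool(os.stat(path).st_mode & stat.S_IEXEC)
os.unlink(path)
```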
def _build_pebble_config(workspace: str, http_01_port: int) -> str:
def _unzip_asset(zipped_data: bytes, asset_name: str) -> Optional[bytes]:
    with zipfile.ZipFile(io.BytesIO(zipped_data)) as zip_file:
        for entry in zip_file.filelist:
            if not entry.is_dir() and entry.filename.endswith(asset_name):
                return zip_file.read(entry)
    return None

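The new `_unzip_asset` helper can be exercised without any download by feeding it an in-memory archive; this sketch reproduces its matching logic (return the first non-directory entry whose name ends with the asset name, else `None`):

```python
import io
import zipfile
from typing import Optional


def unzip_asset(zipped_data: bytes, asset_name: str) -> Optional[bytes]:
    # Same shape as the diff's _unzip_asset: scan entries, skip directories,
    # match on the filename suffix.
    with zipfile.ZipFile(io.BytesIO(zipped_data)) as zip_file:
        for entry in zip_file.filelist:
            if not entry.is_dir() and entry.filename.endswith(asset_name):
                return zip_file.read(entry)
    return None


# Build a small archive in memory to exercise the helper.
buf = io.BytesIO()
with zipfile.ZipFile(buf, 'w') as zf:
    zf.writestr('pebble/pebble', b'fake-binary')

data = unzip_asset(buf.getvalue(), 'pebble')
missing = unzip_asset(buf.getvalue(), 'challtestsrv')
```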
def _build_pebble_config(workspace: str, http_01_port: int, assets_path: str) -> str:
    config_path = os.path.join(workspace, 'pebble-config.json')
    with open(config_path, 'w') as file_h:
        file_h.write(json.dumps({
            'pebble': {
                'listenAddress': '0.0.0.0:14000',
                'managementListenAddress': '0.0.0.0:15000',
                'certificate': os.path.join(ASSETS_PATH, 'cert.pem'),
                'privateKey': os.path.join(ASSETS_PATH, 'key.pem'),
                'certificate': os.path.join(assets_path, 'cert.pem'),
                'privateKey': os.path.join(assets_path, 'key.pem'),
                'httpPort': http_01_port,
                'tlsPort': 5001,
                'ocspResponderURL': 'http://127.0.0.1:{0}'.format(MOCK_OCSP_SERVER_PORT),

@@ -5,6 +5,7 @@ to serve a mock OCSP responder during integration tests against Pebble.
"""
import datetime
import http.server as BaseHTTPServer
import pytz
import re
from typing import cast
from typing import Union
@@ -37,7 +38,9 @@ class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
                               verify=False, timeout=10)
        issuer_cert = x509.load_pem_x509_certificate(request.content, default_backend())

        content_len = int(self.headers.get('Content-Length'))
        raw_content_len = self.headers.get('Content-Length')
        assert isinstance(raw_content_len, str)
        content_len = int(raw_content_len)

        ocsp_request = ocsp.load_der_ocsp_request(self.rfile.read(content_len))
        response = requests.get('{0}/cert-status-by-serial/{1}'.format(
@@ -52,7 +55,7 @@ class _ProxyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
        else:
            data = response.json()

            now = datetime.datetime.utcnow()
            now = datetime.datetime.now(pytz.UTC)
            cert = x509.load_pem_x509_certificate(data['Certificate'].encode(), default_backend())
            if data['Status'] != 'Revoked':
                ocsp_status = ocsp.OCSPCertStatus.GOOD
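`datetime.datetime.utcnow()` returns a naive datetime (no `tzinfo`), which cannot be compared or mixed with the timezone-aware timestamps used elsewhere in the OCSP response; `datetime.datetime.now(pytz.UTC)` is aware. A sketch of the distinction using only the stdlib `timezone.utc`, which behaves like `pytz.UTC` for this purpose:

```python
import datetime

naive = datetime.datetime.utcnow()                    # tzinfo is None
aware = datetime.datetime.now(datetime.timezone.utc)  # tzinfo is UTC

# Comparing naive and aware datetimes raises TypeError; an aware "now"
# makes the zone explicit in fields like the OCSP response's this_update.
```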
@@ -1,20 +1,12 @@
from pkg_resources import parse_version
from setuptools import __version__ as setuptools_version
from setuptools import find_packages
from setuptools import setup

version = '0.32.0.dev0'

# setuptools 36.2+ is needed for support for environment markers
min_setuptools_version = '36.2'
# This conditional isn't necessary, but it provides better error messages to
# people who try to install this package with older versions of setuptools.
if parse_version(setuptools_version) < parse_version(min_setuptools_version):
    raise RuntimeError(f'setuptools {min_setuptools_version}+ is required')
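The guard being removed here relied on `parse_version` giving real version ordering rather than lexicographic string ordering. A short sketch of why that matters (assuming setuptools is installed, since `pkg_resources` ships with it):

```python
from pkg_resources import parse_version  # provided by setuptools

min_setuptools_version = '36.2'

# Version-aware comparison orders 9.0 < 36.2 < 41.6.0 correctly.
ok = parse_version('41.6.0') >= parse_version(min_setuptools_version)
old = parse_version('9.0') < parse_version(min_setuptools_version)

# Naive string comparison gets this wrong: '9.0' > '36.2' lexically.
string_cmp_wrong = '9.0' > '36.2'
```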
|
||||
install_requires = [
    'coverage',
    'cryptography',
    'importlib_resources>=1.3.1; python_version < "3.9"',
    'pyopenssl',
    'pytest',
    'pytest-cov',
@@ -26,9 +18,12 @@ install_requires = [
    # installation on Linux.
    'pywin32>=300 ; sys_platform == "win32"',
    'pyyaml',
    'requests',
    'pytz>=2019.3',
    # requests unvendored its dependencies in version 2.16.0 and this code relies on that for
    # calling `urllib3.disable_warnings`.
    'requests>=2.16.0',
    'setuptools',
    'types-python-dateutil'
    'types-python-dateutil',
]

setup(
@@ -39,18 +34,18 @@ setup(
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    license='Apache License 2.0',
    python_requires='>=3.7',
    python_requires='>=3.8',
    classifiers=[
        'Development Status :: 3 - Alpha',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
    ],
@@ -1,9 +1,10 @@
FROM debian:buster
MAINTAINER Brad Warren <bmw@eff.org>
FROM docker.io/python:3.8-buster
LABEL maintainer="Brad Warren <bmw@eff.org>"

# This does not include the dependencies needed to build cryptography. See
# https://cryptography.io/en/latest/installation/#building-cryptography-on-linux
RUN apt-get update && \
    apt install python3-dev python3-venv gcc libaugeas0 libssl-dev \
        libffi-dev ca-certificates openssl -y
    apt install python3-venv libaugeas0 -y

WORKDIR /opt/certbot/src

@@ -75,7 +75,7 @@ def _get_server_root(config: str) -> str:
               if os.path.isdir(os.path.join(config, name))]

    if len(subdirs) != 1:
        errors.Error("Malformed configuration directory {0}".format(config))
        raise errors.Error("Malformed configuration directory {0}".format(config))

    return os.path.join(config, subdirs[0].rstrip())


Binary file not shown.
@@ -19,7 +19,6 @@

server {
    listen 80 default_server;
    listen [::]:80 default_server ipv6only=on;

    root /usr/share/nginx/html;
    index index.html index.htm;
@@ -1,7 +1,7 @@
from setuptools import find_packages
from setuptools import setup

version = '2.6.0.dev0'
version = '2.12.0.dev0'

install_requires = [
    'certbot',
@@ -14,22 +14,22 @@ setup(
    name='certbot-compatibility-test',
    version=version,
    description="Compatibility tests for Certbot",
    url='https://github.com/letsencrypt/letsencrypt',
    url='https://github.com/certbot/certbot',
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    license='Apache License 2.0',
    python_requires='>=3.7',
    python_requires='>=3.8',
    classifiers=[
        'Development Status :: 3 - Alpha',
        'Intended Audience :: Developers',
        'License :: OSI Approved :: Apache Software License',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
    ],
certbot-dns-cloudflare/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
    # You can also specify other tool versions:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: certbot-dns-cloudflare/docs/conf.py
  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
  # builder: "dirhtml"
  # Fail on all warnings to avoid broken references
  fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
formats:
  - pdf
  - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: certbot-dns-cloudflare/readthedocs.org.requirements.txt
@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os

# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.intersphinx',
              'sphinx.ext.todo',
              'sphinx.ext.coverage',
              'sphinx.ext.viewcode']
              'sphinx.ext.viewcode',
              'sphinx_rtd_theme']

autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#

# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd:  # only import and set the theme if we're building docs locally
    import sphinx_rtd_theme
    html_theme = 'sphinx_rtd_theme'
    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -4,14 +4,18 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.6.0.dev0'
version = '2.12.0.dev0'

install_requires = [
    'cloudflare>=1.5.1',
    # for now, do not upgrade to cloudflare>=2.20 to avoid deprecation warnings and the breaking
    # changes in version 3.0. see https://github.com/certbot/certbot/issues/9938
    'cloudflare>=1.5.1, <2.20',
    'setuptools>=41.6.0',
]

if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')
else:
    install_requires.extend([
        # We specify the minimum acme and certbot version as the current plugin
        # version for simplicity. See
@@ -19,11 +23,6 @@ if not os.environ.get('SNAP_BUILD'):
        f'acme>={version}',
        f'certbot>={version}',
    ])
elif 'bdist_wheel' in sys.argv[1:]:
    raise RuntimeError('Unset SNAP_BUILD when building wheels '
                       'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')

docs_extras = [
    'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +41,7 @@ setup(
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    license='Apache License 2.0',
    python_requires='>=3.7',
    python_requires='>=3.8',
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Environment :: Plugins',
@@ -51,11 +50,11 @@ setup(
        'Operating System :: POSIX :: Linux',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
        'Topic :: System :: Installation/Setup',
certbot-dns-digitalocean/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
    # You can also specify other tool versions:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: certbot-dns-digitalocean/docs/conf.py
  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
  # builder: "dirhtml"
  # Fail on all warnings to avoid broken references
  fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
formats:
  - pdf
  - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: certbot-dns-digitalocean/readthedocs.org.requirements.txt
@@ -2,6 +2,7 @@
import logging
from typing import Any
from typing import Callable
from typing import cast
from typing import Optional

import digitalocean
@@ -56,7 +57,7 @@ class Authenticator(dns_common.DNSAuthenticator):
    def _get_digitalocean_client(self) -> "_DigitalOceanClient":
        if not self.credentials:  # pragma: no cover
            raise errors.Error("Plugin has not been prepared.")
        return _DigitalOceanClient(self.credentials.conf('token'))
        return _DigitalOceanClient(cast(str, self.credentials.conf('token')))


class _DigitalOceanClient:
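`typing.cast` changes nothing at runtime: it returns its argument as-is and only informs the static type checker that `conf('token')` yields a `str` at this call site. A sketch with a hypothetical `conf` helper standing in for `credentials.conf`:

```python
from typing import Optional, cast


def conf(key: str) -> Optional[str]:
    # Hypothetical stand-in for self.credentials.conf(): declared Optional,
    # which is why mypy would reject passing its result where str is required.
    return {'token': 'abc123'}.get(key)


# cast() is a no-op for the interpreter and an assertion for the type
# checker: no conversion, no runtime check.
token: str = cast(str, conf('token'))
```

The `if not self.credentials` guard above the call is what makes the assertion safe in practice; `cast` itself would happily pass `None` through.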
@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os

# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.intersphinx',
              'sphinx.ext.todo',
              'sphinx.ext.coverage',
              'sphinx.ext.viewcode']
              'sphinx.ext.viewcode',
              'sphinx_rtd_theme']

autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#

# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd:  # only import and set the theme if we're building docs locally
    import sphinx_rtd_theme
    html_theme = 'sphinx_rtd_theme'
    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -4,14 +4,16 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.6.0.dev0'
version = '2.12.0.dev0'

install_requires = [
    'python-digitalocean>=1.11',  # 1.15.0 or newer is recommended for TTL support
    'setuptools>=41.6.0',
]

if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')
else:
    install_requires.extend([
        # We specify the minimum acme and certbot version as the current plugin
        # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
        f'acme>={version}',
        f'certbot>={version}',
    ])
elif 'bdist_wheel' in sys.argv[1:]:
    raise RuntimeError('Unset SNAP_BUILD when building wheels '
                       'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')

docs_extras = [
    'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    license='Apache License 2.0',
    python_requires='>=3.7',
    python_requires='>=3.8',
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
        'Operating System :: POSIX :: Linux',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
        'Topic :: System :: Installation/Setup',
certbot-dns-dnsimple/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
    # You can also specify other tool versions:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: certbot-dns-dnsimple/docs/conf.py
  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
  # builder: "dirhtml"
  # Fail on all warnings to avoid broken references
  fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
formats:
  - pdf
  - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: certbot-dns-dnsimple/readthedocs.org.requirements.txt
@@ -2,33 +2,30 @@
import logging
from typing import Any
from typing import Callable
from typing import Optional

from lexicon.providers import dnsimple
from requests import HTTPError

from certbot import errors
from certbot.plugins import dns_common
from certbot.plugins import dns_common_lexicon
from certbot.plugins.dns_common import CredentialsConfiguration

logger = logging.getLogger(__name__)

ACCOUNT_URL = 'https://dnsimple.com/user'


class Authenticator(dns_common.DNSAuthenticator):
class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
    """DNS Authenticator for DNSimple

    This Authenticator uses the DNSimple v2 API to fulfill a dns-01 challenge.
    """

    description = 'Obtain certificates using a DNS TXT record (if you are using DNSimple for DNS).'
    ttl = 60

    def __init__(self, *args: Any, **kwargs: Any) -> None:
        super().__init__(*args, **kwargs)
        self.credentials: Optional[CredentialsConfiguration] = None
        self._add_provider_option('token',
                                  f'User access token for DNSimple v2 API. (See {ACCOUNT_URL}.)',
                                  'auth_token')

    @classmethod
    def add_parser_arguments(cls, add: Callable[..., None],
@@ -40,42 +37,9 @@ class Authenticator(dns_common.DNSAuthenticator):
        return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
               'the DNSimple API.'

    def _setup_credentials(self) -> None:
        self.credentials = self._configure_credentials(
            'credentials',
            'DNSimple credentials INI file',
            {
                'token': 'User access token for DNSimple v2 API. (See {0}.)'.format(ACCOUNT_URL)
            }
        )

    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
        self._get_dnsimple_client().add_txt_record(domain, validation_name, validation)

    def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
        self._get_dnsimple_client().del_txt_record(domain, validation_name, validation)

    def _get_dnsimple_client(self) -> "_DNSimpleLexiconClient":
        if not self.credentials:  # pragma: no cover
            raise errors.Error("Plugin has not been prepared.")
        return _DNSimpleLexiconClient(self.credentials.conf('token'), self.ttl)


class _DNSimpleLexiconClient(dns_common_lexicon.LexiconClient):
    """
    Encapsulates all communication with the DNSimple via Lexicon.
    """

    def __init__(self, token: str, ttl: int) -> None:
        super().__init__()

        config = dns_common_lexicon.build_lexicon_config('dnsimple', {
            'ttl': ttl,
        }, {
            'auth_token': token,
        })

        self.provider = dnsimple.Provider(config)
    @property
    def _provider_name(self) -> str:
        return 'dnsimple'

    def _handle_http_error(self, e: HTTPError, domain_name: str) -> errors.PluginError:
        hint = None
@@ -1,10 +1,9 @@
"""Tests for certbot_dns_dnsimple._internal.dns_dnsimple."""

import sys
import unittest
from unittest import mock
import sys

import pytest
from requests import Response
from requests.exceptions import HTTPError

from certbot.compat import os
@@ -16,7 +15,9 @@ TOKEN = 'foo'


class AuthenticatorTest(test_util.TempDirTestCase,
                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):

    LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: ...', response=Response())

    def setUp(self):
        super().setUp()
@@ -31,23 +32,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,

        self.auth = Authenticator(self.config, "dnsimple")

        self.mock_client = mock.MagicMock()
        # _get_dnsimple_client | pylint: disable=protected-access
        self.auth._get_dnsimple_client = mock.MagicMock(return_value=self.mock_client)


class DNSimpleLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):

    LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: ...')

    def setUp(self):
        from certbot_dns_dnsimple._internal.dns_dnsimple import _DNSimpleLexiconClient

        self.client = _DNSimpleLexiconClient(TOKEN, 0)

        self.provider_mock = mock.MagicMock()
        self.client.provider = self.provider_mock


if __name__ == "__main__":
    sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -16,7 +16,7 @@
# add these directories to sys.path here. If the directory is relative to the
# documentation root, use os.path.abspath to make it absolute, like shown here.
#
import os
# import os

# import sys
# sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
              'sphinx.ext.intersphinx',
              'sphinx.ext.todo',
              'sphinx.ext.coverage',
              'sphinx.ext.viewcode']
              'sphinx.ext.viewcode',
              'sphinx_rtd_theme']

autodoc_member_order = 'bysource'
autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
# a list of builtin themes.
#

# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
# on_rtd is whether we are on readthedocs.org
on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
if not on_rtd:  # only import and set the theme if we're building docs locally
    import sphinx_rtd_theme
    html_theme = 'sphinx_rtd_theme'
    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
# otherwise, readthedocs.org uses their theme by default, so no need to specify it
html_theme = 'sphinx_rtd_theme'

# Theme options are theme-specific and customize the look and feel of a theme
# further. For a list of options available for each theme, see the
@@ -4,16 +4,18 @@ import sys
from setuptools import find_packages
from setuptools import setup

version = '2.6.0.dev0'
version = '2.12.0.dev0'

install_requires = [
    # This version of lexicon is required to address the problem described in
    # https://github.com/AnalogJ/lexicon/issues/387.
    'dns-lexicon>=3.2.1',
    'dns-lexicon>=3.14.1',
    'setuptools>=41.6.0',
]

if not os.environ.get('SNAP_BUILD'):
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')
else:
    install_requires.extend([
        # We specify the minimum acme and certbot version as the current plugin
        # version for simplicity. See
@@ -21,11 +23,6 @@ if not os.environ.get('SNAP_BUILD'):
        f'acme>={version}',
        f'certbot>={version}',
    ])
elif 'bdist_wheel' in sys.argv[1:]:
    raise RuntimeError('Unset SNAP_BUILD when building wheels '
                       'to include certbot dependencies.')
if os.environ.get('SNAP_BUILD'):
    install_requires.append('packaging')

docs_extras = [
    'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -44,7 +41,7 @@ setup(
    author="Certbot Project",
    author_email='certbot-dev@eff.org',
    license='Apache License 2.0',
    python_requires='>=3.7',
    python_requires='>=3.8',
    classifiers=[
        'Development Status :: 5 - Production/Stable',
        'Environment :: Plugins',
@@ -53,11 +50,11 @@ setup(
        'Operating System :: POSIX :: Linux',
        'Programming Language :: Python',
        'Programming Language :: Python :: 3',
        'Programming Language :: Python :: 3.7',
        'Programming Language :: Python :: 3.8',
        'Programming Language :: Python :: 3.9',
        'Programming Language :: Python :: 3.10',
        'Programming Language :: Python :: 3.11',
        'Programming Language :: Python :: 3.12',
        'Topic :: Internet :: WWW/HTTP',
        'Topic :: Security',
        'Topic :: System :: Installation/Setup',
certbot-dns-dnsmadeeasy/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
# Read the Docs configuration file for Sphinx projects
# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details

# Required
version: 2

# Set the OS, Python version and other tools you might need
build:
  os: ubuntu-22.04
  tools:
    python: "3.11"
    # You can also specify other tool versions:

# Build documentation in the "docs/" directory with Sphinx
sphinx:
  configuration: certbot-dns-dnsmadeeasy/docs/conf.py
  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
  # builder: "dirhtml"
  # Fail on all warnings to avoid broken references
  fail_on_warning: true

# Optionally build your docs in additional formats such as PDF and ePub
formats:
  - pdf
  - epub

# Optional but recommended, declare the Python requirements required
# to build your documentation
# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
python:
  install:
    - requirements: certbot-dns-dnsmadeeasy/readthedocs.org.requirements.txt
@@ -4,20 +4,17 @@ from typing import Any
 from typing import Callable
 from typing import Optional
 
-from lexicon.providers import dnsmadeeasy
-from requests import HTTPError
 
 from certbot import errors
 from certbot.plugins import dns_common
 from certbot.plugins import dns_common_lexicon
-from certbot.plugins.dns_common import CredentialsConfiguration
 
 logger = logging.getLogger(__name__)
 
 ACCOUNT_URL = 'https://cp.dnsmadeeasy.com/account/info'
 
 
-class Authenticator(dns_common.DNSAuthenticator):
+class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
     """DNS Authenticator for DNS Made Easy
 
     This Authenticator uses the DNS Made Easy API to fulfill a dns-01 challenge.
@@ -25,11 +22,17 @@ class Authenticator(dns_common.DNSAuthenticator):
 
     description = ('Obtain certificates using a DNS TXT record (if you are using DNS Made Easy for '
                    'DNS).')
     ttl = 60
 
     def __init__(self, *args: Any, **kwargs: Any) -> None:
         super().__init__(*args, **kwargs)
-        self.credentials: Optional[CredentialsConfiguration] = None
+        self._add_provider_option('api-key',
+                                  'API key for DNS Made Easy account, '
+                                  f'obtained from {ACCOUNT_URL}',
+                                  'auth_username')
+        self._add_provider_option('secret-key',
+                                  'Secret key for DNS Made Easy account, '
+                                  f'obtained from {ACCOUNT_URL}',
+                                  'auth_token')
 
     @classmethod
     def add_parser_arguments(cls, add: Callable[..., None],
@@ -41,48 +44,9 @@ class Authenticator(dns_common.DNSAuthenticator):
         return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
                'the DNS Made Easy API.'
 
-    def _setup_credentials(self) -> None:
-        self.credentials = self._configure_credentials(
-            'credentials',
-            'DNS Made Easy credentials INI file',
-            {
-                'api-key': 'API key for DNS Made Easy account, obtained from {0}'
-                           .format(ACCOUNT_URL),
-                'secret-key': 'Secret key for DNS Made Easy account, obtained from {0}'
-                              .format(ACCOUNT_URL)
-            }
-        )
-
-    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_dnsmadeeasy_client().add_txt_record(domain, validation_name, validation)
-
-    def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_dnsmadeeasy_client().del_txt_record(domain, validation_name, validation)
-
-    def _get_dnsmadeeasy_client(self) -> "_DNSMadeEasyLexiconClient":
-        if not self.credentials:  # pragma: no cover
-            raise errors.Error("Plugin has not been prepared.")
-        return _DNSMadeEasyLexiconClient(self.credentials.conf('api-key'),
-                                         self.credentials.conf('secret-key'),
-                                         self.ttl)
-
-
-class _DNSMadeEasyLexiconClient(dns_common_lexicon.LexiconClient):
-    """
-    Encapsulates all communication with the DNS Made Easy via Lexicon.
-    """
-
-    def __init__(self, api_key: str, secret_key: str, ttl: int) -> None:
-        super().__init__()
-
-        config = dns_common_lexicon.build_lexicon_config('dnsmadeeasy', {
-            'ttl': ttl,
-        }, {
-            'auth_username': api_key,
-            'auth_token': secret_key,
-        })
-
-        self.provider = dnsmadeeasy.Provider(config)
+    @property
+    def _provider_name(self) -> str:
+        return 'dnsmadeeasy'
 
-    def _handle_http_error(self, e: HTTPError, domain_name: str) -> Optional[errors.PluginError]:
-        if domain_name in str(e) and str(e).startswith('404 Client Error: Not Found for url:'):
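The refactor above replaces the hand-written `_DNSMadeEasyLexiconClient` with declarative `_add_provider_option` calls on the new `LexiconDNSAuthenticator` base class. The essential idea, mapping configured credential names onto the keyword names a Lexicon provider expects, can be sketched in plain Python (this registry class is an illustration, not Certbot's actual implementation):

```python
class ProviderOptionRegistry:
    """Illustrative stand-in for the _add_provider_option mechanism."""

    def __init__(self):
        # Each entry: (credential name, help text, Lexicon auth parameter)
        self._options = []

    def add_provider_option(self, name, description, lexicon_key):
        self._options.append((name, description, lexicon_key))

    def to_lexicon_auth(self, credentials):
        # Translate user-configured credentials into the auth keywords
        # that a Lexicon provider such as 'dnsmadeeasy' consumes.
        return {key: credentials[name] for name, _desc, key in self._options}


registry = ProviderOptionRegistry()
registry.add_provider_option('api-key', 'API key for DNS Made Easy account', 'auth_username')
registry.add_provider_option('secret-key', 'Secret key for DNS Made Easy account', 'auth_token')
print(registry.to_lexicon_auth({'api-key': 'foo', 'secret-key': 'bar'}))
```

With this shape, the base class can own credential parsing, TXT record creation, and cleanup, and each plugin shrinks to a provider name plus its option declarations.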
@@ -1,10 +1,10 @@
 """Tests for certbot_dns_dnsmadeeasy._internal.dns_dnsmadeeasy."""
 
 import sys
-import unittest
 from unittest import mock
 
 import pytest
+from requests import Response
 from requests.exceptions import HTTPError
 
 from certbot.compat import os
@@ -18,7 +18,10 @@ SECRET_KEY = 'bar'
 
 
 class AuthenticatorTest(test_util.TempDirTestCase,
-                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
+                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
+
+    DOMAIN_NOT_FOUND = HTTPError(f'404 Client Error: Not Found for url: {DOMAIN}.', response=Response())
+    LOGIN_ERROR = HTTPError(f'403 Client Error: Forbidden for url: {DOMAIN}.', response=Response())
 
     def setUp(self):
         super().setUp()
@@ -35,24 +38,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
 
         self.auth = Authenticator(self.config, "dnsmadeeasy")
 
-        self.mock_client = mock.MagicMock()
-        # _get_dnsmadeeasy_client | pylint: disable=protected-access
-        self.auth._get_dnsmadeeasy_client = mock.MagicMock(return_value=self.mock_client)
-
-
-class DNSMadeEasyLexiconClientTest(unittest.TestCase,
-                                   dns_test_common_lexicon.BaseLexiconClientTest):
-    DOMAIN_NOT_FOUND = HTTPError('404 Client Error: Not Found for url: {0}.'.format(DOMAIN))
-    LOGIN_ERROR = HTTPError('403 Client Error: Forbidden for url: {0}.'.format(DOMAIN))
-
-    def setUp(self):
-        from certbot_dns_dnsmadeeasy._internal.dns_dnsmadeeasy import _DNSMadeEasyLexiconClient
-
-        self.client = _DNSMadeEasyLexiconClient(API_KEY, SECRET_KEY, 0)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
 
 
 if __name__ == "__main__":
     sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -16,7 +16,7 @@
 # add these directories to sys.path here. If the directory is relative to the
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #
-import os
+# import os
 
 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']
 
 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #
 
-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'
 
 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the
@@ -4,14 +4,16 @@ import sys
 from setuptools import find_packages
 from setuptools import setup
 
-version = '2.6.0.dev0'
+version = '2.12.0.dev0'
 
 install_requires = [
-    'dns-lexicon>=3.2.1',
+    'dns-lexicon>=3.14.1',
     'setuptools>=41.6.0',
 ]
 
-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
         # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')
 
 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
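The dependency restructuring in the hunks above folds the two separate `SNAP_BUILD` checks into a single if/else. The resulting behavior can be shown as a runnable sketch (the helper function and its parameters are illustrative, not part of the actual `setup.py`):

```python
def build_install_requires(version, snap_build):
    """Sketch of the restructured dependency logic (illustrative helper)."""
    install_requires = [
        'dns-lexicon>=3.14.1',
        'setuptools>=41.6.0',
    ]
    if snap_build:
        # Inside the Certbot snap, certbot itself is already bundled,
        # so only 'packaging' is added.
        install_requires.append('packaging')
    else:
        # Outside the snap, require acme/certbot at least as new as the plugin.
        install_requires.extend([f'acme>={version}', f'certbot>={version}'])
    return install_requires


print(build_install_requires('2.12.0.dev0', snap_build=False))
```

Note that the old `bdist_wheel` guard (which raised `RuntimeError` when `SNAP_BUILD` was set while building wheels) is removed entirely in this change.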
33
certbot-dns-gehirn/.readthedocs.yaml
Normal file
@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: certbot-dns-gehirn/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: certbot-dns-gehirn/readthedocs.org.requirements.txt
@@ -4,20 +4,17 @@ from typing import Any
 from typing import Callable
 from typing import Optional
 
-from lexicon.providers import gehirn
-from requests import HTTPError
 
 from certbot import errors
 from certbot.plugins import dns_common
 from certbot.plugins import dns_common_lexicon
-from certbot.plugins.dns_common import CredentialsConfiguration
 
 logger = logging.getLogger(__name__)
 
 DASHBOARD_URL = "https://gis.gehirn.jp/"
 
 
-class Authenticator(dns_common.DNSAuthenticator):
+class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
     """DNS Authenticator for Gehirn Infrastructure Service DNS
 
     This Authenticator uses the Gehirn Infrastructure Service API to fulfill
@@ -26,11 +23,17 @@ class Authenticator(dns_common.DNSAuthenticator):
 
     description = 'Obtain certificates using a DNS TXT record ' + \
                   '(if you are using Gehirn Infrastructure Service for DNS).'
     ttl = 60
 
     def __init__(self, *args: Any, **kwargs: Any) -> None:
         super().__init__(*args, **kwargs)
-        self.credentials: Optional[CredentialsConfiguration] = None
+        self._add_provider_option('api-token',
+                                  'API token for Gehirn Infrastructure Service '
+                                  f'API obtained from {DASHBOARD_URL}',
+                                  'auth_token')
+        self._add_provider_option('api-secret',
+                                  'API secret for Gehirn Infrastructure Service '
+                                  f'API obtained from {DASHBOARD_URL}',
+                                  'auth_secret')
 
     @classmethod
     def add_parser_arguments(cls, add: Callable[..., None],
@@ -42,50 +45,9 @@ class Authenticator(dns_common.DNSAuthenticator):
         return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
                'the Gehirn Infrastructure Service API.'
 
-    def _setup_credentials(self) -> None:
-        self.credentials = self._configure_credentials(
-            'credentials',
-            'Gehirn Infrastructure Service credentials file',
-            {
-                'api-token': 'API token for Gehirn Infrastructure Service ' + \
-                             'API obtained from {0}'.format(DASHBOARD_URL),
-                'api-secret': 'API secret for Gehirn Infrastructure Service ' + \
-                              'API obtained from {0}'.format(DASHBOARD_URL),
-            }
-        )
-
-    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_gehirn_client().add_txt_record(domain, validation_name, validation)
-
-    def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_gehirn_client().del_txt_record(domain, validation_name, validation)
-
-    def _get_gehirn_client(self) -> "_GehirnLexiconClient":
-        if not self.credentials:  # pragma: no cover
-            raise errors.Error("Plugin has not been prepared.")
-        return _GehirnLexiconClient(
-            self.credentials.conf('api-token'),
-            self.credentials.conf('api-secret'),
-            self.ttl
-        )
-
-
-class _GehirnLexiconClient(dns_common_lexicon.LexiconClient):
-    """
-    Encapsulates all communication with the Gehirn Infrastructure Service via Lexicon.
-    """
-
-    def __init__(self, api_token: str, api_secret: str, ttl: int) -> None:
-        super().__init__()
-
-        config = dns_common_lexicon.build_lexicon_config('gehirn', {
-            'ttl': ttl,
-        }, {
-            'auth_token': api_token,
-            'auth_secret': api_secret,
-        })
-
-        self.provider = gehirn.Provider(config)
+    @property
+    def _provider_name(self) -> str:
+        return 'gehirn'
 
-    def _handle_http_error(self, e: HTTPError, domain_name: str) -> Optional[errors.PluginError]:
-        if domain_name in str(e) and (str(e).startswith('404 Client Error: Not Found for url:')):
@@ -1,10 +1,10 @@
 """Tests for certbot_dns_gehirn._internal.dns_gehirn."""
 
 import sys
-import unittest
 from unittest import mock
 
 import pytest
+from requests import Response
 from requests.exceptions import HTTPError
 
 from certbot.compat import os
@@ -16,8 +16,12 @@ from certbot.tests import util as test_util
 API_TOKEN = '00000000-0000-0000-0000-000000000000'
 API_SECRET = 'MDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAwMDAw'
 
 
 class AuthenticatorTest(test_util.TempDirTestCase,
-                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
+                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
+
+    DOMAIN_NOT_FOUND = HTTPError(f'404 Client Error: Not Found for url: {DOMAIN}.', response=Response())
+    LOGIN_ERROR = HTTPError(f'401 Client Error: Unauthorized for url: {DOMAIN}.', response=Response())
 
     def setUp(self):
         super().setUp()
@@ -35,23 +39,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,
 
         self.auth = Authenticator(self.config, "gehirn")
 
-        self.mock_client = mock.MagicMock()
-        # _get_gehirn_client | pylint: disable=protected-access
-        self.auth._get_gehirn_client = mock.MagicMock(return_value=self.mock_client)
-
-
-class GehirnLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
-    DOMAIN_NOT_FOUND = HTTPError('404 Client Error: Not Found for url: {0}.'.format(DOMAIN))
-    LOGIN_ERROR = HTTPError('401 Client Error: Unauthorized for url: {0}.'.format(DOMAIN))
-
-    def setUp(self):
-        from certbot_dns_gehirn._internal.dns_gehirn import _GehirnLexiconClient
-
-        self.client = _GehirnLexiconClient(API_TOKEN, API_SECRET, 0)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
 
 
 if __name__ == "__main__":
     sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
@@ -16,7 +16,7 @@
 # add these directories to sys.path here. If the directory is relative to the
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #
-import os
+# import os
 
 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']
 
 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #
 
-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'
 
 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the
@@ -4,14 +4,16 @@ import sys
 from setuptools import find_packages
 from setuptools import setup
 
-version = '2.6.0.dev0'
+version = '2.12.0.dev0'
 
 install_requires = [
-    'dns-lexicon>=3.2.1',
+    'dns-lexicon>=3.14.1',
     'setuptools>=41.6.0',
 ]
 
-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
        # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')
 
 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
33
certbot-dns-google/.readthedocs.yaml
Normal file
@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: certbot-dns-google/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: certbot-dns-google/readthedocs.org.requirements.txt
@@ -14,7 +14,14 @@ Named Arguments
 ========================================  =====================================
 ``--dns-google-credentials``              Google Cloud Platform credentials_
                                           JSON file.
-                                          (Required - Optional on Google Compute Engine)
+
+                                          (Required if not using `Application Default
+                                          Credentials <https://cloud.google.com/docs/authentication/
+                                          application-default-credentials>`_.)
+``--dns-google-project``                  The ID of the Google Cloud project that the Google
+                                          Cloud DNS managed zone(s) reside in.
+
+                                          (Default: project that the Google credentials_ belong to)
 ``--dns-google-propagation-seconds``      The number of seconds to wait for DNS
                                           to propagate before asking the ACME
                                           server to verify the DNS record.
@@ -25,45 +32,37 @@ Named Arguments
 Credentials
 -----------
 
-Use of this plugin requires Google Cloud Platform API credentials
-for an account with the following permissions:
+Use of this plugin requires Google Cloud Platform credentials with the ability to modify the Cloud
+DNS managed zone(s) for which certificates are being issued.
 
-* ``dns.changes.create``
-* ``dns.changes.get``
-* ``dns.changes.list``
-* ``dns.managedZones.get``
-* ``dns.managedZones.list``
-* ``dns.resourceRecordSets.create``
-* ``dns.resourceRecordSets.delete``
-* ``dns.resourceRecordSets.list``
-* ``dns.resourceRecordSets.update``
+In most cases, configuring credentials for Certbot will require `creating a service account
+<https://cloud.google.com/iam/docs/service-accounts-create>`_, and then either `granting permissions
+with predefined roles`_ or `granting permissions with custom roles`_ using IAM.
 
-(The closest role is `dns.admin <https://cloud.google.com/dns/docs/
-access-control#dns.admin>`_).
-
-If the above permissions are assigned at the `resource level <https://cloud
-.google.com/dns/docs/zones/iam-per-resource-zones>`_, the same user must
-have, at the PROJECT level, the following permissions:
+Providing Credentials
+^^^^^^^^^^^^^^^^^^^^^
 
-* ``dns.managedZones.get``
-* ``dns.managedZones.list``
+The preferred method of providing credentials is to `set up Application Default Credentials
+<https://cloud.google.com/docs/authentication/provide-credentials-adc>`_ (ADC) in the environment
+that Certbot is running in.
 
-(The closest role is `dns.reader <https://cloud.google.com/dns/docs/
-access-control#dns.reader>`_).
+If you are running Certbot on Google Cloud then a service account can be assigned directly to most
+types of workload, including `Compute Engine VMs <https://cloud.google.com/compute/docs/access/
+create-enable-service-accounts-for-instances>`_, `Kubernetes Engine Pods <https://cloud.google.com/
+kubernetes-engine/docs/how-to/workload-identity>`_, `Cloud Run jobs <https://cloud.google.com/run
+/docs/securing/service-identity>`_, `Cloud Functions <https://cloud.google.com/functions/docs/
+securing/function-identity>`_, and `Cloud Builds <https://cloud.google.com/build/docs/securing-
+builds/configure-user-specified-service-accounts>`_.
 
-Google provides instructions for `creating a service account <https://developers
-.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount>`_ and
-`information about the required permissions <https://cloud.google.com/dns/access
--control#permissions_and_roles>`_. If you're running on Google Compute Engine,
-you can `assign the service account to the instance <https://cloud.google.com/
-compute/docs/access/create-enable-service-accounts-for-instances>`_ which
-is running certbot. A credentials file is not required in this case, as they
-are automatically obtained by certbot through the `metadata service
-<https://cloud.google.com/compute/docs/storing-retrieving-metadata>`_ .
+If you are not running Certbot on Google Cloud then a credentials file should be provided using the
+``--dns-google-credentials`` command-line argument. Google provides documentation for `creating
+service account keys <https://cloud.google.com/iam/docs/keys-create-delete#creating>`_, which is the
+most common method of using a service account outside of Google Cloud.
 
 .. code-block:: json
-   :name: credentials.json
-   :caption: Example credentials file:
+   :name: credentials-sa.json
+   :caption: Example service account key file:
 
    {
     "type": "service_account",
@@ -78,12 +77,8 @@ are automatically obtained by certbot through the `metadata service
     "client_x509_cert_url": "..."
   }
 
-The path to this file can be provided interactively or using the
-``--dns-google-credentials`` command-line argument. Certbot records the path
-to this file for use during renewal, but does not store the file's contents.
-
 .. caution::
-   You should protect these API credentials as you would a password. Users who
+   You should protect these credentials as you would a password. Users who
    can read this file can use these credentials to issue some types of API calls
    on your behalf, limited by the permissions assigned to the account. Users who
    can cause Certbot to run using these credentials can complete a ``dns-01``
@@ -97,35 +92,132 @@ file. This warning will be emitted each time Certbot uses the credentials file,
 including for renewal, and cannot be silenced except by addressing the issue
 (e.g., by using a command like ``chmod 600`` to restrict access to the file).
 
+If you are running Certbot within another cloud platform, a CI platform, or any other platform that
+supports issuing OpenID Connect Tokens, then you may also have the option of securely authenticating
+with `workload identity federation <https://cloud.google.com/iam/docs/workload-identity-
+federation>`_. Instructions are generally available for most platforms, including `AWS or Azure
+<https://cloud.google.com/iam/docs/workload-identity-federation-with-other-clouds>`_, `GitHub
+Actions <https://cloud.google.com/blog/products/identity-security/enabling-keyless-authentication
+-from-github-actions>`_, and `GitLab CI <https://docs.gitlab.com/ee/ci/cloud_services/
+google_cloud/>`_.
+
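As background to the ADC recommendation above, Application Default Credentials are resolved in a fixed search order: the ``GOOGLE_APPLICATION_CREDENTIALS`` environment variable, then the gcloud well-known credentials file, then the metadata server of a Google Cloud workload. A simplified sketch of that search order (illustrative only, not Google's actual client code):

```python
def resolve_adc(env, has_gcloud_user_creds, on_google_cloud):
    """Simplified sketch of the Application Default Credentials search order."""
    if 'GOOGLE_APPLICATION_CREDENTIALS' in env:
        # 1. Explicit service account key file path set in the environment.
        return env['GOOGLE_APPLICATION_CREDENTIALS']
    if has_gcloud_user_creds:
        # 2. Credentials created by `gcloud auth application-default login`.
        return '~/.config/gcloud/application_default_credentials.json'
    if on_google_cloud:
        # 3. The service account attached to the workload, via the metadata server.
        return 'metadata-server'
    raise RuntimeError('No Application Default Credentials found')


print(resolve_adc({}, has_gcloud_user_creds=False, on_google_cloud=True))
```

This is why a credentials file is unnecessary when Certbot runs on Google Cloud with a service account attached: the lookup falls through to the metadata server.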
+Granting Permissions with Predefined Roles
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The simplest method of granting the required permissions to the user or service account that Certbot
+is authenticating with is to use either of these predefined role strategies:
+
+* `dns.admin <https://cloud.google.com/dns/docs/access-control#dns.admin>`_ against the *DNS
+  zone(s)* that Certbot will be issuing certificates for.
+* `dns.reader <https://cloud.google.com/dns/docs/access-control#dns.reader>`_ against the *project*
+  containing the relevant DNS zones.
+
+*or*
+
+* `dns.admin <https://cloud.google.com/dns/docs/access-control#dns.admin>`_ against the *project*
+  containing the relevant DNS zones
+
+For instructions on how to grant roles, please read the Google provided documentation for `granting
+access roles against a project <https://cloud.google.com/iam/docs/granting-changing-revoking-access
+#single-role>`_ and `granting access roles against zones <https://cloud.google.com/dns/docs/zones/
+iam-per-resource-zones#set_access_control_policy_for_a_specific_resource>`_.
+
+.. caution::
+   Granting the ``dns.admin`` role at the project level can present a significant security risk. It
+   provides full administrative access to all DNS zones within the project, granting the ability to
+   perform any action up to and including deleting all zones within a project.
+
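Both strategies grant the same capabilities at different scopes; whether a given grant covers Certbot's needs reduces to a superset check over two permission sets (the permission lists below are taken from this documentation change; the helper itself is illustrative):

```python
# Record-editing permissions Certbot needs on the zone (or project).
CERTBOT_ZONE_EDIT = {
    'dns.changes.create', 'dns.changes.get', 'dns.changes.list',
    'dns.resourceRecordSets.create', 'dns.resourceRecordSets.delete',
    'dns.resourceRecordSets.list', 'dns.resourceRecordSets.update',
}
# Zone-discovery permissions Certbot needs on the project.
CERTBOT_ZONE_LIST = {'dns.managedZones.get', 'dns.managedZones.list'}


def covers_certbot(zone_perms, project_perms):
    """Return True if the granted permission sets cover Certbot's requirements."""
    return CERTBOT_ZONE_EDIT <= set(zone_perms) and CERTBOT_ZONE_LIST <= set(project_perms)


print(covers_certbot(CERTBOT_ZONE_EDIT, CERTBOT_ZONE_LIST))
```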
Granting Permissions with Custom Roles
|
||||
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
|
||||
|
||||
Custom roles are an alternative to predefined roles that provide the ability to define fine grained
|
||||
permission sets for specific use cases. They should generally be used when it is desirable to adhere
|
||||
to the principle of least privilege, such as within production or other security sensitive
|
||||
workloads.
|
||||
|
||||
The following is an example strategy for granting granular permissions to Certbot using custom
|
||||
roles. If you are not already familiar with how to do so, Google provides documentation for
|
||||
`creating a custom IAM role <https://cloud.google.com/iam/docs/creating-custom-roles#creating>`_.
|
||||
|
||||
Firstly, create a custom role containing the permissions required to make DNS record updates. We
|
||||
suggest naming the custom role ``Certbot - Zone Editor`` with the ID ``certbot.zoneEditor``. The
|
||||
following permissions are required:
|
||||
|
||||
* ``dns.changes.create``
|
||||
* ``dns.changes.get``
|
||||
* ``dns.changes.list``
|
||||
* ``dns.resourceRecordSets.create``
|
||||
* ``dns.resourceRecordSets.delete``
|
||||
* ``dns.resourceRecordSets.list``
|
||||
* ``dns.resourceRecordSets.update``
|
||||
|
||||
Next, create a custom role granting Certbot the ability to discover DNS zones. We suggest naming the
|
||||
custom role ``Certbot - Zone Lister`` with the ID ``certbot.zoneLister``. The following permissions
|
||||
are required:
|
||||
|
||||
* ``dns.managedZones.get``
|
||||
* ``dns.managedZones.list``
|
||||
|
||||
Finally, grant the custom roles to the user or service account that Certbot authenticates with:

* Grant your custom ``Certbot - Zone Editor`` role on the *DNS zone(s)* that Certbot will be
  issuing certificates for.
* Grant your custom ``Certbot - Zone Lister`` role on the *project* containing the relevant DNS
  zones.

For instructions on how to grant roles, please read the Google-provided documentation for `granting
access roles against a project <https://cloud.google.com/iam/docs/granting-changing-revoking-access
#single-role>`_ and `granting access roles against zones <https://cloud.google.com/dns/docs/zones/
iam-per-resource-zones#set_access_control_policy_for_a_specific_resource>`_.
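These grants can also be made with the ``gcloud`` CLI. The commands below are a sketch only;
``my-project``, ``my-zone``, and the service account address are placeholders for your own values:

.. code-block:: bash

   gcloud dns managed-zones add-iam-policy-binding my-zone --project=my-project \\
     --member=serviceAccount:certbot@my-project.iam.gserviceaccount.com \\
     --role=projects/my-project/roles/certbot.zoneEditor

   gcloud projects add-iam-policy-binding my-project \\
     --member=serviceAccount:certbot@my-project.iam.gserviceaccount.com \\
     --role=projects/my-project/roles/certbot.zoneLister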
Examples
--------

.. code-block:: bash
   :caption: To acquire a certificate for ``example.com``, providing a credentials file

   certbot certonly \\
     --dns-google \\
     --dns-google-credentials ~/.secrets/certbot/google.json \\
     -d example.com

.. code-block:: bash
   :caption: To acquire a certificate for ``example.com``, where ADC is available and
             a credentials file is not required

   certbot certonly \\
     --dns-google \\
     -d example.com

.. code-block:: bash
   :caption: To acquire a single certificate for both ``example.com`` and
             ``www.example.com``

   certbot certonly \\
     --dns-google \\
     --dns-google-credentials ~/.secrets/certbot/google.json \\
     -d example.com \\
     -d www.example.com

.. code-block:: bash
   :caption: To acquire a certificate for ``example.com``, where the managed DNS
             zone resides in another Google Cloud project

   certbot certonly \\
     --dns-google \\
     --dns-google-credentials ~/.secrets/certbot/google-project-test-foo.json \\
     --dns-google-project test-bar \\
     -d example.com

.. code-block:: bash
   :caption: To acquire a certificate for ``example.com``, waiting 120 seconds
             for DNS propagation

   certbot certonly \\
     --dns-google \\
     --dns-google-credentials ~/.secrets/certbot/google.json \\
     --dns-google-propagation-seconds 120 \\
     -d example.com
@@ -1,21 +1,22 @@
"""DNS Authenticator for Google Cloud DNS."""
import json
import logging
from typing import Any
from typing import Callable
from typing import Dict
from typing import Optional

import google.auth

from google.auth import exceptions as googleauth_exceptions
from googleapiclient import discovery
from googleapiclient import errors as googleapiclient_errors
import httplib2
from oauth2client.service_account import ServiceAccountCredentials

from certbot import errors
from certbot.plugins import dns_common

logger = logging.getLogger(__name__)

ADC_URL = 'https://cloud.google.com/docs/authentication/application-default-credentials'
ACCT_URL = 'https://developers.google.com/identity/protocols/OAuth2ServiceAccount#creatinganaccount'
PERMISSIONS_URL = 'https://cloud.google.com/dns/access-control#permissions_and_roles'
METADATA_URL = 'http://metadata.google.internal/computeMetadata/v1/'
@@ -31,15 +32,23 @@ class Authenticator(dns_common.DNSAuthenticator):
    description = ('Obtain certificates using a DNS TXT record (if you are using Google Cloud DNS '
                   'for DNS).')
    ttl = 60
    google_client = None

    @classmethod
    def add_parser_arguments(cls, add: Callable[..., None],
                             default_propagation_seconds: int = 60) -> None:
        super().add_parser_arguments(add, default_propagation_seconds=60)
        add('credentials',
            help=('Path to Google Cloud DNS service account JSON file. (See {0} for' +
                  'information about creating a service account and {1} for information about the' +
                  'required permissions.)').format(ACCT_URL, PERMISSIONS_URL),
            help=('Path to Google Cloud DNS service account JSON file to use instead of relying on'
                  ' Application Default Credentials (ADC). (See {0} for information about ADC, {1}'
                  ' for information about creating a service account, and {2} for information about'
                  ' the permissions required to modify Cloud DNS records.)')
                 .format(ADC_URL, ACCT_URL, PERMISSIONS_URL),
            default=None)

        add('project',
            help=('The ID of the Google Cloud project that the Google Cloud DNS managed zone(s)' +
                  ' reside in. This will be determined automatically if not specified.'),
            default=None)

    def more_info(self) -> str:
@@ -47,22 +56,19 @@ class Authenticator(dns_common.DNSAuthenticator):
               'the Google Cloud DNS API.'

    def _setup_credentials(self) -> None:
        if self.conf('credentials') is None:
            try:
                # use project_id query to check for availability of google metadata server
                # we won't use the result but know we're not on GCP when an exception is thrown
                _GoogleClient.get_project_id()
            except (ValueError, httplib2.ServerNotFoundError):
                raise errors.PluginError('Unable to get Google Cloud Metadata and no credentials'
                                         ' specified. Automatic credential lookup is only '
                                         'available on Google Cloud Platform. Please configure'
                                         ' credentials using --dns-google-credentials <file>')
        else:
        if self.conf('credentials') is not None:
            self._configure_file('credentials',
                                 'path to Google Cloud DNS service account JSON file')

            dns_common.validate_file_permissions(self.conf('credentials'))

        try:
            self._get_google_client()
        except googleauth_exceptions.DefaultCredentialsError as e:
            raise errors.PluginError('Authentication using Google Application Default Credentials '
                                     'failed ({}). Please configure credentials using'
                                     ' --dns-google-credentials <file>'.format(e))

    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
        self._get_google_client().add_txt_record(domain, validation_name, validation, self.ttl)

@@ -70,7 +76,10 @@ class Authenticator(dns_common.DNSAuthenticator):
        self._get_google_client().del_txt_record(domain, validation_name, validation, self.ttl)

    def _get_google_client(self) -> '_GoogleClient':
        return _GoogleClient(self.conf('credentials'))
        if self.google_client is None:
            self.google_client = _GoogleClient(self.conf('credentials'), self.conf('project'))
        return self.google_client

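The `_get_google_client` change above memoizes the client, so repeated `_perform`/`_cleanup` calls reuse a single `_GoogleClient` instead of rebuilding it each time. A minimal, self-contained sketch of that caching pattern (toy names, not the real plugin classes):

```python
class Authenticator:
    """Toy stand-in illustrating the client-caching pattern used in the diff."""

    def __init__(self) -> None:
        self.google_client = None
        self.constructions = 0  # instrumentation for this example only

    def _build_client(self) -> object:
        # stands in for _GoogleClient(self.conf('credentials'), self.conf('project'))
        self.constructions += 1
        return object()

    def _get_google_client(self) -> object:
        # construct the client once, then return the cached instance on later calls
        if self.google_client is None:
            self.google_client = self._build_client()
        return self.google_client
```

Because the client is now shared, construction errors surface once in `_setup_credentials` rather than on every challenge.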
class _GoogleClient:
@@ -79,20 +88,31 @@ class _GoogleClient:
    """

    def __init__(self, account_json: Optional[str] = None,
                 dns_project_id: Optional[str] = None,
                 dns_api: Optional[discovery.Resource] = None) -> None:

        scopes = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']
        credentials = None
        project_id = None

        if account_json is not None:
            try:
                credentials = ServiceAccountCredentials.from_json_keyfile_name(account_json, scopes)
                with open(account_json) as account:
                    self.project_id = json.load(account)['project_id']
            except Exception as e:
                credentials, project_id = google.auth.load_credentials_from_file(
                    account_json, scopes=scopes)
            except googleauth_exceptions.GoogleAuthError as e:
                raise errors.PluginError(
                    "Error parsing credentials file '{}': {}".format(account_json, e))
                    "Error loading credentials file '{}': {}".format(account_json, e))
        else:
            credentials = None
            self.project_id = self.get_project_id()
            credentials, project_id = google.auth.default(scopes=scopes)

        if dns_project_id is not None:
            project_id = dns_project_id

        if not project_id:
            raise errors.PluginError('The Google Cloud project could not be automatically '
                                     'determined. Please configure it using --dns-google-project'
                                     ' <project>.')
        self.project_id = project_id

        if not dns_api:
            self.dns = discovery.build('dns', 'v1',
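The rewritten `__init__` resolves the Cloud DNS project in a fixed order: the project detected alongside the credentials (from the JSON file or from ADC), overridden by an explicit `--dns-google-project`, with an error if neither yields a value. A minimal sketch of that precedence, as a standalone function rather than the plugin's actual API:

```python
from typing import Optional

def resolve_project_id(detected: Optional[str], override: Optional[str]) -> str:
    """Pick the Cloud DNS project ID: an explicit override always wins."""
    project_id = override if override is not None else detected
    if not project_id:
        # mirrors the PluginError raised by the plugin when no project is found
        raise ValueError('The Google Cloud project could not be automatically '
                         'determined. Please configure it using '
                         '--dns-google-project <project>.')
    return project_id
```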
@@ -286,32 +306,9 @@ class _GoogleClient:

        for zone in zones:
            zone_id = zone['id']
            if 'privateVisibilityConfig' not in zone:
            if zone['visibility'] == "public":
                logger.debug('Found id of %s for %s using name %s', zone_id, domain, zone_name)
                return zone_id

        raise errors.PluginError('Unable to determine managed zone for {0} using zone names: {1}.'
                                 .format(domain, zone_dns_name_guesses))

    @staticmethod
    def get_project_id() -> str:
        """
        Query the google metadata service for the current project ID

        This only works on Google Cloud Platform

        :raises ServerNotFoundError: Not running on Google Compute or DNS not available
        :raises ValueError: Server is found, but response code is not 200
        :returns: project id
        """
        url = '{0}project/project-id'.format(METADATA_URL)

        # Request an access token from the metadata server.
        http = httplib2.Http()
        r, content = http.request(url, headers=METADATA_HEADERS)
        if r.status != 200:
            raise ValueError("Invalid status code: {0}".format(r))

        if isinstance(content, bytes):
            return content.decode()
        return content

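The zone-lookup hunk above changes the filter from `'privateVisibilityConfig' not in zone` to `zone['visibility'] == "public"`, so split-horizon setups with a same-name private zone pick the public one. The candidate names it tries (`zone_dns_name_guesses`) are the progressively shorter parents of the validated domain. A self-contained sketch of both pieces, assuming behavior like certbot's `dns_common.base_domain_name_guesses`:

```python
from typing import List, Optional

def base_domain_name_guesses(domain: str) -> List[str]:
    # 'a.example.com' -> ['a.example.com', 'example.com', 'com']
    fragments = domain.split('.')
    return ['.'.join(fragments[i:]) for i in range(len(fragments))]

def find_public_zone_id(zones: List[dict]) -> Optional[str]:
    # Prefer the public zone so a same-name private (split-horizon) zone is skipped.
    for zone in zones:
        if zone.get('visibility') == 'public':
            return zone['id']
    return None
```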
@@ -4,10 +4,10 @@ import sys
import unittest
from unittest import mock

from google.auth import exceptions as googleauth_exceptions
from googleapiclient import discovery
from googleapiclient.errors import Error
from googleapiclient.http import HttpMock
from httplib2 import ServerNotFoundError
import pytest

from certbot import errors
@@ -20,7 +20,7 @@ from certbot.tests import util as test_util
ACCOUNT_JSON_PATH = '/not/a/real/path.json'
API_ERROR = Error()
PROJECT_ID = "test-test-1"

SCOPES = ['https://www.googleapis.com/auth/ndev.clouddns.readwrite']

class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthenticatorTest):

@@ -34,22 +34,25 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic

        super().setUp()
        self.config = mock.MagicMock(google_credentials=path,
                                     google_project=PROJECT_ID,
                                     google_propagation_seconds=0)  # don't wait during tests

        self.auth = Authenticator(self.config, "google")

        self.mock_client = mock.MagicMock()
        # _get_google_client | pylint: disable=protected-access
        self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)

    @test_util.patch_display_util()
    def test_perform(self, unused_mock_get_utility):
        # _get_google_client | pylint: disable=protected-access
        self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)
        self.auth.perform([self.achall])

        expected = [mock.call.add_txt_record(DOMAIN, '_acme-challenge.'+DOMAIN, mock.ANY, mock.ANY)]
        assert expected == self.mock_client.mock_calls

    def test_cleanup(self):
        # _get_google_client | pylint: disable=protected-access
        self.auth._get_google_client = mock.MagicMock(return_value=self.mock_client)
        # _attempt_cleanup | pylint: disable=protected-access
        self.auth._attempt_cleanup = True
        self.auth.cleanup([self.achall])
@@ -57,13 +60,27 @@ class AuthenticatorTest(test_util.TempDirTestCase, dns_test_common.BaseAuthentic
        expected = [mock.call.del_txt_record(DOMAIN, '_acme-challenge.'+DOMAIN, mock.ANY, mock.ANY)]
        assert expected == self.mock_client.mock_calls

    @mock.patch('httplib2.Http.request', side_effect=ServerNotFoundError)
    @test_util.patch_display_util()
    def test_without_auth(self, unused_mock_get_utility, unused_mock):
    def test_without_auth(self, unused_mock_get_utility):
        self.auth._get_google_client = mock.MagicMock(side_effect=googleauth_exceptions.DefaultCredentialsError)
        self.config.google_credentials = None
        with pytest.raises(PluginError):
            self.auth.perform([self.achall])

    @mock.patch('certbot_dns_google._internal.dns_google._GoogleClient')
    def test_get_google_client(self, client_mock):
        test_client = mock.MagicMock()
        client_mock.return_value = test_client

        self.auth._get_google_client()
        assert client_mock.called
        assert self.auth.google_client is test_client

    def test_get_google_client_cached(self):
        test_client = mock.MagicMock()
        self.auth.google_client = test_client
        assert self.auth._get_google_client() is test_client

class GoogleClientTest(unittest.TestCase):
    record_name = "foo"
@@ -71,6 +88,7 @@ class GoogleClientTest(unittest.TestCase):
    record_ttl = 42
    zone = "ZONE_ID"
    change = "an-id"
    visibility = "public"

    def _setUp_client_with_mock(self, zone_request_side_effect, rrs_list_side_effect=None):
        from certbot_dns_google._internal.dns_google import _GoogleClient
@@ -81,7 +99,7 @@ class GoogleClientTest(unittest.TestCase):
        http_mock = HttpMock(discovery_file, {'status': '200'})
        dns_api = discovery.build('dns', 'v1', http=http_mock)

        client = _GoogleClient(ACCOUNT_JSON_PATH, dns_api)
        client = _GoogleClient(ACCOUNT_JSON_PATH, None, dns_api)

        # Setup
        mock_mz = mock.MagicMock()
@@ -107,32 +125,67 @@ class GoogleClientTest(unittest.TestCase):
        return client, mock_changes

    @mock.patch('googleapiclient.discovery.build')
    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('certbot_dns_google._internal.dns_google._GoogleClient.get_project_id')
    def test_client_without_credentials(self, get_project_id_mock, credential_mock,
                                        unused_discovery_mock):
    @mock.patch('google.auth.default')
    def test_client_with_default_credentials(self, credential_mock, discovery_mock):
        test_credentials = mock.MagicMock()
        credential_mock.return_value = (test_credentials, PROJECT_ID)
        from certbot_dns_google._internal.dns_google import _GoogleClient
        _GoogleClient(None)
        assert not credential_mock.called
        assert get_project_id_mock.called
        client = _GoogleClient(None)
        credential_mock.assert_called_once_with(scopes=SCOPES)
        assert client.project_id == PROJECT_ID
        discovery_mock.assert_called_once_with('dns', 'v1',
                                               credentials=test_credentials,
                                               cache_discovery=False)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('googleapiclient.discovery.build')
    @mock.patch('google.auth.load_credentials_from_file')
    def test_client_with_json_credentials(self, credential_mock, discovery_mock):
        test_credentials = mock.MagicMock()
        credential_mock.return_value = (test_credentials, PROJECT_ID)
        from certbot_dns_google._internal.dns_google import _GoogleClient
        client = _GoogleClient(ACCOUNT_JSON_PATH)
        credential_mock.assert_called_once_with(ACCOUNT_JSON_PATH, scopes=SCOPES)
        assert credential_mock.called
        assert client.project_id == PROJECT_ID
        discovery_mock.assert_called_once_with('dns', 'v1',
                                               credentials=test_credentials,
                                               cache_discovery=False)

    @mock.patch('google.auth.load_credentials_from_file')
    def test_client_bad_credentials_file(self, credential_mock):
        credential_mock.side_effect = ValueError('Some exception buried in oauth2client')
        credential_mock.side_effect = googleauth_exceptions.DefaultCredentialsError('Some exception buried in google.auth')
        with pytest.raises(errors.PluginError) as exc_info:
            self._setUp_client_with_mock([])
        assert str(exc_info.value) == \
            "Error parsing credentials file '/not/a/real/path.json': " \
            "Some exception buried in oauth2client"
            "Error loading credentials file '/not/a/real/path.json': " \
            "Some exception buried in google.auth"

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    def test_client_missing_project_id(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), "")
        with pytest.raises(errors.PluginError) as exc_info:
            self._setUp_client_with_mock([])
        assert str(exc_info.value) == \
            "The Google Cloud project could not be automatically determined. " \
            "Please configure it using --dns-google-project <project>."

    @mock.patch('googleapiclient.discovery.build')
    @mock.patch('google.auth.default')
    def test_client_with_project_id(self, credential_mock, unused_discovery_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
        from certbot_dns_google._internal.dns_google import _GoogleClient
        client = _GoogleClient(None, "test-project-2")
        assert credential_mock.called
        assert client.project_id == "test-project-2"

    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    @mock.patch('certbot_dns_google._internal.dns_google._GoogleClient.get_project_id')
    def test_add_txt_record(self, get_project_id_mock, credential_mock):
        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
        credential_mock.assert_called_once_with('/not/a/real/path.json', mock.ANY)
        assert not get_project_id_mock.called
    def test_add_txt_record(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        credential_mock.assert_called_once_with('/not/a/real/path.json', scopes=SCOPES)

        client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)

@@ -153,11 +206,13 @@ class GoogleClientTest(unittest.TestCase):
                                          managedZone=self.zone,
                                          project=PROJECT_ID)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_and_poll(self, unused_credential_mock):
        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
    def test_add_txt_record_and_poll(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        changes.create.return_value.execute.return_value = {'status': 'pending', 'id': self.change}
        changes.get.return_value.execute.return_value = {'status': 'done'}

@@ -171,12 +226,34 @@ class GoogleClientTest(unittest.TestCase):
                                      managedZone=self.zone,
                                      project=PROJECT_ID)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_delete_old(self, unused_credential_mock):
    def test_add_txt_record_and_poll_split_horizon(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': '{zone}-private'.format(zone=self.zone), 'dnsName': DOMAIN, 'visibility': 'private'},{'id': '{zone}-public'.format(zone=self.zone), 'dnsName': DOMAIN, 'visibility': self.visibility}]}])
        changes.create.return_value.execute.return_value = {'status': 'pending', 'id': self.change}
        changes.get.return_value.execute.return_value = {'status': 'done'}

        client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)

        changes.create.assert_called_with(body=mock.ANY,
                                          managedZone='{zone}-public'.format(zone=self.zone),
                                          project=PROJECT_ID)

        changes.get.assert_called_with(changeId=self.change,
                                       managedZone='{zone}-public'.format(zone=self.zone),
                                       project=PROJECT_ID)

    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_delete_old(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock(
            [{'managedZones': [{'id': self.zone}]}])
            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        # pylint: disable=line-too-long
        mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
        with mock.patch(mock_get_rrs) as mock_rrs:
@@ -187,12 +264,14 @@ class GoogleClientTest(unittest.TestCase):
            assert "sample-txt-contents" in deletions["rrdatas"]
            assert self.record_ttl == deletions["ttl"]

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_delete_old_ttl_case(self, unused_credential_mock):
    def test_add_txt_record_delete_old_ttl_case(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock(
            [{'managedZones': [{'id': self.zone}]}])
            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        # pylint: disable=line-too-long
        mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
        with mock.patch(mock_get_rrs) as mock_rrs:
@@ -204,50 +283,60 @@ class GoogleClientTest(unittest.TestCase):
            assert "sample-txt-contents" in deletions["rrdatas"]
            assert custom_ttl == deletions["ttl"] #otherwise HTTP 412

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_noop(self, unused_credential_mock):
    def test_add_txt_record_noop(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock(
            [{'managedZones': [{'id': self.zone}]}])
            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        client.add_txt_record(DOMAIN, "_acme-challenge.example.org",
                              "example-txt-contents", self.record_ttl)
        assert changes.create.called is False

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_error_during_zone_lookup(self, unused_credential_mock):
    def test_add_txt_record_error_during_zone_lookup(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, unused_changes = self._setUp_client_with_mock(API_ERROR)

        with pytest.raises(errors.PluginError):
            client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_zone_not_found(self, unused_credential_mock):
    def test_add_txt_record_zone_not_found(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, unused_changes = self._setUp_client_with_mock([{'managedZones': []},
                                                               {'managedZones': []}])

        with pytest.raises(errors.PluginError):
            client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_add_txt_record_error_during_add(self, unused_credential_mock):
        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
    def test_add_txt_record_error_during_add(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        changes.create.side_effect = API_ERROR

        with pytest.raises(errors.PluginError):
            client.add_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)

    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
    @mock.patch('google.auth.load_credentials_from_file')
    @mock.patch('certbot_dns_google._internal.dns_google.open',
                mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
    def test_del_txt_record_multi_rrdatas(self, unused_credential_mock):
        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
    def test_del_txt_record_multi_rrdatas(self, credential_mock):
        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)

        client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
        # pylint: disable=line-too-long
        mock_get_rrs = "certbot_dns_google._internal.dns_google._GoogleClient.get_existing_txt_rrset"
        with mock.patch(mock_get_rrs) as mock_rrs:
@@ -311,106 +402,84 @@ class GoogleClientTest(unittest.TestCase):
|
||||
managedZone=self.zone,
|
||||
project=PROJECT_ID)
|
||||
|
||||
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
|
||||
@mock.patch('google.auth.load_credentials_from_file')
|
||||
@mock.patch('certbot_dns_google._internal.dns_google.open',
|
||||
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
|
||||
def test_del_txt_record_error_during_zone_lookup(self, unused_credential_mock):
|
||||
def test_del_txt_record_error_during_zone_lookup(self, credential_mock):
|
||||
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
|
||||
|
||||
client, changes = self._setUp_client_with_mock(API_ERROR)
|
||||
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
|
||||
changes.create.assert_not_called()
|
||||
|
||||
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
|
||||
@mock.patch('google.auth.load_credentials_from_file')
|
||||
@mock.patch('certbot_dns_google._internal.dns_google.open',
|
||||
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
|
||||
def test_del_txt_record_zone_not_found(self, unused_credential_mock):
|
||||
def test_del_txt_record_zone_not_found(self, credential_mock):
|
||||
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
|
||||
|
||||
client, changes = self._setUp_client_with_mock([{'managedZones': []},
|
||||
{'managedZones': []}])
|
||||
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
|
||||
changes.create.assert_not_called()
|
||||
|
||||
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
|
||||
@mock.patch('google.auth.load_credentials_from_file')
|
||||
@mock.patch('certbot_dns_google._internal.dns_google.open',
|
||||
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
|
||||
def test_del_txt_record_error_during_delete(self, unused_credential_mock):
|
||||
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone}]}])
|
||||
def test_del_txt_record_error_during_delete(self, credential_mock):
|
||||
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
|
||||
|
||||
client, changes = self._setUp_client_with_mock([{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
|
||||
changes.create.side_effect = API_ERROR
|
||||
|
||||
client.del_txt_record(DOMAIN, self.record_name, self.record_content, self.record_ttl)
|
||||
|
||||
@mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
|
||||
@mock.patch('google.auth.load_credentials_from_file')
|
||||
@mock.patch('certbot_dns_google._internal.dns_google.open',
|
||||
mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
|
||||
def test_get_existing_found(self, unused_credential_mock):
|
||||
def test_get_existing_found(self, credential_mock):
|
||||
credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
|
||||
|
||||
client, unused_changes = self._setUp_client_with_mock(
|
||||
[{'managedZones': [{'id': self.zone}]}])
|
||||
[{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
|
||||
# Record name mocked in setUp
|
||||
found = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
|
||||
assert found["rrdatas"] == ["\"example-txt-contents\""]
|
||||
assert found["ttl"] == 60
|
||||
|
||||
-    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
+    @mock.patch('google.auth.load_credentials_from_file')
     @mock.patch('certbot_dns_google._internal.dns_google.open',
                 mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
-    def test_get_existing_not_found(self, unused_credential_mock):
+    def test_get_existing_not_found(self, credential_mock):
+        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
+
         client, unused_changes = self._setUp_client_with_mock(
-            [{'managedZones': [{'id': self.zone}]}])
+            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}])
         not_found = client.get_existing_txt_rrset(self.zone, "nonexistent.tld")
         assert not_found is None

-    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
+    @mock.patch('google.auth.load_credentials_from_file')
     @mock.patch('certbot_dns_google._internal.dns_google.open',
                 mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
-    def test_get_existing_with_error(self, unused_credential_mock):
+    def test_get_existing_with_error(self, credential_mock):
+        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
+
         client, unused_changes = self._setUp_client_with_mock(
-            [{'managedZones': [{'id': self.zone}]}], API_ERROR)
+            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}], API_ERROR)
         # Record name mocked in setUp
         found = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
         assert found is None

-    @mock.patch('oauth2client.service_account.ServiceAccountCredentials.from_json_keyfile_name')
+    @mock.patch('google.auth.load_credentials_from_file')
     @mock.patch('certbot_dns_google._internal.dns_google.open',
                 mock.mock_open(read_data='{"project_id": "' + PROJECT_ID + '"}'), create=True)
-    def test_get_existing_fallback(self, unused_credential_mock):
+    def test_get_existing_fallback(self, credential_mock):
+        credential_mock.return_value = (mock.MagicMock(), PROJECT_ID)
+
         client, unused_changes = self._setUp_client_with_mock(
-            [{'managedZones': [{'id': self.zone}]}], API_ERROR)
+            [{'managedZones': [{'id': self.zone, 'visibility': self.visibility}]}], API_ERROR)
         rrset = client.get_existing_txt_rrset(self.zone, "_acme-challenge.example.org")
         assert not rrset

     def test_get_project_id(self):
         from certbot_dns_google._internal.dns_google import _GoogleClient

         response = DummyResponse()
         response.status = 200

         with mock.patch('httplib2.Http.request', return_value=(response, 'test-test-1')):
             project_id = _GoogleClient.get_project_id()
             assert project_id == 'test-test-1'

         with mock.patch('httplib2.Http.request', return_value=(response, b'test-test-1')):
             project_id = _GoogleClient.get_project_id()
             assert project_id == 'test-test-1'

         failed_response = DummyResponse()
         failed_response.status = 404

         with mock.patch('httplib2.Http.request',
                         return_value=(failed_response, "some detailed http error response")):
             with pytest.raises(ValueError):
                 _GoogleClient.get_project_id()

         with mock.patch('httplib2.Http.request', side_effect=ServerNotFoundError):
             with pytest.raises(ServerNotFoundError):
                 _GoogleClient.get_project_id()


 class DummyResponse:
     """
     Dummy object to create a fake HTTPResponse (the actual one requires a socket and we only
     need the status attribute)
     """
     def __init__(self):
         self.status = 200


 if __name__ == "__main__":
     sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover
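The `get_project_id` tests above patch `httplib2.Http.request` to return the metadata body first as `str` and then as `bytes`, expecting the same project id either way. A minimal standalone sketch of the normalization this implies (hypothetical helper for illustration, not the plugin's actual code):

```python
def normalize_body(body):
    """Return a metadata response body as str, decoding bytes if needed."""
    if isinstance(body, bytes):
        return body.decode('utf-8')
    return body

# Both forms the tests feed in normalize to the same project id:
assert normalize_body('test-test-1') == 'test-test-1'
assert normalize_body(b'test-test-1') == 'test-test-1'
```

httplib2 returns response bodies as `bytes`, so any code comparing them against `str` literals needs a decode step like this somewhere.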
@@ -389,6 +389,11 @@
       "items": {
         "type": "string"
       }
     },
+    "visibility": {
+      "type": "string",
+      "description": "The zone's visibility: public zones are exposed to the Internet, while private zones are visible only to Virtual Private Cloud resources.",
+      "default": "public"
+    }
   }
 },

@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']

 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #

-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'

 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the

@@ -4,17 +4,17 @@ import sys
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
-    'google-api-python-client>=1.5.5',
-    'oauth2client>=4.0',
+    'google-api-python-client>=1.6.5',
+    'google-auth>=2.16.0',
     'setuptools>=41.6.0',
+    # already a dependency of google-api-python-client, but added for consistency
+    'httplib2'
 ]

-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
         # version for simplicity. See
@@ -22,11 +22,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')

 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -45,7 +40,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -54,11 +49,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
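The reworked `SNAP_BUILD` branch above reads as: snap builds get only `packaging` added, while normal builds pin `acme` and `certbot` to the plugin's own version. A standalone sketch of that selection logic (function name and argument shape are illustrative, not the setup.py API):

```python
def build_requires(version, env):
    """Mirror the if/else introduced in the diff: snap builds append
    'packaging'; normal builds pin acme and certbot to the plugin version."""
    reqs = ['google-api-python-client>=1.6.5', 'google-auth>=2.16.0']
    if env.get('SNAP_BUILD'):
        reqs.append('packaging')
    else:
        reqs.extend([f'acme>={version}', f'certbot>={version}'])
    return reqs

assert 'packaging' in build_requires('2.12.0.dev0', {'SNAP_BUILD': '1'})
assert 'certbot>=2.12.0.dev0' in build_requires('2.12.0.dev0', {})
```

Consolidating the two separate `SNAP_BUILD` checks of the old code into one if/else also removes the window where both branches could run.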
certbot-dns-linode/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: certbot-dns-linode/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: certbot-dns-linode/readthedocs.org.requirements.txt
@@ -3,16 +3,12 @@ import logging
 import re
 from typing import Any
 from typing import Callable
 from typing import cast
 from typing import Optional
 from typing import Union

-from lexicon.providers import linode
-from lexicon.providers import linode4
-
 from certbot import errors
 from certbot.plugins import dns_common
 from certbot.plugins import dns_common_lexicon
-from certbot.plugins.dns_common import CredentialsConfiguration

 logger = logging.getLogger(__name__)
@@ -20,7 +16,7 @@ API_KEY_URL = 'https://manager.linode.com/profile/api'
 API_KEY_URL_V4 = 'https://cloud.linode.com/profile/tokens'


-class Authenticator(dns_common.DNSAuthenticator):
+class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
     """DNS Authenticator for Linode

     This Authenticator uses the Linode API to fulfill a dns-01 challenge.
@@ -30,7 +26,10 @@ class Authenticator(dns_common.DNSAuthenticator):

     def __init__(self, *args: Any, **kwargs: Any) -> None:
         super().__init__(*args, **kwargs)
-        self.credentials: Optional[CredentialsConfiguration] = None
+        self._add_provider_option('key',
+                                  'API key for Linode account, '
+                                  f'obtained from {API_KEY_URL} or {API_KEY_URL_V4}',
+                                  'auth_token')

     @classmethod
     def add_parser_arguments(cls, add: Callable[..., None],
@@ -42,29 +41,13 @@ class Authenticator(dns_common.DNSAuthenticator):
         return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
                'the Linode API.'

-    def _setup_credentials(self) -> None:
-        self.credentials = self._configure_credentials(
-            'credentials',
-            'Linode credentials INI file',
-            {
-                'key': 'API key for Linode account, obtained from {0} or {1}'
-                       .format(API_KEY_URL, API_KEY_URL_V4)
-            }
-        )
+    @property
+    def _provider_name(self) -> str:
+        if not hasattr(self, '_credentials'):  # pragma: no cover
+            self._setup_credentials()

-    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_linode_client().add_txt_record(domain, validation_name, validation)
-
-    def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_linode_client().del_txt_record(domain, validation_name, validation)
-
-    def _get_linode_client(self) -> '_LinodeLexiconClient':
-        if not self.credentials:  # pragma: no cover
-            raise errors.Error("Plugin has not been prepared.")
-        api_key = self.credentials.conf('key')
-        api_version: Optional[Union[str, int]] = self.credentials.conf('version')
-        if api_version == '':
-            api_version = None
+        api_key = cast(str, self._credentials.conf('key'))
+        api_version: Optional[Union[str, int]] = self._credentials.conf('version')

         if not api_version:
             api_version = 3
@@ -77,34 +60,19 @@ class Authenticator(dns_common.DNSAuthenticator):
         else:
             api_version = int(api_version)

-        return _LinodeLexiconClient(api_key, api_version)
-
-
-class _LinodeLexiconClient(dns_common_lexicon.LexiconClient):
-    """
-    Encapsulates all communication with the Linode API.
-    """
-
-    def __init__(self, api_key: str, api_version: int) -> None:
-        super().__init__()
-
-        self.api_version = api_version
-
-        if api_version == 3:
-            config = dns_common_lexicon.build_lexicon_config('linode', {}, {
-                'auth_token': api_key,
-            })
-
-            self.provider = linode.Provider(config)
-        elif api_version == 4:
-            config = dns_common_lexicon.build_lexicon_config('linode4', {}, {
-                'auth_token': api_key,
-            })
-
-            self.provider = linode4.Provider(config)
-        else:
-            raise errors.PluginError('Invalid api version specified: {0}. (Supported: 3, 4)'
-                                     .format(api_version))
+        if api_version == 3:
+            return 'linode'
+        elif api_version == 4:
+            return 'linode4'
+        else:
+            raise errors.PluginError(f'Invalid api version specified: {api_version}. (Supported: 3, 4)')

+    def _setup_credentials(self) -> None:
+        self._credentials = self._configure_credentials(
+            key='credentials',
+            label='Credentials INI file for linode DNS authenticator',
+            required_variables={item[0]: item[1] for item in self._provider_options},
+        )

     def _handle_general_error(self, e: Exception, domain_name: str) -> Optional[errors.PluginError]:
         if not str(e).startswith('Domain not found'):
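The refactor above replaces a hand-rolled `_LinodeLexiconClient` with a `_provider_name` property that maps the configured API version to a Lexicon provider name. A standalone sketch of just that mapping (hypothetical function for illustration, not the plugin class itself):

```python
def provider_name(api_version):
    """Map the Linode API version to the Lexicon provider name,
    as the new _provider_name property does."""
    if api_version == 3:
        return 'linode'
    if api_version == 4:
        return 'linode4'
    raise ValueError(f'Invalid api version specified: {api_version}. (Supported: 3, 4)')

assert provider_name(3) == 'linode'
assert provider_name(4) == 'linode4'
```

Pushing provider selection into a name lookup lets the shared `LexiconDNSAuthenticator` base class own the provider construction, TXT-record plumbing, and error handling.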
@@ -19,7 +19,9 @@ TOKEN_V4 = '0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef'


 class AuthenticatorTest(test_util.TempDirTestCase,
-                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
+                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
+
+    DOMAIN_NOT_FOUND = Exception('Domain not found')

     def setUp(self):
         super().setUp()
@@ -32,10 +34,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,

         self.auth = Authenticator(self.config, "linode")

-        self.mock_client = mock.MagicMock()
-        # _get_linode_client | pylint: disable=protected-access
-        self.auth._get_linode_client = mock.MagicMock(return_value=self.mock_client)
-
     # pylint: disable=protected-access
     def test_api_version_3_detection(self):
         path = os.path.join(self.tempdir, 'file_3_auto.ini')
@@ -44,9 +42,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 3 == client.api_version
+
+        assert auth._provider_name == "linode"

     # pylint: disable=protected-access
     def test_api_version_4_detection(self):
@@ -56,9 +53,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 4 == client.api_version
+
+        assert auth._provider_name == "linode4"

     # pylint: disable=protected-access
     def test_api_version_3_detection_empty_version(self):
@@ -68,9 +64,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 3 == client.api_version
+
+        assert auth._provider_name == "linode"

     # pylint: disable=protected-access
     def test_api_version_4_detection_empty_version(self):
@@ -80,9 +75,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 4 == client.api_version
+
+        assert auth._provider_name == "linode4"

     # pylint: disable=protected-access
     def test_api_version_3_manual(self):
@@ -92,9 +86,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 3 == client.api_version
+
+        assert auth._provider_name == "linode"

     # pylint: disable=protected-access
     def test_api_version_4_manual(self):
@@ -104,9 +97,8 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()
-        client = auth._get_linode_client()
-        assert 4 == client.api_version
+
+        assert auth._provider_name == "linode4"

     # pylint: disable=protected-access
     def test_api_version_error(self):
@@ -116,35 +108,9 @@ class AuthenticatorTest(test_util.TempDirTestCase,
         config = mock.MagicMock(linode_credentials=path,
                                 linode_propagation_seconds=0)
         auth = Authenticator(config, "linode")
-        auth._setup_credentials()

         with pytest.raises(errors.PluginError):
-            auth._get_linode_client()
-
-
-class LinodeLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
-
-    DOMAIN_NOT_FOUND = Exception('Domain not found')
-
-    def setUp(self):
-        from certbot_dns_linode._internal.dns_linode import _LinodeLexiconClient
-
-        self.client = _LinodeLexiconClient(TOKEN, 3)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
-
-
-class Linode4LexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
-
-    DOMAIN_NOT_FOUND = Exception('Domain not found')
-
-    def setUp(self):
-        from certbot_dns_linode._internal.dns_linode import _LinodeLexiconClient
-
-        self.client = _LinodeLexiconClient(TOKEN, 4)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
+            auth._provider_name


 if __name__ == "__main__":
@@ -16,7 +16,7 @@
 # add these directories to sys.path here. If the directory is relative to the
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #
-import os
+# import os

 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']

 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #

-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'

 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the

@@ -4,14 +4,16 @@ import sys
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
-    'dns-lexicon>=3.2.1',
+    'dns-lexicon>=3.14.1',
     'setuptools>=41.6.0',
 ]

-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
         # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')

 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
certbot-dns-luadns/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: certbot-dns-luadns/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: certbot-dns-luadns/readthedocs.org.requirements.txt
@@ -2,33 +2,33 @@
 import logging
 from typing import Any
 from typing import Callable
-from typing import Optional

-from lexicon.providers import luadns
 from requests import HTTPError

 from certbot import errors
-from certbot.plugins import dns_common
 from certbot.plugins import dns_common_lexicon
-from certbot.plugins.dns_common import CredentialsConfiguration

 logger = logging.getLogger(__name__)

 ACCOUNT_URL = 'https://api.luadns.com/settings'


-class Authenticator(dns_common.DNSAuthenticator):
+class Authenticator(dns_common_lexicon.LexiconDNSAuthenticator):
     """DNS Authenticator for LuaDNS

     This Authenticator uses the LuaDNS API to fulfill a dns-01 challenge.
     """

     description = 'Obtain certificates using a DNS TXT record (if you are using LuaDNS for DNS).'
     ttl = 60

     def __init__(self, *args: Any, **kwargs: Any) -> None:
         super().__init__(*args, **kwargs)
-        self.credentials: Optional[CredentialsConfiguration] = None
+        self._add_provider_option('email',
+                                  'email address associated with LuaDNS account',
+                                  'auth_username')
+        self._add_provider_option('token',
+                                  f'API token for LuaDNS account, obtained from {ACCOUNT_URL}',
+                                  'auth_token')

     @classmethod
     def add_parser_arguments(cls, add: Callable[..., None],
@@ -40,46 +40,9 @@ class Authenticator(dns_common.DNSAuthenticator):
         return 'This plugin configures a DNS TXT record to respond to a dns-01 challenge using ' + \
                'the LuaDNS API.'

-    def _setup_credentials(self) -> None:
-        self.credentials = self._configure_credentials(
-            'credentials',
-            'LuaDNS credentials INI file',
-            {
-                'email': 'email address associated with LuaDNS account',
-                'token': 'API token for LuaDNS account, obtained from {0}'.format(ACCOUNT_URL)
-            }
-        )
-
-    def _perform(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_luadns_client().add_txt_record(domain, validation_name, validation)
-
-    def _cleanup(self, domain: str, validation_name: str, validation: str) -> None:
-        self._get_luadns_client().del_txt_record(domain, validation_name, validation)
-
-    def _get_luadns_client(self) -> "_LuaDNSLexiconClient":
-        if not self.credentials:  # pragma: no cover
-            raise errors.Error("Plugin has not been prepared.")
-        return _LuaDNSLexiconClient(self.credentials.conf('email'),
-                                    self.credentials.conf('token'),
-                                    self.ttl)
-
-
-class _LuaDNSLexiconClient(dns_common_lexicon.LexiconClient):
-    """
-    Encapsulates all communication with the LuaDNS via Lexicon.
-    """
-
-    def __init__(self, email: str, token: str, ttl: int) -> None:
-        super().__init__()
-
-        config = dns_common_lexicon.build_lexicon_config('luadns', {
-            'ttl': ttl,
-        }, {
-            'auth_username': email,
-            'auth_token': token,
-        })
-
-        self.provider = luadns.Provider(config)
+    @property
+    def _provider_name(self) -> str:
+        return 'luadns'

     def _handle_http_error(self, e: HTTPError, domain_name: str) -> errors.PluginError:
         hint = None

@@ -1,10 +1,9 @@
 """Tests for certbot_dns_luadns._internal.dns_luadns."""

 import sys
-import unittest
 from unittest import mock

 import pytest
+from requests import Response
 from requests.exceptions import HTTPError

 from certbot.compat import os
@@ -17,7 +16,9 @@ TOKEN = 'foo'


 class AuthenticatorTest(test_util.TempDirTestCase,
-                        dns_test_common_lexicon.BaseLexiconAuthenticatorTest):
+                        dns_test_common_lexicon.BaseLexiconDNSAuthenticatorTest):
+
+    LOGIN_ERROR = HTTPError("401 Client Error: Unauthorized for url: ...", response=Response())

     def setUp(self):
         super().setUp()
@@ -32,23 +33,6 @@ class AuthenticatorTest(test_util.TempDirTestCase,

         self.auth = Authenticator(self.config, "luadns")

-        self.mock_client = mock.MagicMock()
-        # _get_luadns_client | pylint: disable=protected-access
-        self.auth._get_luadns_client = mock.MagicMock(return_value=self.mock_client)
-
-
-class LuaDNSLexiconClientTest(unittest.TestCase, dns_test_common_lexicon.BaseLexiconClientTest):
-
-    LOGIN_ERROR = HTTPError("401 Client Error: Unauthorized for url: ...")
-
-    def setUp(self):
-        from certbot_dns_luadns._internal.dns_luadns import _LuaDNSLexiconClient
-
-        self.client = _LuaDNSLexiconClient(EMAIL, TOKEN, 0)
-
-        self.provider_mock = mock.MagicMock()
-        self.client.provider = self.provider_mock
-

 if __name__ == "__main__":
     sys.exit(pytest.main(sys.argv[1:] + [__file__]))  # pragma: no cover

@@ -16,7 +16,7 @@
 # add these directories to sys.path here. If the directory is relative to the
 # documentation root, use os.path.abspath to make it absolute, like shown here.
 #
-import os
+# import os

 # import sys
 # sys.path.insert(0, os.path.abspath('.'))
@@ -35,7 +35,8 @@ extensions = ['sphinx.ext.autodoc',
               'sphinx.ext.intersphinx',
               'sphinx.ext.todo',
               'sphinx.ext.coverage',
-              'sphinx.ext.viewcode']
+              'sphinx.ext.viewcode',
+              'sphinx_rtd_theme']

 autodoc_member_order = 'bysource'
 autodoc_default_flags = ['show-inheritance']
@@ -93,14 +94,7 @@ todo_include_todos = False
 # a list of builtin themes.
 #

-# https://docs.readthedocs.io/en/stable/faq.html#i-want-to-use-the-read-the-docs-theme-locally
-# on_rtd is whether we are on readthedocs.org
-on_rtd = os.environ.get('READTHEDOCS', None) == 'True'
-if not on_rtd:  # only import and set the theme if we're building docs locally
-    import sphinx_rtd_theme
-    html_theme = 'sphinx_rtd_theme'
-    html_theme_path = [sphinx_rtd_theme.get_html_theme_path()]
-# otherwise, readthedocs.org uses their theme by default, so no need to specify it
+html_theme = 'sphinx_rtd_theme'

 # Theme options are theme-specific and customize the look and feel of a theme
 # further. For a list of options available for each theme, see the

@@ -4,14 +4,16 @@ import sys
 from setuptools import find_packages
 from setuptools import setup

-version = '2.6.0.dev0'
+version = '2.12.0.dev0'

 install_requires = [
-    'dns-lexicon>=3.2.1',
+    'dns-lexicon>=3.14.1',
     'setuptools>=41.6.0',
 ]

-if not os.environ.get('SNAP_BUILD'):
+if os.environ.get('SNAP_BUILD'):
+    install_requires.append('packaging')
+else:
     install_requires.extend([
         # We specify the minimum acme and certbot version as the current plugin
         # version for simplicity. See
@@ -19,11 +21,6 @@ if not os.environ.get('SNAP_BUILD'):
         f'acme>={version}',
         f'certbot>={version}',
     ])
-elif 'bdist_wheel' in sys.argv[1:]:
-    raise RuntimeError('Unset SNAP_BUILD when building wheels '
-                       'to include certbot dependencies.')
-if os.environ.get('SNAP_BUILD'):
-    install_requires.append('packaging')

 docs_extras = [
     'Sphinx>=1.0',  # autodoc_member_order = 'bysource', autodoc_default_flags
@@ -42,7 +39,7 @@ setup(
     author="Certbot Project",
     author_email='certbot-dev@eff.org',
     license='Apache License 2.0',
-    python_requires='>=3.7',
+    python_requires='>=3.8',
     classifiers=[
         'Development Status :: 5 - Production/Stable',
         'Environment :: Plugins',
@@ -51,11 +48,11 @@ setup(
         'Operating System :: POSIX :: Linux',
         'Programming Language :: Python',
         'Programming Language :: Python :: 3',
-        'Programming Language :: Python :: 3.7',
         'Programming Language :: Python :: 3.8',
         'Programming Language :: Python :: 3.9',
         'Programming Language :: Python :: 3.10',
         'Programming Language :: Python :: 3.11',
+        'Programming Language :: Python :: 3.12',
         'Topic :: Internet :: WWW/HTTP',
         'Topic :: Security',
         'Topic :: System :: Installation/Setup',
certbot-dns-nsone/.readthedocs.yaml (new file, 33 lines)
@@ -0,0 +1,33 @@
+# Read the Docs configuration file for Sphinx projects
+# See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
+
+# Required
+version: 2
+
+# Set the OS, Python version and other tools you might need
+build:
+  os: ubuntu-22.04
+  tools:
+    python: "3.11"
+    # You can also specify other tool versions:
+
+# Build documentation in the "docs/" directory with Sphinx
+sphinx:
+  configuration: certbot-dns-nsone/docs/conf.py
+  # You can configure Sphinx to use a different builder, for instance use the dirhtml builder for simpler URLs
+  # builder: "dirhtml"
+  # Fail on all warnings to avoid broken references
+  fail_on_warning: true
+
+# Optionally build your docs in additional formats such as PDF and ePub
+formats:
+  - pdf
+  - epub
+
+# Optional but recommended, declare the Python requirements required
+# to build your documentation
+# See https://docs.readthedocs.io/en/stable/guides/reproducible-builds.html
+python:
+  install:
+    - requirements: certbot-dns-nsone/readthedocs.org.requirements.txt
Some files were not shown because too many files have changed in this diff.