Running with gitlab-runner 15.7.0 (259d2fd4)
  on traces00.prd.aws.subcom.link uHrvfFsB
Preparing the "docker" executor
Using Docker executor with image python:3.8-slim-buster ...
Pulling docker image python:3.8-slim-buster ...
Using docker image sha256:48d520b650a800fe7fd19f1160e51a1b49fac63bc1db79fd39a0b6725d6c1c92 for python:3.8-slim-buster with digest python@sha256:f51af6b4116b1c3a5a7934070b0761d1dcb82592ee7e8ecee8ab93d3e2a3cfe2 ...
Preparing environment
Running on runner-uhrvffsb-project-138-concurrent-0 via ip-172-26-3-123...
Getting source from Git repository
Fetching changes with git depth set to 20...
Reinitialized existing Git repository in /builds/bitia/bitia-cli/.git/
Checking out f9bfe49a as v0.2.3...
Removing .mypy_cache/
Removing .pytest_cache/
Removing bitia/__pycache__/
Removing dist/
Removing poetry.lock
Removing tests/__pycache__/
Skipping Git submodules setup
Executing "step_script" stage of the job script
Using docker image sha256:48d520b650a800fe7fd19f1160e51a1b49fac63bc1db79fd39a0b6725d6c1c92 for python:3.8-slim-buster with digest python@sha256:f51af6b4116b1c3a5a7934070b0761d1dcb82592ee7e8ecee8ab93d3e2a3cfe2 ...
$ apt update && apt install -y make git
WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
Get:1 http://deb.debian.org/debian buster InRelease [122 kB]
Get:2 http://deb.debian.org/debian-security buster/updates InRelease [34.8 kB]
Get:3 http://deb.debian.org/debian buster-updates InRelease [56.6 kB]
Get:4 http://deb.debian.org/debian buster/main amd64 Packages [7909 kB]
Get:5 http://deb.debian.org/debian-security buster/updates/main amd64 Packages [470 kB]
Get:6 http://deb.debian.org/debian buster-updates/main amd64 Packages [8788 B]
Fetched 8601 kB in 2s (4056 kB/s)
Reading package lists...
Building dependency tree...
Reading state information...
All packages are up to date.
WARNING: apt does not have a stable CLI interface. Use with caution in scripts.
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed: git-man krb5-locales less libbsd0 libcurl3-gnutls libedit2 liberror-perl libgdbm-compat4 libgssapi-krb5-2 libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 libldap-2.4-2 libldap-common libnghttp2-14 libpcre2-8-0 libperl5.28 libpsl5 librtmp1 libsasl2-2 libsasl2-modules libsasl2-modules-db libssh2-1 libx11-6 libx11-data libxau6 libxcb1 libxdmcp6 libxext6 libxmuu1 openssh-client patch perl perl-modules-5.28 publicsuffix xauth Suggested packages: gettext-base git-daemon-run | git-daemon-sysvinit git-doc git-el git-email git-gui gitk gitweb git-cvs git-mediawiki git-svn krb5-doc krb5-user sensible-utils libsasl2-modules-gssapi-mit | libsasl2-modules-gssapi-heimdal libsasl2-modules-ldap libsasl2-modules-otp libsasl2-modules-sql make-doc keychain libpam-ssh monkeysphere ssh-askpass ed diffutils-doc perl-doc libterm-readline-gnu-perl | libterm-readline-perl-perl libb-debug-perl liblocale-codes-perl The following NEW packages will be installed: git git-man krb5-locales less libbsd0 libcurl3-gnutls libedit2 liberror-perl libgdbm-compat4 libgssapi-krb5-2 libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 libldap-2.4-2 libldap-common libnghttp2-14 libpcre2-8-0 libperl5.28 libpsl5 librtmp1 libsasl2-2 libsasl2-modules libsasl2-modules-db libssh2-1 libx11-6 libx11-data libxau6 libxcb1 libxdmcp6 libxext6 libxmuu1 make openssh-client patch perl perl-modules-5.28 publicsuffix xauth 0 upgraded, 39 newly installed, 0 to remove and 0 not upgraded. Need to get 19.7 MB of archives. After this operation, 101 MB of additional disk space will be used. Get:1 http://deb.debian.org/debian buster/main amd64 perl-modules-5.28 all 5.28.1-6+deb10u1 [2873 kB] Get:2 http://deb.debian.org/debian buster/main amd64 libgdbm-compat4 amd64 1.18.1-4 [44.1 kB] Get:3 http://deb.debian.org/debian buster/main amd64 libperl5.28 amd64 5.28.1-6+deb10u1 [3894 kB] Get:4 http://deb.debian.org/debian buster/main amd64 perl amd64 5.28.1-6+deb10u1 [204 kB] Get:5 http://deb.debian.org/debian buster/main amd64 less amd64 487-0.1+b1 [129 kB] Get:6 http://deb.debian.org/debian-security buster/updates/main amd64 krb5-locales all 1.17-3+deb10u5 [95.7 kB] Get:7 http://deb.debian.org/debian buster/main amd64 libkeyutils1 amd64 1.6-6 [15.0 kB] Get:8 http://deb.debian.org/debian-security buster/updates/main amd64 libkrb5support0 amd64 1.17-3+deb10u5 [66.0 kB] Get:9 http://deb.debian.org/debian-security buster/updates/main amd64 libk5crypto3 amd64 1.17-3+deb10u5 [122 kB] Get:10 http://deb.debian.org/debian-security buster/updates/main amd64 libkrb5-3 amd64 1.17-3+deb10u5 [369 kB] Get:11 http://deb.debian.org/debian-security buster/updates/main amd64 libgssapi-krb5-2 amd64 1.17-3+deb10u5 [159 kB] Get:12 http://deb.debian.org/debian buster/main amd64 libbsd0 amd64 0.9.1-2+deb10u1 [99.5 kB] Get:13 http://deb.debian.org/debian buster/main amd64 libedit2 amd64 3.1-20181209-1 [94.0 kB] Get:14 http://deb.debian.org/debian buster/main amd64 openssh-client amd64 1:7.9p1-10+deb10u2 [782 kB] Get:15 http://deb.debian.org/debian buster/main amd64 libsasl2-modules-db amd64 2.1.27+dfsg-1+deb10u2 [69.2 kB] Get:16 http://deb.debian.org/debian buster/main amd64 libsasl2-2 amd64 2.1.27+dfsg-1+deb10u2 [106 kB] Get:17 http://deb.debian.org/debian buster/main amd64 libldap-common all 2.4.47+dfsg-3+deb10u7 [90.1 kB] Get:18 http://deb.debian.org/debian buster/main amd64 libldap-2.4-2 amd64 2.4.47+dfsg-3+deb10u7 [224 kB] Get:19 http://deb.debian.org/debian buster/main amd64 libnghttp2-14 amd64 1.36.0-2+deb10u1 [85.0 kB] 
Get:20 http://deb.debian.org/debian buster/main amd64 libpsl5 amd64 0.20.2-2 [53.7 kB] Get:21 http://deb.debian.org/debian buster/main amd64 librtmp1 amd64 2.4+20151223.gitfa8646d.1-2 [60.5 kB] Get:22 http://deb.debian.org/debian buster/main amd64 libssh2-1 amd64 1.8.0-2.1 [140 kB] Get:23 http://deb.debian.org/debian-security buster/updates/main amd64 libcurl3-gnutls amd64 7.64.0-4+deb10u5 [331 kB] Get:24 http://deb.debian.org/debian buster/main amd64 libpcre2-8-0 amd64 10.32-5 [213 kB] Get:25 http://deb.debian.org/debian buster/main amd64 liberror-perl all 0.17027-2 [30.9 kB] Get:26 http://deb.debian.org/debian-security buster/updates/main amd64 git-man all 1:2.20.1-2+deb10u8 [1623 kB] Get:27 http://deb.debian.org/debian-security buster/updates/main amd64 git amd64 1:2.20.1-2+deb10u8 [5631 kB] Get:28 http://deb.debian.org/debian buster/main amd64 libsasl2-modules amd64 2.1.27+dfsg-1+deb10u2 [104 kB] Get:29 http://deb.debian.org/debian buster/main amd64 libxau6 amd64 1:1.0.8-1+b2 [19.9 kB] Get:30 http://deb.debian.org/debian buster/main amd64 libxdmcp6 amd64 1:1.1.2-3 [26.3 kB] Get:31 http://deb.debian.org/debian buster/main amd64 libxcb1 amd64 1.13.1-2 [137 kB] Get:32 http://deb.debian.org/debian buster/main amd64 libx11-data all 2:1.6.7-1+deb10u2 [299 kB] Get:33 http://deb.debian.org/debian buster/main amd64 libx11-6 amd64 2:1.6.7-1+deb10u2 [757 kB] Get:34 http://deb.debian.org/debian buster/main amd64 libxext6 amd64 2:1.3.3-1+b2 [52.5 kB] Get:35 http://deb.debian.org/debian buster/main amd64 libxmuu1 amd64 2:1.1.2-2+b3 [23.9 kB] Get:36 http://deb.debian.org/debian buster/main amd64 make amd64 4.2.1-1.2 [341 kB] Get:37 http://deb.debian.org/debian buster/main amd64 patch amd64 2.7.6-3+deb10u1 [126 kB] Get:38 http://deb.debian.org/debian buster/main amd64 publicsuffix all 20220811.1734-0+deb10u1 [127 kB] Get:39 http://deb.debian.org/debian buster/main amd64 xauth amd64 1:1.0.10-1 [40.3 kB] debconf: delaying package configuration, since apt-utils is not installed Fetched 19.7 MB in 1s (31.0 MB/s) Selecting previously unselected package perl-modules-5.28. (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 6840 files and directories currently installed.) Preparing to unpack .../00-perl-modules-5.28_5.28.1-6+deb10u1_all.deb ... Unpacking perl-modules-5.28 (5.28.1-6+deb10u1) ... Selecting previously unselected package libgdbm-compat4:amd64. Preparing to unpack .../01-libgdbm-compat4_1.18.1-4_amd64.deb ... Unpacking libgdbm-compat4:amd64 (1.18.1-4) ... Selecting previously unselected package libperl5.28:amd64. Preparing to unpack .../02-libperl5.28_5.28.1-6+deb10u1_amd64.deb ... Unpacking libperl5.28:amd64 (5.28.1-6+deb10u1) ... Selecting previously unselected package perl. Preparing to unpack .../03-perl_5.28.1-6+deb10u1_amd64.deb ... Unpacking perl (5.28.1-6+deb10u1) ... Selecting previously unselected package less. Preparing to unpack .../04-less_487-0.1+b1_amd64.deb ... Unpacking less (487-0.1+b1) ... Selecting previously unselected package krb5-locales. 
Preparing to unpack .../05-krb5-locales_1.17-3+deb10u5_all.deb ... Unpacking krb5-locales (1.17-3+deb10u5) ... Selecting previously unselected package libkeyutils1:amd64. Preparing to unpack .../06-libkeyutils1_1.6-6_amd64.deb ... Unpacking libkeyutils1:amd64 (1.6-6) ... Selecting previously unselected package libkrb5support0:amd64. Preparing to unpack .../07-libkrb5support0_1.17-3+deb10u5_amd64.deb ... Unpacking libkrb5support0:amd64 (1.17-3+deb10u5) ... Selecting previously unselected package libk5crypto3:amd64. Preparing to unpack .../08-libk5crypto3_1.17-3+deb10u5_amd64.deb ... Unpacking libk5crypto3:amd64 (1.17-3+deb10u5) ... Selecting previously unselected package libkrb5-3:amd64. Preparing to unpack .../09-libkrb5-3_1.17-3+deb10u5_amd64.deb ... Unpacking libkrb5-3:amd64 (1.17-3+deb10u5) ... Selecting previously unselected package libgssapi-krb5-2:amd64. Preparing to unpack .../10-libgssapi-krb5-2_1.17-3+deb10u5_amd64.deb ... Unpacking libgssapi-krb5-2:amd64 (1.17-3+deb10u5) ... Selecting previously unselected package libbsd0:amd64. Preparing to unpack .../11-libbsd0_0.9.1-2+deb10u1_amd64.deb ... Unpacking libbsd0:amd64 (0.9.1-2+deb10u1) ... Selecting previously unselected package libedit2:amd64. Preparing to unpack .../12-libedit2_3.1-20181209-1_amd64.deb ... Unpacking libedit2:amd64 (3.1-20181209-1) ... Selecting previously unselected package openssh-client. Preparing to unpack .../13-openssh-client_1%3a7.9p1-10+deb10u2_amd64.deb ... Unpacking openssh-client (1:7.9p1-10+deb10u2) ... Selecting previously unselected package libsasl2-modules-db:amd64. Preparing to unpack .../14-libsasl2-modules-db_2.1.27+dfsg-1+deb10u2_amd64.deb ... Unpacking libsasl2-modules-db:amd64 (2.1.27+dfsg-1+deb10u2) ... Selecting previously unselected package libsasl2-2:amd64. Preparing to unpack .../15-libsasl2-2_2.1.27+dfsg-1+deb10u2_amd64.deb ... Unpacking libsasl2-2:amd64 (2.1.27+dfsg-1+deb10u2) ... Selecting previously unselected package libldap-common. Preparing to unpack .../16-libldap-common_2.4.47+dfsg-3+deb10u7_all.deb ... Unpacking libldap-common (2.4.47+dfsg-3+deb10u7) ... Selecting previously unselected package libldap-2.4-2:amd64. Preparing to unpack .../17-libldap-2.4-2_2.4.47+dfsg-3+deb10u7_amd64.deb ... Unpacking libldap-2.4-2:amd64 (2.4.47+dfsg-3+deb10u7) ... Selecting previously unselected package libnghttp2-14:amd64. Preparing to unpack .../18-libnghttp2-14_1.36.0-2+deb10u1_amd64.deb ... Unpacking libnghttp2-14:amd64 (1.36.0-2+deb10u1) ... Selecting previously unselected package libpsl5:amd64. Preparing to unpack .../19-libpsl5_0.20.2-2_amd64.deb ... Unpacking libpsl5:amd64 (0.20.2-2) ... Selecting previously unselected package librtmp1:amd64. Preparing to unpack .../20-librtmp1_2.4+20151223.gitfa8646d.1-2_amd64.deb ... Unpacking librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2) ... Selecting previously unselected package libssh2-1:amd64. Preparing to unpack .../21-libssh2-1_1.8.0-2.1_amd64.deb ... Unpacking libssh2-1:amd64 (1.8.0-2.1) ... Selecting previously unselected package libcurl3-gnutls:amd64. Preparing to unpack .../22-libcurl3-gnutls_7.64.0-4+deb10u5_amd64.deb ... Unpacking libcurl3-gnutls:amd64 (7.64.0-4+deb10u5) ... Selecting previously unselected package libpcre2-8-0:amd64. Preparing to unpack .../23-libpcre2-8-0_10.32-5_amd64.deb ... Unpacking libpcre2-8-0:amd64 (10.32-5) ... Selecting previously unselected package liberror-perl. Preparing to unpack .../24-liberror-perl_0.17027-2_all.deb ... Unpacking liberror-perl (0.17027-2) ... Selecting previously unselected package git-man. 
Preparing to unpack .../25-git-man_1%3a2.20.1-2+deb10u8_all.deb ... Unpacking git-man (1:2.20.1-2+deb10u8) ... Selecting previously unselected package git. Preparing to unpack .../26-git_1%3a2.20.1-2+deb10u8_amd64.deb ... Unpacking git (1:2.20.1-2+deb10u8) ... Selecting previously unselected package libsasl2-modules:amd64. Preparing to unpack .../27-libsasl2-modules_2.1.27+dfsg-1+deb10u2_amd64.deb ... Unpacking libsasl2-modules:amd64 (2.1.27+dfsg-1+deb10u2) ... Selecting previously unselected package libxau6:amd64. Preparing to unpack .../28-libxau6_1%3a1.0.8-1+b2_amd64.deb ... Unpacking libxau6:amd64 (1:1.0.8-1+b2) ... Selecting previously unselected package libxdmcp6:amd64. Preparing to unpack .../29-libxdmcp6_1%3a1.1.2-3_amd64.deb ... Unpacking libxdmcp6:amd64 (1:1.1.2-3) ... Selecting previously unselected package libxcb1:amd64. Preparing to unpack .../30-libxcb1_1.13.1-2_amd64.deb ... Unpacking libxcb1:amd64 (1.13.1-2) ... Selecting previously unselected package libx11-data. Preparing to unpack .../31-libx11-data_2%3a1.6.7-1+deb10u2_all.deb ... Unpacking libx11-data (2:1.6.7-1+deb10u2) ... Selecting previously unselected package libx11-6:amd64. Preparing to unpack .../32-libx11-6_2%3a1.6.7-1+deb10u2_amd64.deb ... Unpacking libx11-6:amd64 (2:1.6.7-1+deb10u2) ... Selecting previously unselected package libxext6:amd64. Preparing to unpack .../33-libxext6_2%3a1.3.3-1+b2_amd64.deb ... Unpacking libxext6:amd64 (2:1.3.3-1+b2) ... Selecting previously unselected package libxmuu1:amd64. Preparing to unpack .../34-libxmuu1_2%3a1.1.2-2+b3_amd64.deb ... Unpacking libxmuu1:amd64 (2:1.1.2-2+b3) ... Selecting previously unselected package make. Preparing to unpack .../35-make_4.2.1-1.2_amd64.deb ... Unpacking make (4.2.1-1.2) ... Selecting previously unselected package patch. Preparing to unpack .../36-patch_2.7.6-3+deb10u1_amd64.deb ... Unpacking patch (2.7.6-3+deb10u1) ... Selecting previously unselected package publicsuffix. Preparing to unpack .../37-publicsuffix_20220811.1734-0+deb10u1_all.deb ... Unpacking publicsuffix (20220811.1734-0+deb10u1) ... Selecting previously unselected package xauth. Preparing to unpack .../38-xauth_1%3a1.0.10-1_amd64.deb ... Unpacking xauth (1:1.0.10-1) ... Setting up perl-modules-5.28 (5.28.1-6+deb10u1) ... Setting up libxau6:amd64 (1:1.0.8-1+b2) ... Setting up libkeyutils1:amd64 (1.6-6) ... Setting up libpsl5:amd64 (0.20.2-2) ... Setting up libsasl2-modules:amd64 (2.1.27+dfsg-1+deb10u2) ... Setting up libnghttp2-14:amd64 (1.36.0-2+deb10u1) ... Setting up less (487-0.1+b1) ... debconf: unable to initialize frontend: Dialog debconf: (TERM is not set, so the dialog frontend is not usable.) debconf: falling back to frontend: Readline Setting up krb5-locales (1.17-3+deb10u5) ... Setting up libldap-common (2.4.47+dfsg-3+deb10u7) ... Setting up libkrb5support0:amd64 (1.17-3+deb10u5) ... Setting up libsasl2-modules-db:amd64 (2.1.27+dfsg-1+deb10u2) ... Setting up libx11-data (2:1.6.7-1+deb10u2) ... Setting up make (4.2.1-1.2) ... Setting up librtmp1:amd64 (2.4+20151223.gitfa8646d.1-2) ... Setting up patch (2.7.6-3+deb10u1) ... Setting up libgdbm-compat4:amd64 (1.18.1-4) ... Setting up libpcre2-8-0:amd64 (10.32-5) ... Setting up libk5crypto3:amd64 (1.17-3+deb10u5) ... Setting up libsasl2-2:amd64 (2.1.27+dfsg-1+deb10u2) ... Setting up libperl5.28:amd64 (5.28.1-6+deb10u1) ... Setting up git-man (1:2.20.1-2+deb10u8) ... Setting up libssh2-1:amd64 (1.8.0-2.1) ... Setting up libkrb5-3:amd64 (1.17-3+deb10u5) ... Setting up libbsd0:amd64 (0.9.1-2+deb10u1) ... 
Setting up publicsuffix (20220811.1734-0+deb10u1) ... Setting up libxdmcp6:amd64 (1:1.1.2-3) ... Setting up libxcb1:amd64 (1.13.1-2) ... Setting up libedit2:amd64 (3.1-20181209-1) ... Setting up libldap-2.4-2:amd64 (2.4.47+dfsg-3+deb10u7) ... Setting up perl (5.28.1-6+deb10u1) ... Setting up libgssapi-krb5-2:amd64 (1.17-3+deb10u5) ... Setting up libx11-6:amd64 (2:1.6.7-1+deb10u2) ... Setting up libxmuu1:amd64 (2:1.1.2-2+b3) ... Setting up openssh-client (1:7.9p1-10+deb10u2) ... Setting up libxext6:amd64 (2:1.3.3-1+b2) ... Setting up libcurl3-gnutls:amd64 (7.64.0-4+deb10u5) ... Setting up liberror-perl (0.17027-2) ... Setting up git (1:2.20.1-2+deb10u8) ... Setting up xauth (1:1.0.10-1) ... Processing triggers for libc-bin (2.28-10+deb10u2) ... [32;1m$ python3 -m pip install poetry --upgrade[0;m Collecting poetry Downloading poetry-1.4.0-py3-none-any.whl (221 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 221.9/221.9 KB 3.1 MB/s eta 0:00:00 Collecting keyring<24.0.0,>=23.9.0 Downloading keyring-23.13.1-py3-none-any.whl (37 kB) Collecting filelock<4.0.0,>=3.8.0 Downloading filelock-3.9.0-py3-none-any.whl (9.7 kB) Collecting requests-toolbelt<0.11.0,>=0.9.1 Downloading requests_toolbelt-0.10.1-py2.py3-none-any.whl (54 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.5/54.5 KB 6.8 MB/s eta 0:00:00 Collecting platformdirs<3.0.0,>=2.5.2 Downloading platformdirs-2.6.2-py3-none-any.whl (14 kB) Collecting poetry-plugin-export<2.0.0,>=1.3.0 Downloading poetry_plugin_export-1.3.0-py3-none-any.whl (10 kB) Collecting packaging>=20.4 Downloading packaging-23.0-py3-none-any.whl (42 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 42.7/42.7 KB 6.7 MB/s eta 0:00:00 Collecting html5lib<2.0,>=1.0 Downloading html5lib-1.1-py2.py3-none-any.whl (112 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.2/112.2 KB 11.5 MB/s eta 0:00:00 Collecting crashtest<0.5.0,>=0.4.1 Downloading crashtest-0.4.1-py3-none-any.whl (7.6 kB) Collecting virtualenv!=20.4.5,!=20.4.6,<21.0.0,>=20.4.3 Downloading virtualenv-20.20.0-py3-none-any.whl (8.7 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.7/8.7 MB 39.0 MB/s eta 0:00:00 Collecting pexpect<5.0.0,>=4.7.0 Downloading pexpect-4.8.0-py2.py3-none-any.whl (59 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 59.0/59.0 KB 9.6 MB/s eta 0:00:00 Collecting poetry-core==1.5.1 Downloading poetry_core-1.5.1-py3-none-any.whl (465 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 465.2/465.2 KB 8.0 MB/s eta 0:00:00 Collecting build<0.11.0,>=0.10.0 Downloading build-0.10.0-py3-none-any.whl (17 kB) Collecting importlib-metadata>=4.4 Downloading importlib_metadata-6.0.0-py3-none-any.whl (21 kB) Collecting tomlkit!=0.11.2,!=0.11.3,<1.0.0,>=0.11.1 Downloading tomlkit-0.11.6-py3-none-any.whl (35 kB) Collecting pyproject-hooks<2.0.0,>=1.0.0 Downloading pyproject_hooks-1.0.0-py3-none-any.whl (9.3 kB) Collecting pkginfo<2.0.0,>=1.9.4 Downloading pkginfo-1.9.6-py3-none-any.whl (30 kB) Collecting cachecontrol[filecache]<0.13.0,>=0.12.9 Downloading CacheControl-0.12.11-py2.py3-none-any.whl (21 kB) Collecting shellingham<2.0,>=1.5 Downloading shellingham-1.5.0.post1-py2.py3-none-any.whl (9.4 kB) Collecting requests<3.0,>=2.18 Downloading requests-2.28.2-py3-none-any.whl (62 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.8/62.8 KB 8.4 MB/s eta 0:00:00 Collecting urllib3<2.0.0,>=1.26.0 Downloading urllib3-1.26.14-py2.py3-none-any.whl (140 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.6/140.6 KB 15.0 MB/s eta 0:00:00 Collecting dulwich<0.22.0,>=0.21.2 Downloading 
dulwich-0.21.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (508 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 508.7/508.7 KB 39.7 MB/s eta 0:00:00 Collecting jsonschema<5.0.0,>=4.10.0 Downloading jsonschema-4.17.3-py3-none-any.whl (90 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.4/90.4 KB 12.9 MB/s eta 0:00:00 Collecting cleo<3.0.0,>=2.0.0 Downloading cleo-2.0.1-py3-none-any.whl (77 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 77.3/77.3 KB 12.2 MB/s eta 0:00:00 Collecting installer<0.7.0,>=0.6.0 Downloading installer-0.6.0-py3-none-any.whl (452 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 452.6/452.6 KB 38.0 MB/s eta 0:00:00 Collecting trove-classifiers>=2022.5.19 Downloading trove_classifiers-2023.3.9-py3-none-any.whl (13 kB) Collecting tomli<3.0.0,>=2.0.1 Downloading tomli-2.0.1-py3-none-any.whl (12 kB) Collecting lockfile<0.13.0,>=0.12.2 Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB) Collecting msgpack>=0.5.2 Downloading msgpack-1.0.5-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (322 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 322.4/322.4 KB 29.3 MB/s eta 0:00:00 Collecting rapidfuzz<3.0.0,>=2.2.0 Downloading rapidfuzz-2.13.7-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.2/2.2 MB 52.7 MB/s eta 0:00:00 Collecting webencodings Downloading webencodings-0.5.1-py2.py3-none-any.whl (11 kB) Collecting six>=1.9 Downloading six-1.16.0-py2.py3-none-any.whl (11 kB) Collecting zipp>=0.5 Downloading zipp-3.15.0-py3-none-any.whl (6.8 kB) Collecting attrs>=17.4.0 Downloading attrs-22.2.0-py3-none-any.whl (60 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.0/60.0 KB 9.2 MB/s eta 0:00:00 Collecting pkgutil-resolve-name>=1.3.10 Downloading pkgutil_resolve_name-1.3.10-py3-none-any.whl (4.7 kB) Collecting pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 Downloading pyrsistent-0.19.3-py3-none-any.whl (57 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.5/57.5 KB 8.7 MB/s eta 0:00:00 Collecting importlib-resources>=1.4.0 Downloading importlib_resources-5.12.0-py3-none-any.whl (36 kB) Collecting SecretStorage>=3.2 Downloading SecretStorage-3.3.3-py3-none-any.whl (15 kB) Collecting jeepney>=0.4.2 Downloading jeepney-0.8.0-py3-none-any.whl (48 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 48.4/48.4 KB 7.8 MB/s eta 0:00:00 Collecting jaraco.classes Downloading jaraco.classes-3.2.3-py3-none-any.whl (6.0 kB) Collecting ptyprocess>=0.5 Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB) Collecting certifi>=2017.4.17 Downloading certifi-2022.12.7-py3-none-any.whl (155 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 155.3/155.3 KB 21.3 MB/s eta 0:00:00 Collecting charset-normalizer<4,>=2 Downloading charset_normalizer-3.1.0-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (195 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 195.9/195.9 KB 23.8 MB/s eta 0:00:00 Collecting idna<4,>=2.5 Downloading idna-3.4-py3-none-any.whl (61 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 KB 9.0 MB/s eta 0:00:00 Collecting distlib<1,>=0.3.6 Downloading distlib-0.3.6-py2.py3-none-any.whl (468 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.5/468.5 KB 34.6 MB/s eta 0:00:00 Collecting cryptography>=2.0 Downloading cryptography-39.0.2-cp36-abi3-manylinux_2_28_x86_64.whl (4.2 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.2/4.2 MB 52.6 MB/s eta 0:00:00 Collecting more-itertools Downloading more_itertools-9.1.0-py3-none-any.whl (54 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 54.2/54.2 KB 8.7 MB/s eta 0:00:00 Collecting cffi>=1.12 
  Downloading cffi-1.15.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (442 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 442.7/442.7 KB 25.2 MB/s eta 0:00:00
Collecting pycparser
  Downloading pycparser-2.21-py2.py3-none-any.whl (118 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 118.7/118.7 KB 18.1 MB/s eta 0:00:00
Installing collected packages: webencodings, trove-classifiers, ptyprocess, msgpack, lockfile, distlib, zipp, urllib3, tomlkit, tomli, six, shellingham, rapidfuzz, pyrsistent, pycparser, poetry-core, platformdirs, pkgutil-resolve-name, pkginfo, pexpect, packaging, more-itertools, jeepney, installer, idna, filelock, crashtest, charset-normalizer, certifi, attrs, virtualenv, requests, pyproject-hooks, jaraco.classes, importlib-resources, importlib-metadata, html5lib, dulwich, cleo, cffi, requests-toolbelt, jsonschema, cryptography, cachecontrol, build, SecretStorage, keyring, poetry-plugin-export, poetry
Successfully installed SecretStorage-3.3.3 attrs-22.2.0 build-0.10.0 cachecontrol-0.12.11 certifi-2022.12.7 cffi-1.15.1 charset-normalizer-3.1.0 cleo-2.0.1 crashtest-0.4.1 cryptography-39.0.2 distlib-0.3.6 dulwich-0.21.3 filelock-3.9.0 html5lib-1.1 idna-3.4 importlib-metadata-6.0.0 importlib-resources-5.12.0 installer-0.6.0 jaraco.classes-3.2.3 jeepney-0.8.0 jsonschema-4.17.3 keyring-23.13.1 lockfile-0.12.2 more-itertools-9.1.0 msgpack-1.0.5 packaging-23.0 pexpect-4.8.0 pkginfo-1.9.6 pkgutil-resolve-name-1.3.10 platformdirs-2.6.2 poetry-1.4.0 poetry-core-1.5.1 poetry-plugin-export-1.3.0 ptyprocess-0.7.0 pycparser-2.21 pyproject-hooks-1.0.0 pyrsistent-0.19.3 rapidfuzz-2.13.7 requests-2.28.2 requests-toolbelt-0.10.1 shellingham-1.5.0.post1 six-1.16.0 tomli-2.0.1 tomlkit-0.11.6 trove-classifiers-2023.3.9 urllib3-1.26.14 virtualenv-20.20.0 webencodings-0.5.1 zipp-3.15.0
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
WARNING: You are using pip version 22.0.4; however, version 23.0.1 is available.
You should consider upgrading via the '/usr/local/bin/python3 -m pip install --upgrade pip' command.
$ make ci
poetry install
Creating virtualenv bitia-xNCUUZ2P-py3.8 in /root/.cache/pypoetry/virtualenvs
Updating dependencies
Resolving dependencies...
Writing lock file Package operations: 75 installs, 1 update, 0 removals • Installing pycparser (2.21) • Installing cffi (1.15.1) • Installing certifi (2022.12.7) • Installing charset-normalizer (3.1.0) • Installing cryptography (39.0.2) • Installing idna (3.4) • Installing jeepney (0.8.0) • Installing markupsafe (2.1.2) • Installing mdurl (0.1.2) • Installing more-itertools (9.1.0) • Installing pytz (2022.7.1) • Installing six (1.16.0) • Installing urllib3 (1.26.14) • Installing webencodings (0.5.1) • Installing zipp (3.15.0) • Installing alabaster (0.7.13) • Installing babel (2.12.1) • Installing bleach (6.0.0) • Installing commonmark (0.9.1) • Installing docutils (0.18.1) • Installing imagesize (1.4.1) • Installing importlib-metadata (6.0.0) • Installing importlib-resources (5.12.0) • Installing jaraco-classes (3.2.3) • Installing jinja2 (3.1.2) • Installing lazy-object-proxy (1.9.0) • Installing markdown-it-py (2.2.0) • Installing packaging (23.0) • Installing pygments (2.14.0) • Installing requests (2.28.2) • Installing secretstorage (3.3.3) • Updating setuptools (67.4.0 -> 67.6.0) • Installing snowballstemmer (2.2.0) • Installing sphinxcontrib-applehelp (1.0.4) • Installing sphinxcontrib-devhelp (1.0.2) • Installing sphinxcontrib-htmlhelp (2.0.1) • Installing sphinxcontrib-jsmath (1.0.1) • Installing sphinxcontrib-qthelp (1.0.3) • Installing sphinxcontrib-serializinghtml (1.1.5) • Installing typing-extensions (4.5.0) • Installing wrapt (1.15.0) • Installing astroid (2.15.0) • Installing attrs (22.2.0) • Installing click (8.1.3) • Installing dill (0.3.6) • Installing exceptiongroup (1.1.0) • Installing iniconfig (2.0.0) • Installing isort (5.12.0) • Installing decorator (5.1.1) • Installing keyring (23.13.1) • Installing mccabe (0.7.0) • Installing mdit-py-plugins (0.3.5) • Installing mypy-extensions (1.0.0) • Installing pkginfo (1.9.6) • Installing platformdirs (3.1.0) • Installing pluggy (1.0.0) • Installing pyyaml (6.0) • Installing readme-renderer (37.3) • Installing requests-toolbelt (0.10.1) • Installing rfc3986 (2.0.0) • Installing rich (12.6.0) • Installing sphinx (5.3.0) • Installing sphinxcontrib-jquery (2.0.0) • Installing tomli (2.0.1) • Installing tomlkit (0.11.6) • Installing mypy (0.981) • Installing myst-parser (0.18.1) • Installing pylint (2.17.0) • Installing pytest (7.2.2) • Installing sphinx-copybutton (0.5.1) • Installing pyparsing (3.0.9) • Installing sphinx-favicon (0.2) • Installing sphinx-rtd-theme (1.2.0) • Installing twine (4.0.2) • Installing typer (0.6.1) • Installing validators (0.20.0) Installing the current project: bitia (0.2.3) poetry run pylint -E bitia tests poetry install Installing dependencies from lock file No dependencies to install or update Installing the current project: bitia (0.2.3) poetry run mypy --ignore-missing-imports --install-types --non-interactive bitia tests Collecting types-requests Downloading types_requests-2.28.11.15-py3-none-any.whl (14 kB) Collecting types-urllib3<1.27 Downloading types_urllib3-1.26.25.8-py3-none-any.whl (15 kB) Installing collected packages: types-urllib3, types-requests Successfully installed types-requests-2.28.11.15 types-urllib3-1.26.25.8 Installing missing stub packages: /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/bin/python -m pip install types-requests Success: no issues found in 14 source files poetry build Building bitia (0.2.3) - Building sdist - Built bitia-0.2.3.tar.gz - Building wheel - Built bitia-0.2.3-py3-none-any.whl poetry run pytest tests bitia ============================= 
test session starts ============================== platform linux -- Python 3.8.16, pytest-7.2.2, pluggy-1.0.0 rootdir: /builds/bitia/bitia-cli collected 6 items tests/test_methods.py . [ 16%] tests/test_pipeline.py .. [ 50%] tests/test_sanity.py .FF [100%] =================================== FAILURES =================================== _______________________________ test_run_repeat ________________________________ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cfbb97c0> method = 'POST', url = '/api/v1/container/create?recreate=true' body = b'--08d4d18f7dacef8ceea1067ea9f346ce\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf1fc8437c0...5\x06\x00\x00\x00\x00\x01\x00\x01\x00R\x00\x00\x00\x9e\x00\x00\x00\x00\x00\r\n--08d4d18f7dacef8ceea1067ea9f346ce--\r\n' headers = {'User-Agent': 'python-requests/2.28.2', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=08d4d18f7dacef8ceea1067ea9f346ce'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/v1/container/create', query='recreate=true', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. 
:param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. :param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] <https://github.com/urllib3/urllib3/issues/651> release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. 
body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. > httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:703: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cfbb97c0> conn = <urllib3.connection.HTTPSConnection object at 0x7fa3cf3f3880> method = 'POST', url = '/api/v1/container/create?recreate=true' timeout = Timeout(connect=None, read=None, total=None), chunked = False httplib_request_kw = {'body': b'--08d4d18f7dacef8ceea1067ea9f346ce\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf...eep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=08d4d18f7dacef8ceea1067ea9f346ce'}} timeout_obj = Timeout(connect=None, read=None, total=None) def _make_request( self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw ): """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param timeout: Socket timeout in seconds for the request. This can be a float or integer, which will set the same timeout value for the socket connect and the socket read, or an instance of :class:`urllib3.util.Timeout`, which gives you more fine-grained control over your timeouts. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = timeout_obj.connect_timeout # Trigger any extra validation we need to do. try: > self._validate_conn(conn) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cfbb97c0> conn = <urllib3.connection.HTTPSConnection object at 0x7fa3cf3f3880> def _validate_conn(self, conn): """ Called right before a request is made, after the socket is created. """ super(HTTPSConnectionPool, self)._validate_conn(conn) # Force connect early to allow us to validate the connection. if not getattr(conn, "sock", None): # AppEngine might not have `.sock` > conn.connect() /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connection.HTTPSConnection object at 0x7fa3cf3f3880> def connect(self): # Add certificate verification self.sock = conn = self._new_conn() hostname = self.host tls_in_tls = False if self._is_using_tunnel(): if self.tls_in_tls_required: self.sock = conn = self._connect_tls_proxy(hostname, conn) tls_in_tls = True # Calls self._set_hostport(), so self.host is # self._tunnel_host below. self._tunnel() # Mark this connection as not reusable self.auto_open = 0 # Override the host with the one we're requesting data from. 
hostname = self._tunnel_host server_hostname = hostname if self.server_hostname is not None: server_hostname = self.server_hostname is_time_off = datetime.date.today() < RECENT_DATE if is_time_off: warnings.warn( ( "System time is way off (before {0}). This will probably " "lead to SSL verification errors" ).format(RECENT_DATE), SystemTimeWarning, ) # Wrap socket using verification with the root certs in # trusted_root_certs default_ssl_context = False if self.ssl_context is None: default_ssl_context = True self.ssl_context = create_urllib3_context( ssl_version=resolve_ssl_version(self.ssl_version), cert_reqs=resolve_cert_reqs(self.cert_reqs), ) context = self.ssl_context context.verify_mode = resolve_cert_reqs(self.cert_reqs) # Try to load OS default certs if none are given. # Works well on Windows (requires Python3.4+) if ( not self.ca_certs and not self.ca_cert_dir and not self.ca_cert_data and default_ssl_context and hasattr(context, "load_default_certs") ): context.load_default_certs() self.sock = ssl_wrap_socket( sock=conn, keyfile=self.key_file, certfile=self.cert_file, key_password=self.key_password, ca_certs=self.ca_certs, ca_cert_dir=self.ca_cert_dir, ca_cert_data=self.ca_cert_data, server_hostname=server_hostname, ssl_context=context, tls_in_tls=tls_in_tls, ) # If we're using all defaults and the connection # is TLSv1 or TLSv1.1 we throw a DeprecationWarning # for the host. if ( default_ssl_context and self.ssl_version is None and hasattr(self.sock, "version") and self.sock.version() in {"TLSv1", "TLSv1.1"} ): warnings.warn( "Negotiating TLSv1/TLSv1.1 by default is deprecated " "and will be disabled in urllib3 v2.0.0. Connecting to " "'%s' with '%s' can be enabled by explicitly opting-in " "with 'ssl_version'" % (self.host, self.sock.version()), DeprecationWarning, ) if self.assert_fingerprint: assert_fingerprint( self.sock.getpeercert(binary_form=True), self.assert_fingerprint ) elif ( context.verify_mode != ssl.CERT_NONE and not getattr(context, "check_hostname", False) and self.assert_hostname is not False ): # While urllib3 attempts to always turn off hostname matching from # the TLS library, this cannot always be done. So we check whether # the TLS Library still thinks it's matching hostnames. cert = self.sock.getpeercert() if not cert.get("subjectAltName", ()): warnings.warn( ( "Certificate for {0} has no `subjectAltName`, falling back to check for a " "`commonName` for now. This feature is being removed by major browsers and " "deprecated by RFC 2818. (See https://github.com/urllib3/urllib3/issues/497 " "for details.)".format(hostname) ), SubjectAltNameWarning, ) > _match_hostname(cert, self.assert_hostname or server_hostname) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connection.py:467: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cert = {'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'notAfter': 'May 31 08:28:01 2023 GMT', ...} asserted_hostname = 'public.bitia.link' def _match_hostname(cert, asserted_hostname): # Our upstream implementation of ssl.match_hostname() # only applies this normalization to IP addresses so it doesn't # match DNS SANs so we do the same thing! 
stripped_hostname = asserted_hostname.strip("u[]") if is_ipaddress(stripped_hostname): asserted_hostname = stripped_hostname try: > match_hostname(cert, asserted_hostname) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connection.py:540: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cert = {'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'notAfter': 'May 31 08:28:01 2023 GMT', ...} hostname = 'public.bitia.link' def match_hostname(cert, hostname): """Verify that *cert* (in decoded format as returned by SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 rules are followed, but IP addresses are not accepted for *hostname*. CertificateError is raised on failure. On success, the function returns nothing. """ if not cert: raise ValueError( "empty or no certificate, match_hostname needs a " "SSL socket or SSL context with either " "CERT_OPTIONAL or CERT_REQUIRED" ) try: # Divergence from upstream: ipaddress can't handle byte str host_ip = ipaddress.ip_address(_to_unicode(hostname)) except (UnicodeError, ValueError): # ValueError: Not an IP address (common case) # UnicodeError: Divergence from upstream: Have to deal with ipaddress not taking # byte strings. addresses should be all ascii, so we consider it not # an ipaddress in this case host_ip = None except AttributeError: # Divergence from upstream: Make ipaddress library optional if ipaddress is None: host_ip = None else: # Defensive raise dnsnames = [] san = cert.get("subjectAltName", ()) for key, value in san: if key == "DNS": if host_ip is None and _dnsname_match(value, hostname): return dnsnames.append(value) elif key == "IP Address": if host_ip is not None and _ipaddress_match(value, host_ip): return dnsnames.append(value) if not dnsnames: # The subject is only checked when there is no dNSName entry # in subjectAltName for sub in cert.get("subject", ()): for key, value in sub: # XXX according to RFC 2818, the most specific Common Name # must be used. if key == "commonName": if _dnsname_match(value, hostname): return dnsnames.append(value) if len(dnsnames) > 1: > raise CertificateError( "hostname %r " "doesn't match either of %s" % (hostname, ", ".join(map(repr, dnsnames))) E urllib3.util.ssl_match_hostname.CertificateError: hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link' /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/util/ssl_match_hostname.py:150: CertificateError During handling of the above exception, another exception occurred: self = <requests.adapters.HTTPAdapter object at 0x7fa3cf48f7f0> request = <PreparedRequest [POST]>, stream = True timeout = Timeout(connect=None, read=None, total=None), verify = True cert = None, proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) <timeouts>` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/adapters.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cfbb97c0> method = 'POST', url = '/api/v1/container/create?recreate=true' body = b'--08d4d18f7dacef8ceea1067ea9f346ce\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf1fc8437c0...5\x06\x00\x00\x00\x00\x01\x00\x01\x00R\x00\x00\x00\x9e\x00\x00\x00\x00\x00\r\n--08d4d18f7dacef8ceea1067ea9f346ce--\r\n' headers = {'User-Agent': 'python-requests/2.28.2', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=08d4d18f7dacef8ceea1067ea9f346ce'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/v1/container/create', query='recreate=true', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. 
note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
:param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] <https://github.com/urllib3/urllib3/issues/651> release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Pass method to Response for length checking response_kw["request_method"] = method # Import httplib's response into our own wrapper object response = self.ResponseCls.from_httplib( httplib_response, pool=self, connection=response_conn, retries=retries, **response_kw ) # Everything went great! clean_exit = True except EmptyPoolError: # Didn't get a connection from the pool, no need to clean up clean_exit = True release_this_conn = False raise except ( TimeoutError, HTTPException, SocketError, ProtocolError, BaseSSLError, SSLError, CertificateError, ) as e: # Discard the connection for these exceptions. It will be # replaced during the next _get_conn() call. 
clean_exit = False def _is_ssl_error_message_from_http_proxy(ssl_error): # We're trying to detect the message 'WRONG_VERSION_NUMBER' but # SSLErrors are kinda all over the place when it comes to the message, # so we try to cover our bases here! message = " ".join(re.split("[^a-z]", str(ssl_error).lower())) return ( "wrong version number" in message or "unknown protocol" in message ) # Try to detect a common user error with proxies which is to # set an HTTP proxy to be HTTPS when it should be 'http://' # (ie {'http': 'http://proxy', 'https': 'https://proxy'}) # Instead we add a nice error message and point to a URL. if ( isinstance(e, BaseSSLError) and self.proxy and _is_ssl_error_message_from_http_proxy(e) and conn.proxy and conn.proxy.scheme == "https" ): e = ProxyError( "Your proxy appears to only use HTTP and not HTTPS, " "try changing your proxy URL to be HTTP. See: " "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html" "#https-proxy-error-http-proxy", SSLError(e), ) elif isinstance(e, (BaseSSLError, CertificateError)): e = SSLError(e) elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError("Cannot connect to proxy.", e) elif isinstance(e, (SocketError, HTTPException)): e = ProtocolError("Connection aborted.", e) > retries = retries.increment( method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2] ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:787: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST', url = '/api/v1/container/create?recreate=true', response = None error = SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'")) _pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cfbb97c0> _stacktrace = <traceback object at 0x7fa3cf417b40> def increment( self, method=None, url=None, response=None, error=None, _pool=None, _stacktrace=None, ): """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.HTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise six.reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise six.reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or not self._is_method_retryable(method): raise six.reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" redirect_location = response.get_redirect_location() status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): > raise MaxRetryError(_pool, url, error or ResponseError(cause)) E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'"))) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/util/retry.py:592: MaxRetryError During handling of the above exception, another exception occurred: capsys = <_pytest.capture.CaptureFixture object at 0x7fa3cf89b790> def test_run_repeat(capsys): bconfig.set_config("plain", True) > bitia.__main__.run_user_input("ls -ltr /", rerun=False) tests/test_sanity.py:18: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ bitia/__main__.py:47: in wrapper retval = func(*args, **kwargs) bitia/__main__.py:175: in run_user_input create_remote_container(user_input, recreate=rerun, output_lines=output_lines) bitia/__main__.py:47: in wrapper retval = func(*args, **kwargs) bitia/__main__.py:69: in create_remote_container res = bhelper.post_pipeline_task( bitia/helper.py:58: in post_pipeline_task return bsession.post( bitia/session.py:47: in post return g_session.post(*args, **kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:635: in post return self.request("POST", url, data=data, json=json, **kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <requests.adapters.HTTPAdapter object at 0x7fa3cf48f7f0> request = <PreparedRequest [POST]>, stream = True timeout = Timeout(connect=None, read=None, total=None), verify = True cert = None, proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) <timeouts>` tuple. 
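# A minimal sketch (not from the traceback above; assumes urllib3 1.26.x) of the
# Retry.increment() call that fails in the frame above. As the frame locals show,
# requests' default adapter hands urllib3 Retry(total=0, read=False), so the first
# SSL error immediately exhausts the retry budget. The pool argument is left as None
# here, so the exception's message prefix differs slightly from the real log line.
from urllib3.util.retry import Retry
from urllib3.exceptions import MaxRetryError, SSLError

retries = Retry(total=0, connect=None, read=False, redirect=None, status=None)
try:
    retries.increment(
        method="POST",
        url="/api/v1/container/create?recreate=true",
        error=SSLError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'"),
    )
except MaxRetryError as exc:
    print(exc.reason)  # the wrapped SSLError that caused the retry budget to run out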
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) # Send the request. else: if hasattr(conn, "proxy_pool"): conn = conn.proxy_pool low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT) try: skip_host = "Host" in request.headers low_conn.putrequest( request.method, url, skip_accept_encoding=True, skip_host=skip_host, ) for header, value in request.headers.items(): low_conn.putheader(header, value) low_conn.endheaders() for i in request.body: low_conn.send(hex(len(i))[2:].encode("utf-8")) low_conn.send(b"\r\n") low_conn.send(i) low_conn.send(b"\r\n") low_conn.send(b"0\r\n\r\n") # Receive the response from the server r = low_conn.getresponse() resp = HTTPResponse.from_httplib( r, pool=conn, connection=low_conn, preload_content=False, decode_content=False, ) except Exception: # If we hit any problems here, clean up the connection. # Then, raise so that we can handle the actual exception. low_conn.close() raise except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. > raise SSLError(e, request=request) E requests.exceptions.SSLError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'"))) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/adapters.py:563: SSLError ------------------------------ Captured log call ------------------------------- WARNING urllib3.connection:connection.py:542 Certificate did not match expected hostname: public.bitia.link. 
Certificate: {'subject': ((('commonName', 'subcom.link'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'version': 3, 'serialNumber': '0458DF12E0741A8E317D7F4AFB02C9B7FD5F', 'notBefore': 'Mar 2 08:28:02 2023 GMT', 'notAfter': 'May 31 08:28:01 2023 GMT', 'subjectAltName': (('DNS', 'subcom.link'), ('DNS', 'www.subcom.link')), 'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',)} _______________________________ test_run_simple ________________________________ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cf128c70> method = 'POST', url = '/api/v1/container/create?recreate=true' body = b'--4f8d1c82d76d234eafe6bd07219da21b\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf1fc8437c0...5\x06\x00\x00\x00\x00\x01\x00\x01\x00R\x00\x00\x00\x9e\x00\x00\x00\x00\x00\r\n--4f8d1c82d76d234eafe6bd07219da21b--\r\n' headers = {'User-Agent': 'python-requests/2.28.2', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=4f8d1c82d76d234eafe6bd07219da21b'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/v1/container/create', query='recreate=true', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. 
:type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. :param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] <https://github.com/urllib3/urllib3/issues/651> release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. 
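# A minimal sketch (not from the traceback above; assumes urllib3 1.26.x; example.org is
# only a placeholder URL) of the release_conn / preload_content behaviour described in the
# docstring above: with preload_content=False the connection is returned to the pool only
# after the body has been fully read or release_conn() has been called.
import urllib3

http = urllib3.PoolManager()
r = http.request("GET", "https://example.org/", preload_content=False)
print(r.status)
body = r.read()    # reading the whole body hands the connection back to the pool
r.release_conn()   # an explicit release is a safe no-op once the body has been read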
Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. > httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:703: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cf128c70> conn = <urllib3.connection.HTTPSConnection object at 0x7fa3cf227b80> method = 'POST', url = '/api/v1/container/create?recreate=true' timeout = Timeout(connect=None, read=None, total=None), chunked = False httplib_request_kw = {'body': b'--4f8d1c82d76d234eafe6bd07219da21b\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf...eep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=4f8d1c82d76d234eafe6bd07219da21b'}} timeout_obj = Timeout(connect=None, read=None, total=None) def _make_request( self, conn, method, url, timeout=_Default, chunked=False, **httplib_request_kw ): """ Perform a request on a given urllib connection object taken from our pool. :param conn: a connection from one of our connection pools :param timeout: Socket timeout in seconds for the request. This can be a float or integer, which will set the same timeout value for the socket connect and the socket read, or an instance of :class:`urllib3.util.Timeout`, which gives you more fine-grained control over your timeouts. """ self.num_requests += 1 timeout_obj = self._get_timeout(timeout) timeout_obj.start_connect() conn.timeout = timeout_obj.connect_timeout # Trigger any extra validation we need to do. try: > self._validate_conn(conn) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cf128c70> conn = <urllib3.connection.HTTPSConnection object at 0x7fa3cf227b80> def _validate_conn(self, conn): """ Called right before a request is made, after the socket is created. """ super(HTTPSConnectionPool, self)._validate_conn(conn) # Force connect early to allow us to validate the connection. if not getattr(conn, "sock", None): # AppEngine might not have `.sock` > conn.connect() /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connection.HTTPSConnection object at 0x7fa3cf227b80> def connect(self): # Add certificate verification self.sock = conn = self._new_conn() hostname = self.host tls_in_tls = False if self._is_using_tunnel(): if self.tls_in_tls_required: self.sock = conn = self._connect_tls_proxy(hostname, conn) tls_in_tls = True # Calls self._set_hostport(), so self.host is # self._tunnel_host below. self._tunnel() # Mark this connection as not reusable self.auto_open = 0 # Override the host with the one we're requesting data from. 
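# A minimal sketch (not from the traceback above; assumes urllib3 1.26.x) of the Timeout
# object used by _make_request() above: it carries separate connect and read timeouts
# instead of a single float. Timeout(connect=None, read=None, total=None) in the frame
# locals means neither phase is limited.
from urllib3.util.timeout import Timeout

t = Timeout(connect=2.0, read=10.0)
print(t.connect_timeout)  # 2.0 - applies while establishing the TCP/TLS connection
print(t.read_timeout)     # 10.0 - applies while waiting for response data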
hostname = self._tunnel_host server_hostname = hostname if self.server_hostname is not None: server_hostname = self.server_hostname is_time_off = datetime.date.today() < RECENT_DATE if is_time_off: warnings.warn( ( "System time is way off (before {0}). This will probably " "lead to SSL verification errors" ).format(RECENT_DATE), SystemTimeWarning, ) # Wrap socket using verification with the root certs in # trusted_root_certs default_ssl_context = False if self.ssl_context is None: default_ssl_context = True self.ssl_context = create_urllib3_context( ssl_version=resolve_ssl_version(self.ssl_version), cert_reqs=resolve_cert_reqs(self.cert_reqs), ) context = self.ssl_context context.verify_mode = resolve_cert_reqs(self.cert_reqs) # Try to load OS default certs if none are given. # Works well on Windows (requires Python3.4+) if ( not self.ca_certs and not self.ca_cert_dir and not self.ca_cert_data and default_ssl_context and hasattr(context, "load_default_certs") ): context.load_default_certs() self.sock = ssl_wrap_socket( sock=conn, keyfile=self.key_file, certfile=self.cert_file, key_password=self.key_password, ca_certs=self.ca_certs, ca_cert_dir=self.ca_cert_dir, ca_cert_data=self.ca_cert_data, server_hostname=server_hostname, ssl_context=context, tls_in_tls=tls_in_tls, ) # If we're using all defaults and the connection # is TLSv1 or TLSv1.1 we throw a DeprecationWarning # for the host. if ( default_ssl_context and self.ssl_version is None and hasattr(self.sock, "version") and self.sock.version() in {"TLSv1", "TLSv1.1"} ): warnings.warn( "Negotiating TLSv1/TLSv1.1 by default is deprecated " "and will be disabled in urllib3 v2.0.0. Connecting to " "'%s' with '%s' can be enabled by explicitly opting-in " "with 'ssl_version'" % (self.host, self.sock.version()), DeprecationWarning, ) if self.assert_fingerprint: assert_fingerprint( self.sock.getpeercert(binary_form=True), self.assert_fingerprint ) elif ( context.verify_mode != ssl.CERT_NONE and not getattr(context, "check_hostname", False) and self.assert_hostname is not False ): # While urllib3 attempts to always turn off hostname matching from # the TLS library, this cannot always be done. So we check whether # the TLS Library still thinks it's matching hostnames. cert = self.sock.getpeercert() if not cert.get("subjectAltName", ()): warnings.warn( ( "Certificate for {0} has no `subjectAltName`, falling back to check for a " "`commonName` for now. This feature is being removed by major browsers and " "deprecated by RFC 2818. (See https://github.com/urllib3/urllib3/issues/497 " "for details.)".format(hostname) ), SubjectAltNameWarning, ) > _match_hostname(cert, self.assert_hostname or server_hostname) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connection.py:467: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cert = {'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'notAfter': 'May 31 08:28:01 2023 GMT', ...} asserted_hostname = 'public.bitia.link' def _match_hostname(cert, asserted_hostname): # Our upstream implementation of ssl.match_hostname() # only applies this normalization to IP addresses so it doesn't # match DNS SANs so we do the same thing! 
stripped_hostname = asserted_hostname.strip("u[]") if is_ipaddress(stripped_hostname): asserted_hostname = stripped_hostname try: > match_hostname(cert, asserted_hostname) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connection.py:540: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ cert = {'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'notAfter': 'May 31 08:28:01 2023 GMT', ...} hostname = 'public.bitia.link' def match_hostname(cert, hostname): """Verify that *cert* (in decoded format as returned by SSLSocket.getpeercert()) matches the *hostname*. RFC 2818 and RFC 6125 rules are followed, but IP addresses are not accepted for *hostname*. CertificateError is raised on failure. On success, the function returns nothing. """ if not cert: raise ValueError( "empty or no certificate, match_hostname needs a " "SSL socket or SSL context with either " "CERT_OPTIONAL or CERT_REQUIRED" ) try: # Divergence from upstream: ipaddress can't handle byte str host_ip = ipaddress.ip_address(_to_unicode(hostname)) except (UnicodeError, ValueError): # ValueError: Not an IP address (common case) # UnicodeError: Divergence from upstream: Have to deal with ipaddress not taking # byte strings. addresses should be all ascii, so we consider it not # an ipaddress in this case host_ip = None except AttributeError: # Divergence from upstream: Make ipaddress library optional if ipaddress is None: host_ip = None else: # Defensive raise dnsnames = [] san = cert.get("subjectAltName", ()) for key, value in san: if key == "DNS": if host_ip is None and _dnsname_match(value, hostname): return dnsnames.append(value) elif key == "IP Address": if host_ip is not None and _ipaddress_match(value, host_ip): return dnsnames.append(value) if not dnsnames: # The subject is only checked when there is no dNSName entry # in subjectAltName for sub in cert.get("subject", ()): for key, value in sub: # XXX according to RFC 2818, the most specific Common Name # must be used. if key == "commonName": if _dnsname_match(value, hostname): return dnsnames.append(value) if len(dnsnames) > 1: > raise CertificateError( "hostname %r " "doesn't match either of %s" % (hostname, ", ".join(map(repr, dnsnames))) E urllib3.util.ssl_match_hostname.CertificateError: hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link' /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/util/ssl_match_hostname.py:150: CertificateError During handling of the above exception, another exception occurred: self = <requests.adapters.HTTPAdapter object at 0x7fa3cf1289a0> request = <PreparedRequest [POST]>, stream = True timeout = Timeout(connect=None, read=None, total=None), verify = True cert = None, proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) <timeouts>` tuple. 
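# A minimal sketch (not from the traceback above; assumes urllib3 1.26.x, whose
# ssl_match_hostname module appears in the paths above) reproducing the check that raises
# here. The cert fields are copied from the captured warning in this log: the certificate's
# subjectAltName only lists subcom.link and www.subcom.link, so matching it against
# 'public.bitia.link' raises CertificateError with the same message seen in the job.
from urllib3.util.ssl_match_hostname import CertificateError, match_hostname

cert = {
    "subject": ((("commonName", "subcom.link"),),),
    "subjectAltName": (("DNS", "subcom.link"), ("DNS", "www.subcom.link")),
}
try:
    match_hostname(cert, "public.bitia.link")
except CertificateError as err:
    print(err)  # hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'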
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: > resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/adapters.py:489: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cf128c70> method = 'POST', url = '/api/v1/container/create?recreate=true' body = b'--4f8d1c82d76d234eafe6bd07219da21b\r\nContent-Disposition: form-data; name="pipeline_zip"; filename="438fbf1fc8437c0...5\x06\x00\x00\x00\x00\x01\x00\x01\x00R\x00\x00\x00\x9e\x00\x00\x00\x00\x00\r\n--4f8d1c82d76d234eafe6bd07219da21b--\r\n' headers = {'User-Agent': 'python-requests/2.28.2', 'Accept-Encoding': 'gzip, deflate', 'Accept': '*/*', 'Connection': 'keep-alive', 'Content-Length': '474', 'Content-Type': 'multipart/form-data; boundary=4f8d1c82d76d234eafe6bd07219da21b'} retries = Retry(total=0, connect=None, read=False, redirect=None, status=None) redirect = False, assert_same_host = False timeout = Timeout(connect=None, read=None, total=None), pool_timeout = None release_conn = False, chunked = False, body_pos = None response_kw = {'decode_content': False, 'preload_content': False} parsed_url = Url(scheme=None, auth=None, host=None, port=None, path='/api/v1/container/create', query='recreate=true', fragment=None) destination_scheme = None, conn = None, release_this_conn = True http_tunnel_required = False, err = None, clean_exit = False def urlopen( self, method, url, body=None, headers=None, retries=None, redirect=True, assert_same_host=True, timeout=_Default, pool_timeout=None, release_conn=None, chunked=False, body_pos=None, **response_kw ): """ Get a connection from the pool and perform an HTTP request. This is the lowest level call for making a request, so you'll need to specify all the raw details. .. note:: More commonly, it's appropriate to use a convenience method provided by :class:`.RequestMethods`, such as :meth:`request`. .. 
note:: `release_conn` will only behave as expected if `preload_content=False` because we want to make `preload_content=False` the default behaviour someday soon without breaking backwards compatibility. :param method: HTTP request method (such as GET, POST, PUT, etc.) :param url: The URL to perform the request on. :param body: Data to send in the request body, either :class:`str`, :class:`bytes`, an iterable of :class:`str`/:class:`bytes`, or a file-like object. :param headers: Dictionary of custom headers to send, such as User-Agent, If-None-Match, etc. If None, pool headers are used. If provided, these headers completely replace any pool-specific headers. :param retries: Configure the number of retries to allow before raising a :class:`~urllib3.exceptions.MaxRetryError` exception. Pass ``None`` to retry until you receive a response. Pass a :class:`~urllib3.util.retry.Retry` object for fine-grained control over different types of retries. Pass an integer number to retry connection errors that many times, but no other types of errors. Pass zero to never retry. If ``False``, then retries are disabled and any exception is raised immediately. Also, instead of raising a MaxRetryError on redirects, the redirect response will be returned. :type retries: :class:`~urllib3.util.retry.Retry`, False, or an int. :param redirect: If True, automatically handle redirects (status codes 301, 302, 303, 307, 308). Each redirect counts as a retry. Disabling retries will disable redirect, too. :param assert_same_host: If ``True``, will make sure that the host of the pool requests is consistent else will raise HostChangedError. When ``False``, you can use the pool on an HTTP proxy and request foreign hosts. :param timeout: If specified, overrides the default timeout for this one request. It may be a float (in seconds) or an instance of :class:`urllib3.util.Timeout`. :param pool_timeout: If set and the pool is set to block=True, then this method will block for ``pool_timeout`` seconds and raise EmptyPoolError if no connection is available within the time period. :param release_conn: If False, then the urlopen call will not release the connection back into the pool once a response is received (but will release if you read the entire contents of the response such as when `preload_content=True`). This is useful if you're not preloading the response's content immediately. You will need to call ``r.release_conn()`` on the response ``r`` to return the connection back into the pool. If None, it takes the value of ``response_kw.get('preload_content', True)``. :param chunked: If True, urllib3 will send the body using chunked transfer encoding. Otherwise, urllib3 will send the body using the standard content-length form. Defaults to False. :param int body_pos: Position to seek to in file-like body in the event of a retry or redirect. Typically this won't need to be set because urllib3 will auto-populate the value when needed. 
:param \\**response_kw: Additional parameters are passed to :meth:`urllib3.response.HTTPResponse.from_httplib` """ parsed_url = parse_url(url) destination_scheme = parsed_url.scheme if headers is None: headers = self.headers if not isinstance(retries, Retry): retries = Retry.from_int(retries, redirect=redirect, default=self.retries) if release_conn is None: release_conn = response_kw.get("preload_content", True) # Check host if assert_same_host and not self.is_same_host(url): raise HostChangedError(self, url, retries) # Ensure that the URL we're connecting to is properly encoded if url.startswith("/"): url = six.ensure_str(_encode_target(url)) else: url = six.ensure_str(parsed_url.url) conn = None # Track whether `conn` needs to be released before # returning/raising/recursing. Update this variable if necessary, and # leave `release_conn` constant throughout the function. That way, if # the function recurses, the original value of `release_conn` will be # passed down into the recursive call, and its value will be respected. # # See issue #651 [1] for details. # # [1] <https://github.com/urllib3/urllib3/issues/651> release_this_conn = release_conn http_tunnel_required = connection_requires_http_tunnel( self.proxy, self.proxy_config, destination_scheme ) # Merge the proxy headers. Only done when not using HTTP CONNECT. We # have to copy the headers dict so we can safely change it without those # changes being reflected in anyone else's copy. if not http_tunnel_required: headers = headers.copy() headers.update(self.proxy_headers) # Must keep the exception bound to a separate variable or else Python 3 # complains about UnboundLocalError. err = None # Keep track of whether we cleanly exited the except block. This # ensures we do proper cleanup in finally. clean_exit = False # Rewind body position, if needed. Record current position # for future rewinds in the event of a redirect/retry. body_pos = set_file_position(body, body_pos) try: # Request a connection from the queue. timeout_obj = self._get_timeout(timeout) conn = self._get_conn(timeout=pool_timeout) conn.timeout = timeout_obj.connect_timeout is_new_proxy_conn = self.proxy is not None and not getattr( conn, "sock", None ) if is_new_proxy_conn and http_tunnel_required: self._prepare_proxy(conn) # Make the request on the httplib connection object. httplib_response = self._make_request( conn, method, url, timeout=timeout_obj, body=body, headers=headers, chunked=chunked, ) # If we're going to release the connection in ``finally:``, then # the response doesn't need to know about the connection. Otherwise # it will also try to release it and we'll have a double-release # mess. response_conn = conn if not release_conn else None # Pass method to Response for length checking response_kw["request_method"] = method # Import httplib's response into our own wrapper object response = self.ResponseCls.from_httplib( httplib_response, pool=self, connection=response_conn, retries=retries, **response_kw ) # Everything went great! clean_exit = True except EmptyPoolError: # Didn't get a connection from the pool, no need to clean up clean_exit = True release_this_conn = False raise except ( TimeoutError, HTTPException, SocketError, ProtocolError, BaseSSLError, SSLError, CertificateError, ) as e: # Discard the connection for these exceptions. It will be # replaced during the next _get_conn() call. 
clean_exit = False def _is_ssl_error_message_from_http_proxy(ssl_error): # We're trying to detect the message 'WRONG_VERSION_NUMBER' but # SSLErrors are kinda all over the place when it comes to the message, # so we try to cover our bases here! message = " ".join(re.split("[^a-z]", str(ssl_error).lower())) return ( "wrong version number" in message or "unknown protocol" in message ) # Try to detect a common user error with proxies which is to # set an HTTP proxy to be HTTPS when it should be 'http://' # (ie {'http': 'http://proxy', 'https': 'https://proxy'}) # Instead we add a nice error message and point to a URL. if ( isinstance(e, BaseSSLError) and self.proxy and _is_ssl_error_message_from_http_proxy(e) and conn.proxy and conn.proxy.scheme == "https" ): e = ProxyError( "Your proxy appears to only use HTTP and not HTTPS, " "try changing your proxy URL to be HTTP. See: " "https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html" "#https-proxy-error-http-proxy", SSLError(e), ) elif isinstance(e, (BaseSSLError, CertificateError)): e = SSLError(e) elif isinstance(e, (SocketError, NewConnectionError)) and self.proxy: e = ProxyError("Cannot connect to proxy.", e) elif isinstance(e, (SocketError, HTTPException)): e = ProtocolError("Connection aborted.", e) > retries = retries.increment( method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2] ) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/connectionpool.py:787: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = Retry(total=0, connect=None, read=False, redirect=None, status=None) method = 'POST', url = '/api/v1/container/create?recreate=true', response = None error = SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'")) _pool = <urllib3.connectionpool.HTTPSConnectionPool object at 0x7fa3cf128c70> _stacktrace = <traceback object at 0x7fa3cf129880> def increment( self, method=None, url=None, response=None, error=None, _pool=None, _stacktrace=None, ): """Return a new Retry object with incremented retry counters. :param response: A response object, or None, if the server did not return a response. :type response: :class:`~urllib3.response.HTTPResponse` :param Exception error: An error encountered during the request, or None if the response was received successfully. :return: A new ``Retry`` object. """ if self.total is False and error: # Disabled, indicate to re-raise the error. raise six.reraise(type(error), error, _stacktrace) total = self.total if total is not None: total -= 1 connect = self.connect read = self.read redirect = self.redirect status_count = self.status other = self.other cause = "unknown" status = None redirect_location = None if error and self._is_connection_error(error): # Connect retry? if connect is False: raise six.reraise(type(error), error, _stacktrace) elif connect is not None: connect -= 1 elif error and self._is_read_error(error): # Read retry? if read is False or not self._is_method_retryable(method): raise six.reraise(type(error), error, _stacktrace) elif read is not None: read -= 1 elif error: # Other retry? if other is not None: other -= 1 elif response and response.get_redirect_location(): # Redirect retry? 
if redirect is not None: redirect -= 1 cause = "too many redirects" redirect_location = response.get_redirect_location() status = response.status else: # Incrementing because of a server error like a 500 in # status_forcelist and the given method is in the allowed_methods cause = ResponseError.GENERIC_ERROR if response and response.status: if status_count is not None: status_count -= 1 cause = ResponseError.SPECIFIC_ERROR.format(status_code=response.status) status = response.status history = self.history + ( RequestHistory(method, url, error, status, redirect_location), ) new_retry = self.new( total=total, connect=connect, read=read, redirect=redirect, status=status_count, other=other, history=history, ) if new_retry.is_exhausted(): > raise MaxRetryError(_pool, url, error or ResponseError(cause)) E urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'"))) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/urllib3/util/retry.py:592: MaxRetryError During handling of the above exception, another exception occurred: capsys = <_pytest.capture.CaptureFixture object at 0x7fa3cf1286a0> def test_run_simple(capsys): # set the plain-text mode. bconfig.set_config("plain", True) > bitia.__main__.run_user_input("ls -ltr /") tests/test_sanity.py:28: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ bitia/__main__.py:47: in wrapper retval = func(*args, **kwargs) bitia/__main__.py:175: in run_user_input create_remote_container(user_input, recreate=rerun, output_lines=output_lines) bitia/__main__.py:47: in wrapper retval = func(*args, **kwargs) bitia/__main__.py:69: in create_remote_container res = bhelper.post_pipeline_task( bitia/helper.py:58: in post_pipeline_task return bsession.post( bitia/session.py:47: in post return g_session.post(*args, **kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:635: in post return self.request("POST", url, data=data, json=json, **kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:587: in request resp = self.send(prep, **send_kwargs) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/sessions.py:701: in send r = adapter.send(request, **kwargs) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = <requests.adapters.HTTPAdapter object at 0x7fa3cf1289a0> request = <PreparedRequest [POST]>, stream = True timeout = Timeout(connect=None, read=None, total=None), verify = True cert = None, proxies = OrderedDict() def send( self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None ): """Sends PreparedRequest object. Returns Response object. :param request: The :class:`PreparedRequest <PreparedRequest>` being sent. :param stream: (optional) Whether to stream the request content. :param timeout: (optional) How long to wait for the server to send data before giving up, as a float, or a :ref:`(connect timeout, read timeout) <timeouts>` tuple. 
:type timeout: float or tuple or urllib3 Timeout object :param verify: (optional) Either a boolean, in which case it controls whether we verify the server's TLS certificate, or a string, in which case it must be a path to a CA bundle to use :param cert: (optional) Any user-provided SSL certificate to be trusted. :param proxies: (optional) The proxies dictionary to apply to the request. :rtype: requests.Response """ try: conn = self.get_connection(request.url, proxies) except LocationValueError as e: raise InvalidURL(e, request=request) self.cert_verify(conn, request.url, verify, cert) url = self.request_url(request, proxies) self.add_headers( request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies, ) chunked = not (request.body is None or "Content-Length" in request.headers) if isinstance(timeout, tuple): try: connect, read = timeout timeout = TimeoutSauce(connect=connect, read=read) except ValueError: raise ValueError( f"Invalid timeout {timeout}. Pass a (connect, read) timeout tuple, " f"or a single float to set both timeouts to the same value." ) elif isinstance(timeout, TimeoutSauce): pass else: timeout = TimeoutSauce(connect=timeout, read=timeout) try: if not chunked: resp = conn.urlopen( method=request.method, url=url, body=request.body, headers=request.headers, redirect=False, assert_same_host=False, preload_content=False, decode_content=False, retries=self.max_retries, timeout=timeout, ) # Send the request. else: if hasattr(conn, "proxy_pool"): conn = conn.proxy_pool low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT) try: skip_host = "Host" in request.headers low_conn.putrequest( request.method, url, skip_accept_encoding=True, skip_host=skip_host, ) for header, value in request.headers.items(): low_conn.putheader(header, value) low_conn.endheaders() for i in request.body: low_conn.send(hex(len(i))[2:].encode("utf-8")) low_conn.send(b"\r\n") low_conn.send(i) low_conn.send(b"\r\n") low_conn.send(b"0\r\n\r\n") # Receive the response from the server r = low_conn.getresponse() resp = HTTPResponse.from_httplib( r, pool=conn, connection=low_conn, preload_content=False, decode_content=False, ) except Exception: # If we hit any problems here, clean up the connection. # Then, raise so that we can handle the actual exception. low_conn.close() raise except (ProtocolError, OSError) as err: raise ConnectionError(err, request=request) except MaxRetryError as e: if isinstance(e.reason, ConnectTimeoutError): # TODO: Remove this in 3.0.0: see #2811 if not isinstance(e.reason, NewConnectionError): raise ConnectTimeout(e, request=request) if isinstance(e.reason, ResponseError): raise RetryError(e, request=request) if isinstance(e.reason, _ProxyError): raise ProxyError(e, request=request) if isinstance(e.reason, _SSLError): # This branch is for urllib3 v1.22 and later. > raise SSLError(e, request=request) E requests.exceptions.SSLError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'"))) /root/.cache/pypoetry/virtualenvs/bitia-xNCUUZ2P-py3.8/lib/python3.8/site-packages/requests/adapters.py:563: SSLError ------------------------------ Captured log call ------------------------------- WARNING urllib3.connection:connection.py:542 Certificate did not match expected hostname: public.bitia.link. 
Certificate: {'subject': ((('commonName', 'subcom.link'),),), 'issuer': ((('countryName', 'US'),), (('organizationName', "Let's Encrypt"),), (('commonName', 'R3'),)), 'version': 3, 'serialNumber': '0458DF12E0741A8E317D7F4AFB02C9B7FD5F', 'notBefore': 'Mar 2 08:28:02 2023 GMT', 'notAfter': 'May 31 08:28:01 2023 GMT', 'subjectAltName': (('DNS', 'subcom.link'), ('DNS', 'www.subcom.link')), 'OCSP': ('http://r3.o.lencr.org',), 'caIssuers': ('http://r3.i.lencr.org/',)}
=========================== short test summary info ============================
FAILED tests/test_sanity.py::test_run_repeat - requests.exceptions.SSLError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'")))
FAILED tests/test_sanity.py::test_run_simple - requests.exceptions.SSLError: HTTPSConnectionPool(host='public.bitia.link', port=443): Max retries exceeded with url: /api/v1/container/create?recreate=true (Caused by SSLError(CertificateError("hostname 'public.bitia.link' doesn't match either of 'subcom.link', 'www.subcom.link'")))
========================= 2 failed, 4 passed in 1.65s ==========================
make: *** [Makefile:21: test] Error 1
section_end:1678435816:step_script
section_start:1678435816:cleanup_file_variables
Cleaning up project directory and file based variables
section_end:1678435817:cleanup_file_variables
ERROR: Job failed: exit code 1
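Both failures appear to be a TLS/deployment issue rather than a bug in the test code: public.bitia.link is serving a certificate whose subjectAltName only covers subcom.link and www.subcom.link. A diagnostic sketch using only the standard library can confirm what the server actually presents (hostname checking is disabled so the handshake completes, while chain verification stays on; the timeout value is arbitrary):

import socket
import ssl

HOST = "public.bitia.link"  # endpoint taken from the failing tests above

ctx = ssl.create_default_context()
ctx.check_hostname = False  # inspect the certificate even though the name will not match

with socket.create_connection((HOST, 443), timeout=10) as sock:
    with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
        cert = tls.getpeercert()
        print(cert["subject"])                 # expected per this log: commonName=subcom.link
        print(cert.get("subjectAltName", ()))  # expected per this log: subcom.link, www.subcom.link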