Compare commits

...

159 Commits

Author SHA1 Message Date
dependabot[bot]
dcb302493a build(deps): bump pypa/cibuildwheel from 2.22.0 to 2.23.2
Bumps [pypa/cibuildwheel](https://github.com/pypa/cibuildwheel) from 2.22.0 to 2.23.2.
- [Release notes](https://github.com/pypa/cibuildwheel/releases)
- [Changelog](https://github.com/pypa/cibuildwheel/blob/main/docs/changelog.md)
- [Commits](https://github.com/pypa/cibuildwheel/compare/v2.22.0...v2.23.2)

---
updated-dependencies:
- dependency-name: pypa/cibuildwheel
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-04-02 12:45:46 +01:00
Daniele Varrazzo
5509e01108
Merge pull request #1755 from bwoodsend/fix-macos-deployment-target
ci(macos): Avoid linking against Homebrew
2025-01-06 01:15:51 +01:00
Daniele Varrazzo
6cd0fbdc49 fix(macos): don't crash on undefined variable 2025-01-05 22:00:24 +01:00
Daniele Varrazzo
cee23d83e0 chore(macos): drop unneeded gettext from libpq building 2025-01-05 21:37:12 +01:00
Daniele Varrazzo
5bfba4c961 refactor: use pushd/popd instead of cd 2025-01-05 21:37:12 +01:00
Daniele Varrazzo
b943457896 test: drop brew curl to use the system one 2025-01-05 21:37:12 +01:00
Brénainn Woodsend
d0bc154f31 build(macos): Enable cross compiling libpq across macOS architectures
The GitHub Actions runners look like they're only 1 year away from the
last macOS x86_64 platform being removed. Get ahead of the game and
build x86_64 on arm64.
2025-01-05 20:44:25 +01:00
Daniele Varrazzo
1eac4fd4da test(macos): soften tests to account for macOS polling differences 2025-01-05 04:00:47 +01:00
Daniele Varrazzo
c8abc5ce61 ci(macos): no fast tests on macOS package building
We don't run complete tests in CI, so let's not waste this chance. The
overhead for complete tests is minimal compared to all the pipeline
boilerplate.
2025-01-05 04:00:00 +01:00
Daniele Varrazzo
65626ec565 ci(macos): add libpq build caching 2025-01-05 04:00:00 +01:00
Daniele Varrazzo
310bc75532 ci(macos): move libpq build script to BEFORE_ALL build step
This is how it is organised on Linux.
2025-01-05 04:00:00 +01:00
Brénainn Woodsend
d43e5fe092 ci(macos): Avoid linking against homebrew
Homebrew binaries are always compiled for exactly the macOS version they're
installed on, making them very unportable. When a wheel is "repaired" by
cibuildwheel, delocate-wheel pulls in _psycopg's dependencies
(libpq.dylib, libssl.dylib and libcrypto.dylib) which, on a GitHub
Actions macOS 14 runner, are provided by Homebrew and are therefore only
macOS >= 14 compatible. The resultant wheel is therefore incompatible
with all but the latest macOS versions.

Build all dependencies from source so that we can set the deployment
target to something sensible. Fixes #1753.
2025-01-04 21:23:15 +01:00
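As a minimal sketch of the issue (not code from this repository): a repaired wheel records its minimum supported macOS version in the platform tag of its filename, so a wheel delocated against Homebrew libraries on a macOS 14 runner ends up tagged for macOS 14+. The wheel filename below is only an example.

    import re

    def min_macos_version(wheel_name):
        # Wheel platform tags look like "macosx_10_9_x86_64" or "macosx_14_0_arm64".
        m = re.search(r"macosx_(\d+)_(\d+)_(?:x86_64|arm64|universal2)", wheel_name)
        return (int(m.group(1)), int(m.group(2))) if m else None

    # Hypothetical wheel repaired against Homebrew libraries on a macOS 14 runner:
    print(min_macos_version("psycopg2_binary-2.9.10-cp313-cp313-macosx_14_0_arm64.whl"))
    # -> (14, 0): older macOS versions refuse to install it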
Daniele Varrazzo
3b684f91ca ci: rename merged artifact package
It doesn't contain only binary packages.
2025-01-04 21:06:33 +01:00
dependabot[bot]
bf7fc6cfa4 build(deps): bump peter-evans/repository-dispatch from 2 to 3
Bumps [peter-evans/repository-dispatch](https://github.com/peter-evans/repository-dispatch) from 2 to 3.
- [Release notes](https://github.com/peter-evans/repository-dispatch/releases)
- [Commits](https://github.com/peter-evans/repository-dispatch/compare/v2...v3)

---
updated-dependencies:
- dependency-name: peter-evans/repository-dispatch
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-04 21:05:08 +01:00
Daniele Varrazzo
979d56a797 chore: update cibuildwheel to 2.22.0 2025-01-04 21:04:14 +01:00
dependabot[bot]
4903f1c5d6 build(deps): bump actions/cache from 3 to 4
Bumps [actions/cache](https://github.com/actions/cache) from 3 to 4.
- [Release notes](https://github.com/actions/cache/releases)
- [Changelog](https://github.com/actions/cache/blob/main/RELEASES.md)
- [Commits](https://github.com/actions/cache/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/cache
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-01-04 21:02:30 +01:00
Daniele Varrazzo
1dc7b5b70b ci: add merge step to download all packages at once 2025-01-04 21:01:09 +01:00
Daniele Varrazzo
ed4ba11d17
Merge pull request #1772 from psycopg/ci-vcpkg
Package psycopg2-binary for Windows using vcpkg libpq
2025-01-04 21:00:44 +01:00
Daniele Varrazzo
947f731400 ci: test against final Python 3.13 2025-01-04 19:06:56 +01:00
Daniele Varrazzo
b8d49e6280 test: skip module test on Windows
Life is too short to figure out why it fails.
2025-01-04 19:06:56 +01:00
Daniele Varrazzo
4dfa680a71 ci(macos): use the macos-13 runners
macos-12 is not supported anymore.
2025-01-04 19:06:56 +01:00
Daniele Varrazzo
f4282c6d87 chore: drop Postgres version parsing in setup.py
The macro is in the include files; no idea why parsing it from pg_config
was needed.
2025-01-04 19:06:56 +01:00
Daniele Varrazzo
a8765121d9 fix(ci): handle other pg_config options required by setup.py 2025-01-04 19:06:56 +01:00
Daniele Varrazzo
bb52bcf769 ci(windows): create the psycopg2-binary package in Github 2025-01-04 19:06:56 +01:00
Daniele Varrazzo
fa24c922e7 ci(windows): build binary packages using the vcpkg package 2025-01-04 19:06:56 +01:00
Daniele Varrazzo
3c7889b0e7 chore: drop appveyor CI integration 2025-01-04 19:06:56 +01:00
Daniele Varrazzo
e83754a414 ci: work around the environment breaking guard 2024-10-15 13:49:08 +02:00
Daniele Varrazzo
a805acf59f chore: bump to version 2.9.10 2024-10-15 10:40:56 +02:00
Daniele Varrazzo
78561ac99d
Merge pull request #1728 from romank0/fetch-notifications-on-commit
Adds notifies processing during commit
2024-10-11 03:13:56 +02:00
Daniele Varrazzo
5283a835dc chore: add TransactionTimeout error, added in PostgreSQL 17
URL to fetch the source changed from the official Postgres one to the GitHub
mirror, because the former throttled us.
2024-10-11 02:41:31 +02:00
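With the errors module updated to PostgreSQL 17, the new class should be usable like any other generated error class. A minimal sketch, assuming psycopg2 >= 2.9.10, a PostgreSQL 17 server, and that the failing statement reports the transaction_timeout condition; the connection string is a placeholder.

    import psycopg2
    from psycopg2 import errors

    try:
        with psycopg2.connect("dbname=test") as conn, conn.cursor() as cur:
            cur.execute("SET transaction_timeout = '1s'")  # setting added in PostgreSQL 17
            cur.execute("SELECT pg_sleep(5)")
    except errors.TransactionTimeout:
        print("the server cancelled the transaction for exceeding transaction_timeout")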
Daniele Varrazzo
f64dd397fd docs: add news entry about notifications on commit 2024-10-11 00:29:28 +02:00
Roman Konoval
cba6d39be0 removes duplication in tests 2024-10-11 00:26:05 +02:00
Roman Konoval
282360dd04 adds notifications processing after every PQexec 2024-10-11 00:26:05 +02:00
Roman Konoval
362cb00978 Adds notifies processing in pq_commit 2024-10-11 00:24:37 +02:00
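In practice this means a listening connection now picks up pending notifications while executing statements and committing, without an explicit poll(). A minimal sketch under that assumption; the connection strings and channel name are placeholders.

    import psycopg2

    listening = psycopg2.connect("dbname=test")
    with listening.cursor() as cur:
        cur.execute("LISTEN test_channel")
    listening.commit()

    # Another session sends a notification.
    other = psycopg2.connect("dbname=test")
    with other.cursor() as cur:
        cur.execute("NOTIFY test_channel, 'hello'")
    other.commit()

    # After the next command/commit on the listening connection, the
    # notification shows up in connection.notifies without calling poll().
    with listening.cursor() as cur:
        cur.execute("SELECT 1")
    listening.commit()
    for notify in listening.notifies:
        print(notify.pid, notify.channel, notify.payload)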
Daniele Varrazzo
eaeeb76944
Merge pull request #1729 from edgarrmondragon/1692-py313-wheels
Build Python 3.13 wheels, drop support for Python 3.7
2024-10-11 00:17:47 +02:00
Daniele Varrazzo
4987362fb4 ci(windows): drop Python 3.8 packages
The runner image used to build the 3.8 package doesn't seem to have a
currently supported database, and the previously used 9.6 is no longer
supported on current runners.
2024-10-10 15:48:48 +02:00
Daniele Varrazzo
8c9a35de38 ci: test with PostgreSQL 17 2024-10-09 19:46:48 +02:00
Daniele Varrazzo
563b55a725 docs: bump supported versions to Python 3.13 and Postgres 17 2024-10-08 17:08:02 +02:00
Daniele Varrazzo
dac8fa5632 ci(win): use PostgreSQL 13 for tests
Going by the latest errors, it seems that Postgres 9.6 is no longer supported
on the VS2019 image. According to the documentation, Postgres 13 is the most
recent supported database, and it is not available in the VS2015 image.
Therefore, drop the Python 3.8 test (and likely the build).

See https://www.appveyor.com/docs/services-databases/#postgresql
2024-10-08 17:04:40 +02:00
Edgar Ramírez-Mondragón
e1cf23d9c7
Drop Python 3.7 in other places 2024-10-05 01:41:20 -06:00
Edgar Ramírez-Mondragón
0eccfbec47
Ensure pg data dir exists 2024-10-05 01:35:47 -06:00
Edgar Ramírez-Mondragón
26f0f13b39
Use py executable in appveyor 2024-10-05 01:29:06 -06:00
Edgar Ramírez-Mondragón
a59079a4f2
Build Python 3.13 wheels 2024-10-04 22:40:03 -06:00
Anoosh Dsouza
f9780aa054 fixed a typo in doc/src/usage.rst file 2024-09-19 20:56:05 +02:00
0xTiger
658afe4cd9 docs: tiny grammar fix "a" -> "one" 2024-07-17 18:44:43 +02:00
Daniele Varrazzo
f79867c9f2 chore: bump to next dev version 2024-07-14 22:01:17 +02:00
Daniele Varrazzo
dc5249ba01
Merge pull request #1695 from befeleme/py3.13
Add support for Python 3.13
2024-07-14 21:58:10 +02:00
Daniele Varrazzo
7c2706a8b4 docs: note Python 3.13 support in news file 2024-07-14 21:57:27 +02:00
Karolina Surma
4a4b5acdc2 Declare the support for Python 3.13 in classifiers 2024-04-26 09:21:05 +02:00
Karolina Surma
efc5ad01e0 Add Python 3.13.0a6 to tox matrix 2024-04-26 09:21:05 +02:00
Karolina Surma
866bcef589 Add Python 3.13.0a6 to CI 2024-04-26 09:21:05 +02:00
Karolina Surma
3b9aa7cf9f Fix tests with Python 3.13
The textual representation of addresses has changed; adapt the code to
expect different values on Python 3.13+.
See: https://github.com/python/cpython/commit/f22bf8e3cf899896cf587099d292
2024-04-24 10:15:54 +02:00
Karolina Surma
829a7a2be9 _PyInterpreterState_Get() has become public in Python 3.13
Since 3.13.0a1 it has been renamed to PyInterpreterState_Get().
Source: https://github.com/python/cpython/pull/106321
2024-04-24 10:15:50 +02:00
Nick Zandbergen
a971c11d50 Update lobject_type.c
Add bytes as an accepted input in the documentation.
2024-02-15 22:26:05 +00:00
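For context, a minimal usage sketch of the documented behaviour (connection parameters are placeholders): lobject.write() accepts bytes as well as str.

    import psycopg2

    conn = psycopg2.connect("dbname=test")        # large objects need a transaction,
    lobj = conn.lobject(0, "wb")                  # which is psycopg2's default mode
    lobj.write(b"\x00\x01 binary payload")        # bytes input, as now documented
    lobj.close()
    conn.commit()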
dependabot[bot]
00870545b7 build(deps): bump actions/setup-python from 4 to 5
Bumps [actions/setup-python](https://github.com/actions/setup-python) from 4 to 5.
- [Release notes](https://github.com/actions/setup-python/releases)
- [Commits](https://github.com/actions/setup-python/compare/v4...v5)

---
updated-dependencies:
- dependency-name: actions/setup-python
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-02 03:27:05 +00:00
dependabot[bot]
bf45060074 build(deps): bump actions/upload-artifact from 3 to 4
Bumps [actions/upload-artifact](https://github.com/actions/upload-artifact) from 3 to 4.
- [Release notes](https://github.com/actions/upload-artifact/releases)
- [Commits](https://github.com/actions/upload-artifact/compare/v3...v4)

---
updated-dependencies:
- dependency-name: actions/upload-artifact
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <support@github.com>
2024-01-02 03:26:43 +00:00
Daniele Varrazzo
5fb59cd6ee Merge branch 'macos-arm64-py312' 2023-11-01 11:45:52 +01:00
Daniele Varrazzo
e0d1daf290 Merge branch 'wheel-312-win' 2023-11-01 11:45:35 +01:00
Rene Leonhardt
941ac9a724 chore: add support for Python 3.12 macOS arm64 wheels 2023-11-01 11:33:30 +01:00
Rene Leonhardt
4e473010a3
chore: let dependabot update GitHub actions 2023-10-30 09:19:50 +00:00
Rene Leonhardt
8947b00142 chore: update GitHub actions and Postgres image tags 2023-10-29 18:05:30 +00:00
Daniele Varrazzo
46191f1fde ci(windows): add Python 3.12 to the testing grid 2023-10-28 11:41:40 +02:00
Daniele Varrazzo
e73d2fa9f0 ci(win32): install the setuptools package to build in appveyor
Although present so far, it wasn't installed in the first image containing Python 3.12.
2023-10-28 01:57:56 +02:00
Daniele Varrazzo
89005ac5b8 docs: add README blurb pointing to psycopg 3 on PyPI
See #1632.
2023-10-10 23:35:40 +02:00
Panagiotis H.M. Issaris
bfdffc2c57
chore: show a Changelog link on PyPI 2023-10-06 15:05:42 +01:00
Daniele Varrazzo
ad5bee7054 chore: bump version number to 2.9.9 2023-10-03 11:39:35 +02:00
Daniele Varrazzo
37d1de1c8f chore: add support for Python 3.12 2023-10-03 11:39:35 +02:00
Daniele Varrazzo
abf2723c0a chore: drop support for Python 3.6 2023-10-03 11:39:35 +02:00
Daniele Varrazzo
2da65a715c chore: drop leftover Python 2.7 import aliases from setup.py 2023-10-03 11:39:35 +02:00
Daniele Varrazzo
3fa60fd268 chore: bump doc requirement complained by dependabot 2023-10-03 11:39:32 +02:00
Daniele Varrazzo
1c1484e43b ci: better interaction with scaleway build server 2023-10-03 11:39:32 +02:00
Daniele Varrazzo
c81cec604f chore: bump to next dev release 2023-10-03 11:20:17 +02:00
Daniele Varrazzo
7fe8cb77ca chore: bump docs requirements dependabot complains about 2023-09-28 09:29:21 +02:00
Daniele Varrazzo
b39d5d6492 chore: bundle libpq 16
- https://github.com/psycopg/psycopg/issues/650
- https://github.com/psycopg/psycopg/discussions/528
2023-09-28 09:26:33 +02:00
Daniele Varrazzo
921510d5be docs: replace "compiled against" with "bundled with" in news file
Less confrontational...
2023-09-28 09:22:08 +02:00
Daniele Varrazzo
999d7a6d01 test: skip ssl test if libpq runtime > 16
Close #1619
2023-09-11 16:11:07 +01:00
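The runtime libpq version is available through psycopg2.extensions.libpq_version(), which returns an integer such as 160004 for libpq 16.4. A minimal sketch of how such a skip can be expressed; the test name and the exact cutoff are illustrative, not the repository's actual code.

    import unittest
    from psycopg2 import extensions

    LIBPQ_VERSION = extensions.libpq_version()    # e.g. 160004 for libpq 16.4

    class SSLTests(unittest.TestCase):            # illustrative test case
        @unittest.skipIf(
            LIBPQ_VERSION >= 160000,
            "ssl attribute semantics changed in libpq >= 16")
        def test_ssl_attribute(self):
            ...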
Daniele Varrazzo
3eee3e336d ci: fix passing env vars to build scripts 2023-08-04 17:28:24 +01:00
Daniele Varrazzo
1e0086b1fe chore: bump version to 2.9.7 2023-08-04 17:22:46 +01:00
Daniele Varrazzo
4fe28d661a Merge branch 'dev/init-failure' 2023-08-04 17:20:50 +01:00
Daniele Varrazzo
14e06d8185 docs: mention module init errors fix in news file 2023-08-04 17:20:02 +01:00
Jacob Champion
959339cefb Return NULL on failed module initialization
Previously, any exceptions raised during initialization were swallowed
with a message like

    SystemError: initialization of _psycopg raised unreported exception

Fixes #1598.
2023-08-04 17:19:58 +01:00
Daniele Varrazzo
fb77bdca0b Merge branch 'dev/fix-meson-build' 2023-08-04 17:19:41 +01:00
Daniele Varrazzo
ef7053c070 docs: add pg_config improvement to news file 2023-08-04 17:18:59 +01:00
Jacob Champion
ea71fbcd46 setup.py: handle more corner cases for pg_config
- Differentiate between unexpected empty values and execution failure.
- Accept empty --cppflags and --ldflags output. Fixes #1599.
- Accept UTF-8 output from pg_config, for alternative client locales.
2023-08-04 17:18:56 +01:00
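A minimal sketch of that approach (not the repository's actual setup.py code): treat a non-zero exit status as a hard failure, accept empty output, and decode the output as UTF-8.

    import subprocess

    def pg_config(option, pg_config_exe="pg_config"):
        # e.g. pg_config("--cppflags"); empty output is legal for some options.
        try:
            out = subprocess.run(
                [pg_config_exe, option], capture_output=True, check=True)
        except (OSError, subprocess.CalledProcessError) as e:
            raise RuntimeError(f"{pg_config_exe} {option} failed: {e}")
        # pg_config may print non-ASCII paths in the client locale; assume UTF-8.
        return out.stdout.decode("utf-8").strip()

    cppflags = pg_config("--cppflags")   # may legitimately be ""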
Daniele Varrazzo
0c5b5f4ec3 chore: bump cibuildwheel version to 2.14.1 2023-08-04 17:18:17 +01:00
Daniele Varrazzo
20fcfd6786 chore: upgrade libpq and openssl versions used in packaging 2023-08-04 17:18:17 +01:00
Xing Guo
329f43c762 Use except psycopg2.Error as e in example. 2023-07-27 10:09:03 +01:00
Daniele Varrazzo
9f020124f8 docs: don't show objects in side bar
Too wide, too ugly, useless to navigate.

Close #1587
2023-06-07 12:09:12 +02:00
Christoph Berg
8b17e218be Disable test_ssl_attribute on PG16+
PG15 changed the semantics of some ssl attributes (#1506), and a very
similar regression test failure has now been observed again with PG16.
Disable the test for now.
2023-05-29 17:42:42 +02:00
Brent Wilkins
c96f991a8d Updated from deprecated license_file parameter 2023-05-25 19:01:27 +02:00
Amirsoroush
3450d159b5 fix typo in Usage.html page in documentation 2023-04-23 22:47:26 +02:00
Daniele Varrazzo
5108191aa5 chore: upgrade docs build dependencies 2023-04-17 20:11:47 +02:00
Daniele Varrazzo
638be85eb6 docs: drop use of print statement, use the print() function instead
Close #1556
2023-04-17 20:07:17 +02:00
Daniele Varrazzo
0b01ded426 ci: drop github download script
Easier to do interactively, now that all the artifacts are packaged in
the same archive.
2023-04-03 05:10:36 +02:00
Daniele Varrazzo
46238ba351 ci: fix cache key by setting lib versions in job env 2023-04-03 05:07:01 +02:00
Daniele Varrazzo
51dd59ef9d chore: drop Python 3.6 from Windows packages 2023-04-03 05:06:13 +02:00
Daniele Varrazzo
333b3b7ac4 ci: use cibuildwheel to build linux wheel packages 2023-04-02 17:56:29 +02:00
Daniele Varrazzo
7a8f4d6222 chore: bump version to 2.9.6 2023-04-02 13:00:38 +02:00
Daniele Varrazzo
b747b5b0fd ci: bundle all build artifacts in a single directory 2023-04-02 12:59:29 +02:00
Daniele Varrazzo
1781e8b2c9 build: package openssl 1.1.1t with binary packages 2023-04-02 12:47:29 +02:00
Daniele Varrazzo
fdb204b4e3 docs: mention manylinux2014 packages in news file 2023-04-02 12:47:25 +02:00
Daniele Varrazzo
09b82e4094 ci: bump qemu action version to drop node deprecation warning 2023-03-30 17:09:23 +02:00
Daniele Varrazzo
97df29a312 ci: build macOS packages using cibuildwheel
Close #1558.
2023-03-30 13:31:30 +02:00
Daniele Varrazzo
daeec37fab
Merge pull request #1545 from AmirBitaraf/aarch64_manylinux2014_libpq
Move to manylinux2014 for aarch64, ppc64le builds.
2023-03-27 17:11:28 +02:00
Amir Bitaraf
c0666b0935 Modify LD_LIBRARY_PATH to support all architectures 2023-03-26 18:35:00 +01:00
Amir Bitaraf
cc21faa4f4 Move to manylinux2014 for aarch64, ppc64le builds. 2023-03-26 18:35:00 +01:00
Daniele Varrazzo
63947e2552 ci: drop test on Python 3.6
The image is not available anymore
2023-02-25 16:07:21 +01:00
Daniele Varrazzo
52df8371f3 ci: pin tox to v3
Not interested in fixing incompatibility changes.
2023-02-25 16:05:14 +01:00
Daniele Varrazzo
feeb989323 docs: use https url in license file
Close #1549.
2023-02-25 15:36:35 +01:00
Daniele Varrazzo
e8d92b74fd Merge branch 'py311-win32' 2022-11-07 23:42:28 +01:00
Daniele Varrazzo
026b5bf3ab ci: re-enable builds suspended to build win32 packages for Python 3.11 2022-11-07 23:41:50 +01:00
Daniele Varrazzo
02b5e226f4 ci: build packages for Python 3.11 for Workgroup... for Windows! 2022-11-07 22:38:07 +01:00
Daniele Varrazzo
57009707b1 ci: Test Python 3.11 on Appveyor 2022-11-07 22:32:56 +01:00
Daniele Varrazzo
3182ea2303 ci: adapt macOS arm64 build script to changes in Python 3.11 and PostgreSQL 15 2022-10-27 00:41:09 +02:00
Daniele Varrazzo
ea32730a39 Merge branch 'build-macos-py311' 2022-10-27 00:40:23 +02:00
Daniele Varrazzo
deb00e5454 ci: re-enable builds suspended to build macOS packages for Python 3.11 2022-10-27 00:39:11 +02:00
Daniele Varrazzo
8c824d0e47 Build packages for macOS x86_64 Python 3.11
The required images weren't available at the time of building the other
packages. See #1514.

The changeset includes temporary changes to skip other builds. They will
be reverted before merging.
2022-10-27 00:28:04 +02:00
Daniele Varrazzo
1bf8e77ea2 chore: remove macOS 3.11 build from build grid
Not available yet on Github: see build failure at
https://github.com/psycopg/psycopg2/actions/runs/3320363567/jobs/5486654852
2022-10-25 13:34:29 +02:00
Daniele Varrazzo
af3ee06ec0 chore: upgrade Github action versions 2022-10-25 13:04:28 +02:00
Daniele Varrazzo
963fb1190b chore: fix yaml syntax in Github Action workflow 2022-10-25 12:59:23 +02:00
Daniele Varrazzo
27a99dac72 chore: bump version number to release 2.9.5 2022-10-25 12:55:57 +02:00
Daniele Varrazzo
78690cfaf8 lint: reformat appveyor yaml 2022-10-25 12:54:34 +02:00
Daniele Varrazzo
259d15ae3e chore: build binary packages with OpenSSL 1.1.1q 2022-10-25 12:47:46 +02:00
Daniele Varrazzo
77039cad63 chore: fix directory where to find binary package after build 2022-10-25 12:47:02 +02:00
Daniele Varrazzo
e6e465c509 chore: build binary packages using libpq from PostgreSQL 15
fix #1497 as a side effect of using libpq 15.
2022-10-25 12:47:02 +02:00
Daniele Varrazzo
12700a5f02 Build packages for Python 3.11 2022-10-25 12:37:23 +02:00
Daniele Varrazzo
271dd1fce7 chore: move cache_rebuild file into appveyor dir 2022-10-25 12:24:10 +02:00
Daniele Varrazzo
e4b2a197c6 chore: bump to next dev version 2022-10-25 12:20:46 +02:00
Daniele Varrazzo
20bb486663 Merge branch 'doc_examples_executemanybatch' 2022-10-20 21:33:46 +02:00
Daniele Varrazzo
f401d0b738 docs: fix reST syntax and whitespace in executemany examples 2022-10-20 21:31:18 +02:00
Ion Alberdi
4912be0e7f [test_basic_types] Add test for array[%s] on NULL arrays
Add test to verify the fix for #1507.
2022-10-11 13:02:22 +01:00
Hannes
aabac5df31
Add executemany & execute_batch examples 2022-10-10 19:08:46 +02:00
Daniele Varrazzo
a12dbc4357 docs: fix typos in release notes 2022-10-06 03:58:50 +01:00
Daniele Varrazzo
bc82c8f9cc fix: set default SYSCONFDIR to the quasi-standard /etc/postgresql-common
Fix #1365.
2022-10-06 03:49:25 +01:00
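This affects connections made through a service definition. A minimal sketch, with a hypothetical service name and file; PGSYSCONFDIR remains the standard libpq way to point at a different directory.

    # Hypothetical /etc/postgresql-common/pg_service.conf:
    #
    #   [mydb]
    #   host=db.example.com
    #   port=5432
    #   dbname=app

    import psycopg2

    conn = psycopg2.connect(service="mydb")   # resolved through pg_service.conf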
Daniele Varrazzo
c38aa27d7d chore: bump version number to release 2.9.4 2022-10-06 03:32:17 +01:00
Daniele Varrazzo
bd96594e2d docs: add link to release notes page to find which Python version is supported
Close #1418.
2022-10-06 03:27:40 +01:00
Daniele Varrazzo
182a51a33f chore: upgrade packaged libpq version and dependencies
appveyor.cache_rebuild reformatted for greppability.
2022-10-06 03:27:40 +01:00
Daniele Varrazzo
76b703e910 Merge branch 'pg15' 2022-10-06 02:59:28 +01:00
Daniele Varrazzo
29a65f756c chore: upgrade error codes to PostgreSQL 15 2022-10-06 02:26:09 +01:00
Daniele Varrazzo
6d815f5df9 test: adapt ssl test to libpq 15
See #1506, PostgreSQL bug 17625
(https://www.postgresql.org/message-id/17625-fc47c78b7d71b534%40postgresql.org)
2022-10-06 02:09:19 +01:00
Daniele Varrazzo
c7326f8da7 test: add PostgreSQL 15 to the test grid 2022-10-06 02:09:19 +01:00
Daniele Varrazzo
68d786b610 Merge branch 'fix-1487' 2022-10-06 02:09:06 +01:00
Daniele Varrazzo
7054e1aadf test: add test to verify register_range() with names requiring escape
Unlike for register_composite(), this works already.
2022-10-06 02:05:49 +01:00
Daniele Varrazzo
ac25d3bdc0 fix: look up range types defined in schemas in the search path 2022-10-06 02:05:39 +01:00
Daniele Varrazzo
9535462ce9 fix: correctly handle composites with names or schema requiring escape 2022-10-06 01:56:28 +01:00
Daniele Varrazzo
d88e4c2a3c fix: handle types in the search path in register_composite()
Fix #1487.
2022-10-06 01:10:07 +01:00
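Basic usage of register_composite(), for reference; the fixes above extend it to types found through the search_path and to names or schemas requiring escape. A minimal sketch with a hypothetical type; connection parameters are placeholders.

    import psycopg2
    from psycopg2.extras import register_composite

    conn = psycopg2.connect("dbname=test")
    with conn.cursor() as cur:
        cur.execute("CREATE TYPE inventory_item AS (name text, price numeric)")
    conn.commit()

    register_composite("inventory_item", conn)
    with conn.cursor() as cur:
        cur.execute("SELECT ('widget', 9.99)::inventory_item")
        item = cur.fetchone()[0]
        print(item.name, item.price)          # values come back as a named tuple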
Daniele Varrazzo
31a80410db chore: bump to next dev release 2022-10-06 00:21:27 +01:00
Daniele Varrazzo
d6c81b4ff0 docs: mention MacOS ARM wheel support 2022-10-05 19:48:04 +01:00
Magnus Watn
c6f30880a2 Remove Apple M1 bullet point from issue template
https://github.com/psycopg/psycopg2/issues/1286 is now closed.
2022-10-03 10:09:46 +01:00
Daniele Varrazzo
e3664380c4 build: fix starting Postgres in macOS build script
The brew command fails with:

    Could not enable service: 125: Domain does not support specified action
    Error: Failure while executing; `/bin/launchctl enable gui/501/homebrew.mxcl.postgresql@14` exited with 125.
2022-09-25 02:49:16 +01:00
Daniele Varrazzo
fdf957dcbd build: use "latest" version of github builders 2022-09-25 02:46:21 +01:00
Nikita Sobolev
3e7bb8d1aa Remove __nonzero__ method 2022-07-30 14:03:10 +02:00
Tim Tisdall
07c83ef8bb Link to the right PR for adding alpine wheels 2022-07-28 14:39:33 +02:00
Daniele Varrazzo
f07b3ad0a6 Merge branch 'build-macos-arm64' 2022-07-28 13:30:32 +02:00
Daniele Varrazzo
611c610041 docs: fixed quote_ident() example
Close #1481
2022-07-27 02:54:17 +02:00
Daniele Varrazzo
25c40f8ac3 build: add scripts to build macOS arm64 packages 2022-07-17 00:20:08 +01:00
Daniele Varrazzo
ba92a22bc9 test: drop test table if it exists
It might be a residue of a psycopg 3 test run in the same db.
2022-07-16 23:58:43 +01:00
Rafi Shamim
3c58e96e10 Unskip tests that work on CockroachDB v22.1
CockroachDB supports named cursors in v22.1, so more tests pass.
2022-03-28 20:26:23 +02:00
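The tests in question exercise psycopg2's named (server-side) cursors. A minimal usage sketch; connection parameters and the query are placeholders.

    import psycopg2

    conn = psycopg2.connect("dbname=test")
    # A named cursor is server-side: rows are fetched in batches rather than
    # being transferred to the client all at once.
    with conn.cursor(name="side_cursor") as cur:
        cur.itersize = 1000                   # rows per network round trip
        cur.execute("SELECT generate_series(1, 10000)")
        for (n,) in cur:
            pass
    conn.commit()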
Daniele Varrazzo
626078388a Use pip-tools to create the requirement file to build the docs
The docs build just broke. The requirements file had some version upper
bounds that caused problems between Sphinx and jinja2.
2022-03-26 02:45:40 +01:00
75 changed files with 1560 additions and 1956 deletions

View File

@ -1,88 +0,0 @@
version : 2.x.{build}
clone_folder: C:\Project
# We use the configuration to specify the package name
configuration:
- psycopg2
- psycopg2-binary
environment:
matrix:
# For Python versions available on Appveyor, see
# https://www.appveyor.com/docs/windows-images-software/#python
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "310", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "310", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "39", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "39", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "38", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "38", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "37", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "37", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "36", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "36", PY_ARCH: "64"}
WORKFLOW: packages
OPENSSL_VERSION: "1_1_1l"
POSTGRES_VERSION: "14_1"
PSYCOPG2_TESTDB: psycopg2_test
PSYCOPG2_TESTDB_USER: postgres
PSYCOPG2_TESTDB_HOST: localhost
PGUSER: postgres
PGPASSWORD: Password12!
PGSSLMODE: require
# Add CWD to perl library path for PostgreSQL build on VS2019
PERL5LIB: .
# Select according to the service enabled
POSTGRES_DIR: C:\Program Files\PostgreSQL\9.6\
# The python used in the build process, not the one packages are built for
PYEXE: C:\Python36\python.exe
matrix:
fast_finish: false
services:
# Note: if you change this service also change POSTGRES_DIR
- postgresql96
cache:
# Rebuild cache if following file changes
# (See the file to zap the cache manually)
- C:\Others -> scripts\build\appveyor.cache_rebuild
# Script called before repo cloning
# init:
# Repository gets cloned, Cache is restored
install:
- "%PYEXE% scripts\\build\\appveyor.py install"
# PostgreSQL server starts now
build: off
build_script:
- "%PYEXE% scripts\\build\\appveyor.py build_script"
after_build:
- "%PYEXE% scripts\\build\\appveyor.py after_build"
before_test:
- "%PYEXE% scripts\\build\\appveyor.py before_test"
test_script:
- "%PYEXE% scripts\\build\\appveyor.py test_script"
artifacts:
- path: dist\psycopg2-*\*.whl
name: wheel
# vim: set ts=4 sts=4 sw=4:

View File

@ -1,79 +0,0 @@
version : 2.x.{build}
clone_folder: C:\Project
environment:
matrix:
# For Python versions available on Appveyor, see
# https://www.appveyor.com/docs/windows-images-software/#python
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "310", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "310", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "39", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2019, PY_VER: "39", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "38", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "38", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "37", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "37", PY_ARCH: "64"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "36", PY_ARCH: "32"}
- {APPVEYOR_BUILD_WORKER_IMAGE: Visual Studio 2015, PY_VER: "36", PY_ARCH: "64"}
WORKFLOW: tests
OPENSSL_VERSION: "1_1_1l"
POSTGRES_VERSION: "14_1"
PSYCOPG2_TESTDB: psycopg2_test
PSYCOPG2_TESTDB_USER: postgres
PSYCOPG2_TESTDB_HOST: localhost
PGUSER: postgres
PGPASSWORD: Password12!
PGSSLMODE: require
# Add CWD to perl library path for PostgreSQL build on VS2019
PERL5LIB: .
# Select according to the service enabled
POSTGRES_DIR: C:\Program Files\PostgreSQL\9.6\
# The python used in the build process, not the one packages are built for
PYEXE: C:\Python36\python.exe
matrix:
fast_finish: false
services:
# Note: if you change this service also change POSTGRES_DIR
- postgresql96
cache:
# Rebuild cache if following file changes
# (See the file to zap the cache manually)
- C:\Others -> scripts\build\appveyor.cache_rebuild
# Script called before repo cloning
# init:
# Repository gets cloned, Cache is restored
install:
- "%PYEXE% scripts\\build\\appveyor.py install"
# PostgreSQL server starts now
build: off
build_script:
- "%PYEXE% scripts\\build\\appveyor.py build_script"
after_build:
- "%PYEXE% scripts\\build\\appveyor.py after_build"
before_test:
- "%PYEXE% scripts\\build\\appveyor.py before_test"
test_script:
- "%PYEXE% scripts\\build\\appveyor.py test_script"
# vim: set ts=4 sts=4 sw=4:

View File

@ -13,7 +13,6 @@ If you have a question, such has "how do you do X with Python/PostgreSQL/psycopg
**Before opening this ticket, please confirm that:**
- [ ] I am running the latest version of pip, i.e. typing ``pip --version`` you get [this version](https://pypi.org/project/pip/).
- [ ] I have read the [installation documentation](https://www.psycopg.org/docs/install.html) and the [frequently asked questions](https://www.psycopg.org/docs/faq.html)
- [ ] I am not trying to install `psycopg2-binary` on an Apple M1, for which [binary packages are not supported](https://github.com/psycopg/psycopg2/issues/1286)
- [ ] If install failed, I typed `pg_config` on the command line and I obtained an output instead of an error.
**Please complete the following information:**

6
.github/dependabot.yml vendored Normal file
View File

@ -0,0 +1,6 @@
version: 2
updates:
- package-ecosystem: "github-actions"
directory: "/"
schedule:
interval: "monthly"

View File

@ -11,7 +11,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Trigger docs build
uses: peter-evans/repository-dispatch@v1
uses: peter-evans/repository-dispatch@v3
with:
repository: psycopg/psycopg-website
event-type: psycopg2-commit

View File

@ -3,9 +3,14 @@ name: Build packages
on:
- workflow_dispatch
env:
PIP_BREAK_SYSTEM_PACKAGES: "1"
LIBPQ_VERSION: "16.0"
OPENSSL_VERSION: "1.1.1w"
jobs:
build-sdist:
sdist: # {{{
if: true
strategy:
fail-fast: false
matrix:
@ -13,10 +18,10 @@ jobs:
- package_name: psycopg2
- package_name: psycopg2-binary
runs-on: ubuntu-20.04
runs-on: ubuntu-latest
steps:
- name: Checkout repos
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Build sdist
run: ./scripts/build/build_sdist.sh
@ -24,11 +29,11 @@ jobs:
PACKAGE_NAME: ${{ matrix.package_name }}
- name: Upload artifacts
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: packages_sdist
name: sdist-${{ matrix.package_name }}
path: |
dist/*/*.tar.gz
dist/*.tar.gz
env:
PSYCOPG2_TESTDB: postgres
@ -39,7 +44,7 @@ jobs:
services:
postgresql:
image: postgres:13
image: postgres:16
env:
POSTGRES_PASSWORD: password
ports:
@ -51,55 +56,68 @@ jobs:
--health-timeout 5s
--health-retries 5
# }}}
linux: # {{{
if: true
build-manylinux:
strategy:
fail-fast: false
matrix:
include:
- {tag: manylinux2014, arch: x86_64}
- {tag: manylinux2014, arch: i686}
- {tag: manylinux_2_24, arch: aarch64}
- {tag: manylinux_2_24, arch: ppc64le}
- {tag: musllinux_1_1, arch: x86_64}
- {tag: musllinux_1_1, arch: i686}
- {tag: musllinux_1_1, arch: aarch64}
- {tag: musllinux_1_1, arch: ppc64le}
platform: [manylinux, musllinux]
arch: [x86_64, i686, aarch64, ppc64le]
pyver: [cp38, cp39, cp310, cp311, cp312, cp313]
runs-on: ubuntu-20.04
runs-on: ubuntu-latest
steps:
- name: Checkout repos
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up QEMU for multi-arch build
uses: docker/setup-qemu-action@v1
uses: docker/setup-qemu-action@v3
- name: Build packages
run: >-
docker run --rm
-e PLAT=${{ matrix.tag }}_${{ matrix.arch }}
-e PACKAGE_NAME=psycopg2-binary
-e PYVERS="cp36-cp36m cp37-cp37m cp38-cp38 cp39-cp39 cp310-cp310"
-e PSYCOPG2_TESTDB=postgres
-e PSYCOPG2_TESTDB_HOST=172.17.0.1
-e PSYCOPG2_TESTDB_USER=postgres
-e PSYCOPG2_TESTDB_PASSWORD=password
-e PSYCOPG2_TEST_FAST=1
-v `pwd`:/src
--workdir /src
quay.io/pypa/${{ matrix.tag }}_${{ matrix.arch }}
./scripts/build/build_${{ matrix.tag }}.sh
- name: Upload artifacts
uses: actions/upload-artifact@v2
- name: Cache libpq build
uses: actions/cache@v4
with:
name: packages_${{ matrix.tag }}_${{ matrix.arch }}
path: |
dist/*/*${{ matrix.tag }}_${{ matrix.arch }}.whl
path: /tmp/libpq.build
key: libpq-${{ env.LIBPQ_VERSION }}-${{ matrix.platform }}-${{ matrix.arch }}
- name: Build wheels
uses: pypa/cibuildwheel@v2.23.2
env:
CIBW_MANYLINUX_X86_64_IMAGE: manylinux2014
CIBW_MANYLINUX_I686_IMAGE: manylinux2014
CIBW_MANYLINUX_AARCH64_IMAGE: manylinux2014
CIBW_MANYLINUX_PPC64LE_IMAGE: manylinux2014
CIBW_BUILD: ${{matrix.pyver}}-${{matrix.platform}}_${{matrix.arch}}
CIBW_ARCHS_LINUX: auto aarch64 ppc64le
CIBW_BEFORE_ALL_LINUX: ./scripts/build/wheel_linux_before_all.sh
CIBW_REPAIR_WHEEL_COMMAND: >-
./scripts/build/strip_wheel.sh {wheel}
&& auditwheel repair -w {dest_dir} {wheel}
CIBW_TEST_COMMAND: >-
export PYTHONPATH={project} &&
python -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
CIBW_ENVIRONMENT_PASS_LINUX: LIBPQ_VERSION OPENSSL_VERSION
CIBW_ENVIRONMENT: >-
PACKAGE_NAME=psycopg2-binary
LIBPQ_BUILD_PREFIX=/host/tmp/libpq.build
PATH="$LIBPQ_BUILD_PREFIX/bin:$PATH"
LD_LIBRARY_PATH="$LIBPQ_BUILD_PREFIX/lib:$LIBPQ_BUILD_PREFIX/lib64"
PSYCOPG2_TESTDB=postgres
PSYCOPG2_TESTDB_HOST=172.17.0.1
PSYCOPG2_TESTDB_USER=postgres
PSYCOPG2_TESTDB_PASSWORD=password
PSYCOPG2_TEST_FAST=1
- uses: actions/upload-artifact@v4
with:
name: linux-${{matrix.pyver}}-${{matrix.platform}}_${{matrix.arch}}
path: ./wheelhouse/*.whl
services:
postgresql:
image: postgres:13
image: postgres:16
env:
POSTGRES_PASSWORD: password
ports:
@ -111,33 +129,138 @@ jobs:
--health-timeout 5s
--health-retries 5
# }}}
macos: # {{{
runs-on: macos-latest
if: true
build-macos:
runs-on: macos-10.15
strategy:
fail-fast: false
matrix:
python-version: ['3.6', '3.7', '3.8', '3.9', '3.10']
# These archs require an Apple M1 runner: [arm64, universal2]
arch: [x86_64, arm64]
pyver: [cp39, cp310, cp311, cp312, cp313]
steps:
- name: Checkout repos
uses: actions/checkout@v2
uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v2
- name: Cache libpq build
uses: actions/cache@v4
with:
python-version: ${{ matrix.python-version }}
path: /tmp/libpq.build
key: libpq-${{ env.LIBPQ_VERSION }}-macos-${{ matrix.arch }}
- name: Build packages
run: ./scripts/build/build_macos.sh
- name: Build wheels
uses: pypa/cibuildwheel@v2.23.2
env:
PACKAGE_NAME: psycopg2-binary
PSYCOPG2_TESTDB: postgres
PSYCOPG2_TEST_FAST: 1
CIBW_BUILD: ${{matrix.pyver}}-macosx_${{matrix.arch}}
CIBW_ARCHS_MACOS: ${{matrix.arch}}
MACOSX_ARCHITECTURE: ${{matrix.arch}}
CIBW_BEFORE_ALL_MACOS: ./scripts/build/wheel_macos_before_all.sh
CIBW_TEST_COMMAND: >-
export PYTHONPATH={project} &&
python -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
CIBW_ENVIRONMENT: >-
PG_VERSION=16
PACKAGE_NAME=psycopg2-binary
PSYCOPG2_TESTDB=postgres
PATH="/tmp/libpq.build/bin:$PATH"
- name: Upload artifacts
uses: actions/upload-artifact@v2
uses: actions/upload-artifact@v4
with:
name: packages_macos
path: |
dist/*/*${{ matrix.platform }}.whl
name: macos-${{matrix.pyver}}-macos-${{matrix.arch}}
path: ./wheelhouse/*.whl
# }}}
windows: # {{{
runs-on: windows-latest
if: true
strategy:
fail-fast: false
matrix:
arch: [win_amd64]
pyver: [cp38, cp39, cp310, cp311, cp312, cp313]
package_name: [psycopg2, psycopg2-binary]
defaults:
run:
shell: bash
steps:
# there are some other libpq in PATH
- name: Drop spurious libpq in the path
run: rm -rf c:/tools/php C:/Strawberry/c/bin
- name: Checkout repo
uses: actions/checkout@v4
- name: Start PostgreSQL service for test
run: |
$PgSvc = Get-Service "postgresql*"
Set-Service $PgSvc.Name -StartupType manual
$PgSvc.Start()
shell: powershell
- name: Export GitHub Actions cache environment variables
uses: actions/github-script@v7
with:
script: |
const path = require('path')
core.exportVariable('ACTIONS_CACHE_URL', process.env.ACTIONS_CACHE_URL || '');
core.exportVariable('ACTIONS_RUNTIME_TOKEN', process.env.ACTIONS_RUNTIME_TOKEN || '');
core.addPath(path.join(process.env.VCPKG_INSTALLATION_ROOT, 'installed/x64-windows-release/lib'));
core.addPath(path.join(process.env.VCPKG_INSTALLATION_ROOT, 'installed/x64-windows-release/bin'));
- name: Create the binary package source tree
run: >-
sed -i 's/^setup(name="psycopg2"/setup(name="${{matrix.package_name}}"/'
setup.py
if: ${{ matrix.package_name != 'psycopg2' }}
- name: Build wheels
uses: pypa/cibuildwheel@v2.23.2
env:
VCPKG_BINARY_SOURCES: "clear;x-gha,readwrite" # cache vcpkg
CIBW_BUILD: ${{matrix.pyver}}-${{matrix.arch}}
CIBW_ARCHS_WINDOWS: AMD64 x86
CIBW_BEFORE_BUILD_WINDOWS: '.\scripts\build\wheel_win32_before_build.bat'
CIBW_REPAIR_WHEEL_COMMAND_WINDOWS: >-
delvewheel repair -w {dest_dir}
--no-mangle "libiconv-2.dll;libwinpthread-1.dll" {wheel}
CIBW_TEST_COMMAND: >-
set PYTHONPATH={project} &&
python -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
# Note: no fast test because we don't run Windows tests
CIBW_ENVIRONMENT_WINDOWS: >-
PSYCOPG2_TESTDB=postgres
PSYCOPG2_TESTDB_USER=postgres
PSYCOPG2_TESTDB_HOST=localhost
- name: Upload artifacts
uses: actions/upload-artifact@v4
with:
name: windows-${{ matrix.package_name }}-${{matrix.pyver}}-${{matrix.arch}}
path: ./wheelhouse/*.whl
# }}}
merge: # {{{
runs-on: ubuntu-latest
needs:
- sdist
- linux
- macos
- windows
steps:
- name: Merge Artifacts
uses: actions/upload-artifact/merge@v4
with:
name: psycopg2-artifacts
delete-merged: true
# }}}

View File

@ -1,32 +1,35 @@
name: Tests
env:
PIP_BREAK_SYSTEM_PACKAGES: "1"
on:
push:
pull_request:
jobs:
tests:
name: Unit tests run
runs-on: ubuntu-20.04
linux:
runs-on: ubuntu-latest
if: true
strategy:
fail-fast: false
matrix:
include:
- {python: "3.6", postgres: "10"}
- {python: "3.7", postgres: "11"}
- {python: "3.8", postgres: "12"}
- {python: "3.9", postgres: "13"}
- {python: "3.10", postgres: "14"}
- {python: "3.11-dev", postgres: "14"}
- {python: "3.11", postgres: "15"}
- {python: "3.12", postgres: "16"}
- {python: "3.13", postgres: "17"}
# Opposite extremes of the supported Py/PG range, other architecture
- {python: "3.6", postgres: "14", architecture: "x86"}
- {python: "3.7", postgres: "13", architecture: "x86"}
- {python: "3.8", postgres: "12", architecture: "x86"}
- {python: "3.9", postgres: "11", architecture: "x86"}
- {python: "3.10", postgres: "10", architecture: "x86"}
- {python: "3.11-dev", postgres: "10", architecture: "x86"}
- {python: "3.8", postgres: "17", architecture: "x86"}
- {python: "3.9", postgres: "16", architecture: "x86"}
- {python: "3.10", postgres: "15", architecture: "x86"}
- {python: "3.11", postgres: "14", architecture: "x86"}
- {python: "3.12", postgres: "13", architecture: "x86"}
- {python: "3.13", postgres: "12", architecture: "x86"}
env:
PSYCOPG2_TESTDB: postgres
@ -49,10 +52,24 @@ jobs:
--health-retries 5
steps:
- uses: actions/checkout@v2
- uses: actions/checkout@v4
# Can enable to test an unreleased libpq version.
- name: install libpq 16
if: false
run: |
set -x
rel=$(lsb_release -c -s)
echo "deb http://apt.postgresql.org/pub/repos/apt ${rel}-pgdg main 16" \
| sudo tee -a /etc/apt/sources.list.d/pgdg.list
sudo apt-get -qq update
pqver=$(apt-cache show libpq5 | grep ^Version: | head -1 \
| awk '{print $2}')
sudo apt-get -qq -y install "libpq-dev=${pqver}" "libpq5=${pqver}"
- name: Install tox
run: pip install tox
- uses: actions/setup-python@v2
run: pip install "tox < 4"
- uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python }}
- name: Run tests

3
.gitignore vendored
View File

@ -6,7 +6,7 @@ MANIFEST
*.sw[po]
*.egg-info/
dist/*
build/*
/build
env
env?
.idea
@ -15,3 +15,4 @@ env?
/rel
/wheels
/packages
/wheelhouse

93
NEWS
View File

@ -1,10 +1,75 @@
Current release
---------------
What's new in psycopg 2.9.10
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Add support for Python 3.13.
- Receive notifications on commit (:ticket:`#1728`).
- `~psycopg2.errorcodes` map and `~psycopg2.errors` classes updated to
PostgreSQL 17.
- Drop support for Python 3.7.
What's new in psycopg 2.9.9
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Add support for Python 3.12.
- Drop support for Python 3.6.
What's new in psycopg 2.9.8
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Wheel package bundled with PostgreSQL 16 libpq in order to add support for
recent features, such as ``sslcertmode``.
What's new in psycopg 2.9.7
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Fix propagation of exceptions raised during module initialization
(:ticket:`#1598`).
- Fix building when pg_config returns an empty string (:ticket:`#1599`).
- Wheel package bundled with OpenSSL 1.1.1v.
What's new in psycopg 2.9.6
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Package manylinux 2014 for aarch64 and ppc64le platforms, in order to
include libpq 15 in the binary package (:ticket:`#1396`).
- Wheel package bundled with OpenSSL 1.1.1t.
What's new in psycopg 2.9.5
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Add support for Python 3.11.
- Add support for rowcount in MERGE statements in binary packages
(:ticket:`#1497`).
- Wheel package bundled with OpenSSL 1.1.1r and PostgreSQL 15 libpq.
What's new in psycopg 2.9.4
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Fix `~psycopg2.extras.register_composite()`,
`~psycopg2.extras.register_range()` with customized :sql:`search_path`
(:ticket:`#1487`).
- Handle correctly composite types with names or in schemas requiring escape.
- Find ``pg_service.conf`` file in the ``/etc/postgresql-common`` directory in
binary packages (:ticket:`#1365`).
- `~psycopg2.errorcodes` map and `~psycopg2.errors` classes updated to
PostgreSQL 15.
- Wheel package bundled with OpenSSL 1.1.1q and PostgreSQL 14.4 libpq.
What's new in psycopg 2.9.3
^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Alpine (musl) wheels now available (:ticket:`#1148`).
- Alpine (musl) wheels now available (:ticket:`#1392`).
- macOS arm64 (Apple M1) wheels now available (:ticket:`1482`).
What's new in psycopg 2.9.2
@ -14,14 +79,14 @@ What's new in psycopg 2.9.2
- `~psycopg2.errorcodes` map and `~psycopg2.errors` classes updated to
PostgreSQL 14.
- Add preliminary support for Python 3.11 (:tickets:`#1376, #1386`).
- Wheel package compiled against OpenSSL 1.1.1l and PostgreSQL 14.1
- Wheel package bundled with OpenSSL 1.1.1l and PostgreSQL 14.1 libpq
(:ticket:`#1388`).
What's new in psycopg 2.9.1
^^^^^^^^^^^^^^^^^^^^^^^^^^^
Fix regression with named `sql.Placeholder` (:ticket:`#1291`).
- Fix regression with named `~psycopg2.sql.Placeholder` (:ticket:`#1291`).
What's new in psycopg 2.9
@ -51,7 +116,7 @@ Other changes:
platforms.
- Provide :pep:`600` wheels packages (manylinux_2_24 tag) for aarch64 and
ppc64le platforms.
- Wheel package compiled against OpenSSL 1.1.1k and PostgreSQL 13.3.
- Wheel package bundled with OpenSSL 1.1.1k and PostgreSQL 13.3 libpq.
- Build system for Linux/MacOS binary packages moved to GitHub Actions.
@ -75,7 +140,7 @@ What's new in psycopg 2.8.6
- `~psycopg2.errorcodes` map and `~psycopg2.errors` classes updated to
PostgreSQL 13.
- Added wheel packages for ARM architecture (:ticket:`#1125`).
- Wheel package compiled against OpenSSL 1.1.1g.
- Wheel package bundled with OpenSSL 1.1.1g.
What's new in psycopg 2.8.5
@ -104,7 +169,7 @@ What's new in psycopg 2.8.4
and `~psycopg2.extensions.Column.type_code` (:ticket:`#961`).
- `~psycopg2.errorcodes` map and `~psycopg2.errors` classes updated to
PostgreSQL 12.
- Wheel package compiled against OpenSSL 1.1.1d and PostgreSQL at least 11.4.
- Wheel package bundled with OpenSSL 1.1.1d and PostgreSQL at least 11.4.
What's new in psycopg 2.8.3
@ -193,7 +258,7 @@ Other changes:
source files are now compatible with Python 2 & 3 as is.
- The `!psycopg2.test` package is no longer installed by ``python setup.py
install``.
- Wheel package compiled against OpenSSL 1.0.2r and PostgreSQL 11.2 libpq.
- Wheel package bundled with OpenSSL 1.0.2r and PostgreSQL 11.2 libpq.
What's new in psycopg 2.7.7
@ -201,14 +266,14 @@ What's new in psycopg 2.7.7
- Cleanup of the cursor results assignment code, which might have solved
double free and inconsistencies in concurrent usage (:tickets:`#346, #384`).
- Wheel package compiled against OpenSSL 1.0.2q.
- Wheel package bundled with OpenSSL 1.0.2q.
What's new in psycopg 2.7.6.1
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Fixed binary package broken on OS X 10.12 (:ticket:`#807`).
- Wheel package compiled against PostgreSQL 11.1 libpq.
- Wheel package bundled with PostgreSQL 11.1 libpq.
What's new in psycopg 2.7.6
@ -225,7 +290,7 @@ What's new in psycopg 2.7.6
- `~psycopg2.extras.execute_values()` accepts `~psycopg2.sql.Composable`
objects (:ticket:`#794`).
- `~psycopg2.errorcodes` map updated to PostgreSQL 11.
- Wheel package compiled against PostgreSQL 10.5 libpq and OpenSSL 1.0.2p.
- Wheel package bundled with PostgreSQL 10.5 libpq and OpenSSL 1.0.2p.
What's new in psycopg 2.7.5
@ -239,7 +304,7 @@ What's new in psycopg 2.7.5
- Maybe fixed building on MSYS2 (as reported in :ticket:`#658`).
- Allow string subclasses in connection and other places (:ticket:`#679`).
- Don't raise an exception closing an unused named cursor (:ticket:`#716`).
- Wheel package compiled against PostgreSQL 10.4 libpq and OpenSSL 1.0.2o.
- Wheel package bundled with PostgreSQL 10.4 libpq and OpenSSL 1.0.2o.
What's new in psycopg 2.7.4
@ -261,7 +326,7 @@ What's new in psycopg 2.7.4
- Fixed `~cursor.rowcount` after `~cursor.executemany()` with :sql:`RETURNING`
statements (:ticket:`#633`).
- Fixed compatibility problem with pypy3 (:ticket:`#649`).
- Wheel packages compiled against PostgreSQL 10.1 libpq and OpenSSL 1.0.2n.
- Wheel packages bundled with PostgreSQL 10.1 libpq and OpenSSL 1.0.2n.
- Wheel packages for Python 2.6 no more available (support dropped from
wheel building infrastructure).
@ -269,7 +334,7 @@ What's new in psycopg 2.7.4
What's new in psycopg 2.7.3.2
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
- Wheel package compiled against PostgreSQL 10.0 libpq and OpenSSL 1.0.2l
- Wheel package bundled with PostgreSQL 10.0 libpq and OpenSSL 1.0.2l
(:tickets:`#601, #602`).
@ -342,7 +407,7 @@ New features:
them together.
- Added `~psycopg2.__libpq_version__` and
`~psycopg2.extensions.libpq_version()` to inspect the version of the
``libpq`` library the module was compiled/loaded with
``libpq`` library the module was bundled with
(:tickets:`#35, #323`).
- The attributes `~connection.notices` and `~connection.notifies` can be
customized replacing them with any object exposing an `!append()` method

View File

@ -17,6 +17,18 @@ flexible objects adaptation system.
Psycopg 2 is both Unicode and Python 3 friendly.
.. Note::
The psycopg2 package is still widely used and actively maintained, but it
is not expected to receive new features.
`Psycopg 3`__ is the evolution of psycopg2 and is where `new features are
being developed`__: if you are starting a new project you should probably
start from 3!
.. __: https://pypi.org/project/psycopg/
.. __: https://www.psycopg.org/psycopg3/docs/index.html
Documentation
-------------
@ -61,13 +73,8 @@ production it is advised to use the package built from sources.
.. _install: https://www.psycopg.org/docs/install.html#install-from-source
.. _faq: https://www.psycopg.org/docs/faq.html#faq-compile
:Linux/OSX: |gh-actions|
:Windows: |appveyor|
:Build status: |gh-actions|
.. |gh-actions| image:: https://github.com/psycopg/psycopg2/actions/workflows/tests.yml/badge.svg
:target: https://github.com/psycopg/psycopg2/actions/workflows/tests.yml
:alt: Linux and OSX build status
.. |appveyor| image:: https://ci.appveyor.com/api/projects/status/github/psycopg/psycopg2?branch=master&svg=true
:target: https://ci.appveyor.com/project/psycopg/psycopg2/branch/master
:alt: Windows build status
:alt: Build status

View File

@ -1,7 +1,7 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.

View File

@ -8,7 +8,7 @@ check: doctest
# It is not clean by 'make clean'
PYTHON := python$(PYTHON_VERSION)
PYTHON_VERSION ?= $(shell $(PYTHON) -c 'import sys; print ("%d.%d" % sys.version_info[:2])')
PYTHON_VERSION ?= $(shell $(PYTHON) -c 'import sys; print("%d.%d" % sys.version_info[:2])')
BUILD_DIR = $(shell pwd)/../build/lib.$(PYTHON_VERSION)
SPHINXBUILD ?= $$(pwd)/env/bin/sphinx-build

View File

@ -16,10 +16,9 @@ How to make a psycopg2 release
$ export VERSION=2.8.4
- Push psycopg2 to master or to the maint branch. Make sure tests on `GitHub
Actions`__ and AppVeyor__ pass.
Actions`__.
.. __: https://github.com/psycopg/psycopg2/actions/workflows/tests.yml
.. __: https://ci.appveyor.com/project/psycopg/psycopg2
- Create a signed tag with the content of the relevant NEWS bit and push it.
E.g.::
@ -41,22 +40,14 @@ How to make a psycopg2 release
- On GitHub Actions run manually a `package build workflow`__.
- On Appveyor change the `build settings`__ and replace the custom
configuration file name from ``.appveyor/tests.yml`` to
``.appveyor/packages.yml`` (yeah, that sucks a bit. Remember to put it
back to testing).
.. __: https://github.com/psycopg/psycopg2/actions/workflows/packages.yml
.. __: https://ci.appveyor.com/project/psycopg/psycopg2/settings
- When the workflows have finished download the packages using the
``download_packages_{github|appveyor}.py`` scripts from the
``scripts/build`` directory. They will be saved in a
``packages/psycopg2-${VERSION}`` directory.
- When the workflows have finished download the packages from the job
artifacts.
- Only for stable packages: upload the signed packages on PyPI::
$ twine upload -s packages/psycopg2-${VERSION}/*
$ twine upload -s wheelhouse/psycopg2-${VERSION}/*
- Create a release and release notes in the psycopg website, announce to
psycopg and pgsql-announce mailing lists.
@ -69,7 +60,7 @@ Releasing test packages
Test packages may be uploaded on the `PyPI testing site`__ using::
$ twine upload -s -r testpypi packages/psycopg2-${VERSION}/*
$ twine upload -s -r testpypi wheelhouse/psycopg2-${VERSION}/*
assuming `proper configuration`__ of ``~/.pypirc``.

2
doc/requirements.in Normal file
View File

@ -0,0 +1,2 @@
Sphinx
sphinx-better-theme

View File

@ -1,8 +1,50 @@
# Packages only needed to build the docs
Pygments>=2.2,<2.3
Sphinx>=1.6,<=1.7
sphinx-better-theme>=0.1.5,<0.2
# 0.15.2 affected by https://sourceforge.net/p/docutils/bugs/353/
# Can update to 0.16 after release (currently in rc) but must update Sphinx too
docutils<0.15
#
# This file is autogenerated by pip-compile with Python 3.10
# by the following command:
#
# pip-compile requirements.in
#
alabaster==0.7.13
# via sphinx
babel==2.12.1
# via sphinx
certifi>=2023.7.22
# via requests
charset-normalizer==3.1.0
# via requests
docutils==0.19
# via sphinx
idna==3.4
# via requests
imagesize==1.4.1
# via sphinx
jinja2==3.1.2
# via sphinx
markupsafe==2.1.2
# via jinja2
packaging==23.1
# via sphinx
pygments==2.15.0
# via sphinx
requests==2.31.0
# via sphinx
snowballstemmer==2.2.0
# via sphinx
sphinx==6.1.3
# via -r requirements.in
sphinx-better-theme==0.1.5
# via -r requirements.in
sphinxcontrib-applehelp==1.0.4
# via sphinx
sphinxcontrib-devhelp==1.0.2
# via sphinx
sphinxcontrib-htmlhelp==2.0.1
# via sphinx
sphinxcontrib-jsmath==1.0.1
# via sphinx
sphinxcontrib-qthelp==1.0.3
# via sphinx
sphinxcontrib-serializinghtml==1.1.5
# via sphinx
urllib3==1.26.17
# via requests

View File

@ -226,7 +226,7 @@ read:
>>> cur.execute("SELECT '(10.2,20.3)'::point")
>>> point = cur.fetchone()[0]
>>> print type(point), point.x, point.y
>>> print(type(point), point.x, point.y)
<class 'Point'> 10.2 20.3
A typecaster created by `!new_type()` can be also used with
@ -284,15 +284,15 @@ something to read::
curs = conn.cursor()
curs.execute("LISTEN test;")
print "Waiting for notifications on channel 'test'"
print("Waiting for notifications on channel 'test'")
while True:
if select.select([conn],[],[],5) == ([],[],[]):
print "Timeout"
print("Timeout")
else:
conn.poll()
while conn.notifies:
notify = conn.notifies.pop(0)
print "Got NOTIFY:", notify.pid, notify.channel, notify.payload
print("Got NOTIFY:", notify.pid, notify.channel, notify.payload)
Running the script and executing a command such as :sql:`NOTIFY test, 'hello'`
in a separate :program:`psql` shell, the output may look similar to:

View File

@ -255,6 +255,7 @@ latex_documents = [
# If false, no module index is generated.
# latex_use_modindex = True
toc_object_entries = False
doctest_global_setup = """

View File

@ -208,6 +208,14 @@ The ``cursor`` class
Parameters are bounded to the query using the same rules described in
the `~cursor.execute()` method.
.. code:: python
>>> nums = ((1,), (5,), (10,))
>>> cur.executemany("INSERT INTO test (num) VALUES (%s)", nums)
>>> tuples = ((123, "foo"), (42, "bar"), (23, "baz"))
>>> cur.executemany("INSERT INTO test (num, data) VALUES (%s, %s)", tuples)
.. warning::
In its current implementation this method is not faster than
executing `~cursor.execute()` in a loop. For better performance
@ -284,7 +292,7 @@ The ``cursor`` class
>>> cur.execute("SELECT * FROM test;")
>>> for record in cur:
... print record
... print(record)
...
(1, 100, "abc'def")
(2, None, 'dada')

View File

@ -50,7 +50,7 @@ An example of the available constants defined in the module:
'42P01'
Constants representing all the error values defined by PostgreSQL versions
between 8.1 and 13 are included in the module.
between 8.1 and 15 are included in the module.
.. autofunction:: lookup(code)

View File

@ -14,11 +14,17 @@
.. versionchanged:: 2.8.6 added errors introduced in PostgreSQL 13
.. versionchanged:: 2.9.2 added errors introduced in PostgreSQL 14
.. versionchanged:: 2.9.4 added errors introduced in PostgreSQL 15
.. versionchanged:: 2.9.10 added errors introduced in PostgreSQL 17
This module exposes the classes psycopg raises upon receiving an error from
the database with a :sql:`SQLSTATE` value attached (available in the
`~psycopg2.Error.pgcode` attribute). The content of the module is generated
from the PostgreSQL source code and includes classes for every error defined
by PostgreSQL in versions between 9.1 and 13.
by PostgreSQL in versions between 9.1 and 15.
Every class in the module is named after what referred as "condition name" `in
the documentation`__, converted to CamelCase: e.g. the error 22012,

View File

@ -1029,6 +1029,14 @@ parameters. By reducing the number of server roundtrips the performance can be
.. autofunction:: execute_batch
.. code:: python
>>> nums = ((1,), (5,), (10,))
>>> execute_batch(cur, "INSERT INTO test (num) VALUES (%s)", nums)
>>> tuples = ((123, "foo"), (42, "bar"), (23, "baz"))
>>> execute_batch(cur, "INSERT INTO test (num, data) VALUES (%s, %s)", tuples)
.. versionadded:: 2.7
.. note::

View File

@ -131,10 +131,17 @@ The current `!psycopg2` implementation supports:
..
NOTE: keep consistent with setup.py and the /features/ page.
- Python versions from 3.6 to 3.10
- PostgreSQL server versions from 7.4 to 14
- Python versions from 3.8 to 3.13
- PostgreSQL server versions from 7.4 to 17
- PostgreSQL client library version from 9.1
.. note::
Not all the psycopg2 versions support all the supported Python versions.
Please see the :ref:`release notes <news>` to verify when the support for
a new Python version was added and when the support for an old Python
version was removed.
.. _build-prerequisites:

View File

@ -168,7 +168,7 @@ available through the following exceptions:
>>> e.pgcode
'42P01'
>>> print e.pgerror
>>> print(e.pgerror)
ERROR: relation "barf" does not exist
LINE 1: SELECT * FROM barf
^
@ -184,7 +184,7 @@ available through the following exceptions:
>>> try:
... cur.execute("SELECT * FROM barf")
... except psycopg2.Error, e:
... except psycopg2.Error as e:
... pass
>>> e.diag.severity

View File

@ -2,6 +2,8 @@
single: Release notes
single: News
.. _news:
Release notes
=============

View File

@ -33,7 +33,7 @@ name should be escaped using `~psycopg2.extensions.quote_ident()`::
# This works, but it is not optimal
table_name = 'my_table'
cur.execute(
"insert into %s values (%%s, %%s)" % ext.quote_ident(table_name),
"insert into %s values (%%s, %%s)" % ext.quote_ident(table_name, cur),
[10, 20])
This is now safe, but it somewhat ad-hoc. In case, for some reason, it is

View File

@ -407,7 +407,7 @@ defined on the database connection (the `PostgreSQL encoding`__, available in
`connection.encoding`, is translated into a `Python encoding`__ using the
`~psycopg2.extensions.encodings` mapping)::
>>> print u, type(u)
>>> print(u, type(u))
àèìòù€ <type 'unicode'>
>>> cur.execute("INSERT INTO test (num, data) VALUES (%s,%s);", (74, u))
@ -418,19 +418,19 @@ defined on the database connection (the `PostgreSQL encoding`__, available in
When reading data from the database, in Python 2 the strings returned are
usually 8 bit `!str` objects encoded in the database client encoding::
>>> print conn.encoding
>>> print(conn.encoding)
UTF8
>>> cur.execute("SELECT data FROM test WHERE num = 74")
>>> x = cur.fetchone()[0]
>>> print x, type(x), repr(x)
>>> print(x, type(x), repr(x))
àèìòù€ <type 'str'> '\xc3\xa0\xc3\xa8\xc3\xac\xc3\xb2\xc3\xb9\xe2\x82\xac'
>>> conn.set_client_encoding('LATIN9')
>>> cur.execute("SELECT data FROM test WHERE num = 74")
>>> x = cur.fetchone()[0]
>>> print type(x), repr(x)
>>> print(type(x), repr(x))
<type 'str'> '\xe0\xe8\xec\xf2\xf9\xa4'
In Python 3 instead the strings are automatically *decoded* in the connection
@ -442,7 +442,7 @@ In Python 2 you must register a :ref:`typecaster
>>> cur.execute("SELECT data FROM test WHERE num = 74")
>>> x = cur.fetchone()[0]
>>> print x, type(x), repr(x)
>>> print(x, type(x), repr(x))
àèìòù€ <type 'unicode'> u'\xe0\xe8\xec\xf2\xf9\u20ac'
In the above example, the `~psycopg2.extensions.UNICODE` typecaster is
@ -814,7 +814,7 @@ is rolled back.
When a cursor exits the ``with`` block it is closed, releasing any resource
eventually associated with it. The state of the transaction is not affected.
A connection can be used in more than a ``with`` statement
A connection can be used in more than one ``with`` statement
and each ``with`` block is effectively wrapped in a separate transaction::
conn = psycopg2.connect(DSN)
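# Hedged sketch of a possible continuation (not part of the original excerpt):
# the same connection reused in two separate ``with`` blocks, each wrapped in
# its own transaction.  The ``test`` table is assumed for illustration.
with conn:
    with conn.cursor() as curs:
        curs.execute("INSERT INTO test (num) VALUES (1)")

with conn:
    with conn.cursor() as curs:
        curs.execute("INSERT INTO test (num) VALUES (2)")

# Leaving the ``with`` block ends the transaction but does not close the
# connection, so it must be closed explicitly.
conn.close()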
@ -860,7 +860,7 @@ Server side cursors
When a database query is executed, the Psycopg `cursor` usually fetches
all the records returned by the backend, transferring them to the client
process. If the query returned an huge amount of data, a proportionally large
process. If the query returns a huge amount of data, a proportionally large
amount of memory will be allocated by the client.
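A hedged sketch of the alternative discussed below, a *named* (server-side) cursor that streams the result set instead of loading it all at once (table and connection parameters assumed)::

    import psycopg2

    conn = psycopg2.connect("dbname=test")
    with conn:
        with conn.cursor(name="big_query") as cur:   # named = server-side cursor
            cur.itersize = 2000                      # rows fetched per round trip
            cur.execute("SELECT * FROM big_table")
            for row in cur:                          # rows arrive in batches
                pass                                 # ... process each row ...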
If the dataset is too large to be practically handled on the client side, it is
@ -1053,7 +1053,7 @@ using the |lo_import|_ and |lo_export|_ libpq functions.
If Psycopg was built with 64 bits large objects support (i.e. the first
two conditions above are verified), the `psycopg2.__version__` constant
will contain the ``lo64`` flag. If any of the contition is not met
will contain the ``lo64`` flag. If any of the conditions is not met
several `!lobject` methods will fail if the arguments exceed 2GB.
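A minimal run-time check for the flag (sketch; the exact version string depends on the build)::

    import psycopg2

    # e.g. "2.9.10 (dt dec pq3 ext lo64)"
    have_lo64 = "lo64" in psycopg2.__version__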

View File

@ -143,10 +143,6 @@ class Range:
def __bool__(self):
return self._bounds is not None
def __nonzero__(self):
# Python 2 compatibility
return type(self).__bool__(self)
def __eq__(self, other):
if not isinstance(other, Range):
return False
@ -367,33 +363,54 @@ class RangeCaster:
schema = 'public'
# get the type oid and attributes
try:
curs.execute("""\
select rngtypid, rngsubtype,
(select typarray from pg_type where oid = rngtypid)
curs.execute("""\
select rngtypid, rngsubtype, typarray
from pg_range r
join pg_type t on t.oid = rngtypid
join pg_namespace ns on ns.oid = typnamespace
where typname = %s and ns.nspname = %s;
""", (tname, schema))
rec = curs.fetchone()
except ProgrammingError:
if not conn.autocommit:
conn.rollback()
raise
else:
rec = curs.fetchone()
if not rec:
# The above algorithm doesn't work for customized search_path
# (#1487) The implementation below works better, but, to guarantee
# backwards compatibility, use it only if the original one failed.
try:
savepoint = False
# Because we executed statements earlier, we are either INTRANS
# or we are IDLE only if the transaction is autocommit, in
# which case we don't need the savepoint anyway.
if conn.status == STATUS_IN_TRANSACTION:
curs.execute("SAVEPOINT register_type")
savepoint = True
# revert the status of the connection as before the command
if (conn_status != STATUS_IN_TRANSACTION
and not conn.autocommit):
conn.rollback()
curs.execute("""\
SELECT rngtypid, rngsubtype, typarray, typname, nspname
from pg_range r
join pg_type t on t.oid = rngtypid
join pg_namespace ns on ns.oid = typnamespace
WHERE t.oid = %s::regtype
""", (name, ))
except ProgrammingError:
pass
else:
rec = curs.fetchone()
if rec:
tname, schema = rec[3:]
finally:
if savepoint:
curs.execute("ROLLBACK TO SAVEPOINT register_type")
# revert the status of the connection as before the command
if conn_status != STATUS_IN_TRANSACTION and not conn.autocommit:
conn.rollback()
if not rec:
raise ProgrammingError(
f"PostgreSQL type '{name}' not found")
f"PostgreSQL range '{name}' not found")
type, subtype, array = rec
type, subtype, array = rec[:3]
return RangeCaster(name, pyrange,
oid=type, subtype_oid=subtype, array_oid=array)
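Not part of the diff: for orientation, a hedged sketch of the public entry point that reaches this code, `psycopg2.extras.register_range()` (the range type and connection parameters are assumed):

    import psycopg2
    from psycopg2.extras import register_range

    conn = psycopg2.connect("dbname=test")
    with conn, conn.cursor() as cur:
        # assumes the type exists: CREATE TYPE floatrange AS RANGE (subtype = float8)
        caster = register_range("floatrange", "FloatRange", cur)
        cur.execute("SELECT '[1.5, 2.5]'::floatrange")
        print(cur.fetchone()[0])   # an instance of the generated Range subclass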

View File

@ -223,6 +223,7 @@ SQL_JSON_OBJECT_NOT_FOUND = '2203C'
TOO_MANY_JSON_ARRAY_ELEMENTS = '2203D'
TOO_MANY_JSON_OBJECT_MEMBERS = '2203E'
SQL_JSON_SCALAR_REQUIRED = '2203F'
SQL_JSON_ITEM_CANNOT_BE_CAST_TO_TARGET_TYPE = '2203G'
FLOATING_POINT_EXCEPTION = '22P01'
INVALID_TEXT_REPRESENTATION = '22P02'
INVALID_BINARY_REPRESENTATION = '22P03'
@ -255,6 +256,7 @@ HELD_CURSOR_REQUIRES_SAME_ISOLATION_LEVEL = '25008'
NO_ACTIVE_SQL_TRANSACTION = '25P01'
IN_FAILED_SQL_TRANSACTION = '25P02'
IDLE_IN_TRANSACTION_SESSION_TIMEOUT = '25P03'
TRANSACTION_TIMEOUT = '25P04'
# Class 26 - Invalid SQL Statement Name
INVALID_SQL_STATEMENT_NAME = '26000'
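Not part of the diff: a hedged sketch of how these constants are typically consulted (connection parameters assumed):

    import psycopg2
    from psycopg2 import errorcodes

    conn = psycopg2.connect("dbname=test")
    cur = conn.cursor()
    try:
        cur.execute("SELECT 1/0")
    except psycopg2.Error as e:
        assert e.pgcode == errorcodes.DIVISION_BY_ZERO   # '22012'
        print(errorcodes.lookup(e.pgcode))               # 'DIVISION_BY_ZERO'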

View File

@ -357,10 +357,6 @@ class NamedTupleCursor(_cursor):
except StopIteration:
return
# ascii except alnum and underscore
_re_clean = _re.compile(
'[' + _re.escape(' !"#$%&\'()*+,-./:;<=>?@[\\]^`{|}~') + ']')
def _make_nt(self):
key = tuple(d[0] for d in self.description) if self.description else ()
return self._cached_make_nt(key)
@ -369,7 +365,7 @@ class NamedTupleCursor(_cursor):
def _do_make_nt(cls, key):
fields = []
for s in key:
s = cls._re_clean.sub('_', s)
s = _re_clean.sub('_', s)
# Python identifier cannot start with numbers, namedtuple fields
# cannot start with underscore. So...
if s[0] == '_' or '0' <= s[0] <= '9':
@ -1061,6 +1057,7 @@ class CompositeCaster:
return rv
def _create_type(self, name, attnames):
name = _re_clean.sub('_', name)
self.type = namedtuple(name, attnames)
self._ctor = self.type._make
@ -1098,9 +1095,41 @@ ORDER BY attnum;
recs = curs.fetchall()
if not recs:
# The above algorithm doesn't work for customized search_path
# (#1487) The implementation below works better, but, to guarantee
# backwards compatibility, use it only if the original one failed.
try:
savepoint = False
# Because we executed statements earlier, we are either INTRANS
# or we are IDLE only if the transaction is autocommit, in
# which case we don't need the savepoint anyway.
if conn.status == _ext.STATUS_IN_TRANSACTION:
curs.execute("SAVEPOINT register_type")
savepoint = True
curs.execute("""\
SELECT t.oid, %s, attname, atttypid, typname, nspname
FROM pg_type t
JOIN pg_namespace ns ON typnamespace = ns.oid
JOIN pg_attribute a ON attrelid = typrelid
WHERE t.oid = %%s::regtype
AND attnum > 0 AND NOT attisdropped
ORDER BY attnum;
""" % typarray, (name, ))
except psycopg2.ProgrammingError:
pass
else:
recs = curs.fetchall()
if recs:
tname = recs[0][4]
schema = recs[0][5]
finally:
if savepoint:
curs.execute("ROLLBACK TO SAVEPOINT register_type")
# revert the status of the connection as before the command
if (conn_status != _ext.STATUS_IN_TRANSACTION
and not conn.autocommit):
if conn_status != _ext.STATUS_IN_TRANSACTION and not conn.autocommit:
conn.rollback()
if not recs:
@ -1304,3 +1333,8 @@ def _split_sql(sql):
raise ValueError("the query doesn't contain any '%s' placeholder")
return pre, post
# ascii except alnum and underscore
_re_clean = _re.compile(
'[' + _re.escape(' !"#$%&\'()*+,-./:;<=>?@[\\]^`{|}~') + ']')
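Not part of the diff: a hedged usage sketch of `NamedTupleCursor`, one of the classes whose field names the module-level regexp above cleans up (connection parameters assumed):

    import psycopg2
    from psycopg2.extras import NamedTupleCursor

    conn = psycopg2.connect("dbname=test")
    with conn, conn.cursor(cursor_factory=NamedTupleCursor) as cur:
        cur.execute("""SELECT 1 AS num, 'foo' AS "data!" """)
        row = cur.fetchone()
        print(row.num, row.data_)   # "data!" becomes the valid field name "data_"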

View File

@ -1344,6 +1344,11 @@ conn_set_session(connectionObject *self, int autocommit,
}
}
Py_BLOCK_THREADS;
conn_notifies_process(self);
conn_notice_process(self);
Py_UNBLOCK_THREADS;
if (autocommit != SRV_STATE_UNCHANGED) {
self->autocommit = autocommit;
}
@ -1408,6 +1413,11 @@ conn_set_client_encoding(connectionObject *self, const char *pgenc)
goto endlock;
}
Py_BLOCK_THREADS;
conn_notifies_process(self);
conn_notice_process(self);
Py_UNBLOCK_THREADS;
endlock:
pthread_mutex_unlock(&self->lock);
Py_END_ALLOW_THREADS;

View File

@ -64,7 +64,7 @@ psyco_lobj_close(lobjectObject *self, PyObject *args)
/* write method - write data to the lobject */
#define psyco_lobj_write_doc \
"write(str) -- Write a string to the large object."
"write(str | bytes) -- Write a string or bytes to the large object."
static PyObject *
psyco_lobj_write(lobjectObject *self, PyObject *args)

View File

@ -412,6 +412,7 @@ pq_commit(connectionObject *conn)
}
Py_BLOCK_THREADS;
conn_notifies_process(conn);
conn_notice_process(conn);
Py_UNBLOCK_THREADS;
@ -468,6 +469,7 @@ pq_abort(connectionObject *conn)
retvalue = pq_abort_locked(conn, &_save);
Py_BLOCK_THREADS;
conn_notifies_process(conn);
conn_notice_process(conn);
Py_UNBLOCK_THREADS;
@ -538,6 +540,7 @@ pq_reset(connectionObject *conn)
Py_BLOCK_THREADS;
conn_notice_process(conn);
conn_notifies_process(conn);
Py_UNBLOCK_THREADS;
pthread_mutex_unlock(&conn->lock);

View File

@ -27,6 +27,7 @@
#ifndef PSYCOPG_H
#define PSYCOPG_H 1
#include <pg_config.h>
#if PG_VERSION_NUM < 90100
#error "Psycopg requires PostgreSQL client library (libpq) >= 9.1"
#endif

View File

@ -1001,32 +1001,35 @@ INIT_MODULE(_psycopg)(void)
/* initialize types and objects not exposed to the module */
Py_SET_TYPE(&typecastType, &PyType_Type);
if (0 > PyType_Ready(&typecastType)) { goto exit; }
if (0 > PyType_Ready(&typecastType)) { goto error; }
Py_SET_TYPE(&chunkType, &PyType_Type);
if (0 > PyType_Ready(&chunkType)) { goto exit; }
if (0 > PyType_Ready(&chunkType)) { goto error; }
Py_SET_TYPE(&errorType, &PyType_Type);
errorType.tp_base = (PyTypeObject *)PyExc_StandardError;
if (0 > PyType_Ready(&errorType)) { goto exit; }
if (0 > PyType_Ready(&errorType)) { goto error; }
if (!(psyco_null = Bytes_FromString("NULL"))) { goto exit; }
if (!(psyco_null = Bytes_FromString("NULL"))) { goto error; }
/* initialize the module */
module = PyModule_Create(&psycopgmodule);
if (!module) { goto exit; }
if (!module) { goto error; }
if (0 > add_module_constants(module)) { goto exit; }
if (0 > add_module_types(module)) { goto exit; }
if (0 > datetime_init()) { goto exit; }
if (0 > encodings_init(module)) { goto exit; }
if (0 > typecast_init(module)) { goto exit; }
if (0 > adapters_init(module)) { goto exit; }
if (0 > basic_errors_init(module)) { goto exit; }
if (0 > sqlstate_errors_init(module)) { goto exit; }
if (0 > add_module_constants(module)) { goto error; }
if (0 > add_module_types(module)) { goto error; }
if (0 > datetime_init()) { goto error; }
if (0 > encodings_init(module)) { goto error; }
if (0 > typecast_init(module)) { goto error; }
if (0 > adapters_init(module)) { goto error; }
if (0 > basic_errors_init(module)) { goto error; }
if (0 > sqlstate_errors_init(module)) { goto error; }
Dprintf("psycopgmodule: module initialization complete");
exit:
return module;
error:
if (module)
Py_DECREF(module);
return NULL;
}

View File

@ -27,8 +27,8 @@
#ifndef PSYCOPG_PYTHON_H
#define PSYCOPG_PYTHON_H 1
#if PY_VERSION_HEX < 0x03060000
#error "psycopg requires Python 3.6"
#if PY_VERSION_HEX < 0x03080000
#error "psycopg requires Python 3.8"
#endif
#include <structmember.h>

View File

@ -111,6 +111,7 @@
{"2203D", "TooManyJsonArrayElements"},
{"2203E", "TooManyJsonObjectMembers"},
{"2203F", "SqlJsonScalarRequired"},
{"2203G", "SqlJsonItemCannotBeCastToTargetType"},
{"22P01", "FloatingPointException"},
{"22P02", "InvalidTextRepresentation"},
{"22P03", "InvalidBinaryRepresentation"},
@ -143,6 +144,7 @@
{"25P01", "NoActiveSqlTransaction"},
{"25P02", "InFailedSqlTransaction"},
{"25P03", "IdleInTransactionSessionTimeout"},
{"25P04", "TransactionTimeout"},
/* Class 26 - Invalid SQL Statement Name */
{"26000", "InvalidSqlStatementName"},

View File

@ -103,18 +103,8 @@ _parse_inftz(const char *str, PyObject *curs)
goto exit;
}
#if defined(PYPY_VERSION) || PY_VERSION_HEX < 0x03070000
{
PyObject *tzoff;
if (!(tzoff = PyDelta_FromDSU(0, 0, 0))) { goto exit; }
tzinfo = PyObject_CallFunctionObjArgs(tzinfo_factory, tzoff, NULL);
Py_DECREF(tzoff);
if (!tzinfo) { goto exit; }
}
#else
tzinfo = PyDateTime_TimeZone_UTC;
Py_INCREF(tzinfo);
#endif
/* m.replace(tzinfo=tzinfo) */
if (!(args = PyTuple_New(0))) { goto exit; }
@ -178,11 +168,6 @@ _parse_noninftz(const char *str, Py_ssize_t len, PyObject *curs)
appropriate tzinfo object calling the factory */
Dprintf("typecast_PYDATETIMETZ_cast: UTC offset = %ds", tzsec);
#if PY_VERSION_HEX < 0x03070000
/* Before Python 3.7 the timezone offset had to be a whole number
* of minutes, so round the seconds to the closest minute */
tzsec = 60 * (int)round(tzsec / 60.0);
#endif
if (!(tzoff = PyDelta_FromDSU(0, tzsec, 0))) { goto exit; }
if (!(tzinfo = PyObject_CallFunctionObjArgs(
tzinfo_factory, tzoff, NULL))) {
@ -270,11 +255,6 @@ typecast_PYTIME_cast(const char *str, Py_ssize_t len, PyObject *curs)
appropriate tzinfo object calling the factory */
Dprintf("typecast_PYTIME_cast: UTC offset = %ds", tzsec);
#if PY_VERSION_HEX < 0x03070000
/* Before Python 3.7 the timezone offset had to be a whole number
* of minutes, so round the seconds to the closest minute */
tzsec = 60 * (int)round(tzsec / 60.0);
#endif
if (!(tzoff = PyDelta_FromDSU(0, tzsec, 0))) { goto exit; }
if (!(tzinfo = PyObject_CallFunctionObjArgs(tzinfo_factory, tzoff, NULL))) {
goto exit;

View File

@ -392,7 +392,10 @@ psyco_set_error(PyObject *exc, cursorObject *curs, const char *msg)
static int
psyco_is_main_interp(void)
{
#if PY_VERSION_HEX >= 0x03080000
#if PY_VERSION_HEX >= 0x030d0000
/* tested with Python 3.13.0a6 */
return PyInterpreterState_Get() == PyInterpreterState_Main();
#elif PY_VERSION_HEX >= 0x03080000
/* tested with Python 3.8.0a2 */
return _PyInterpreterState_Get() == PyInterpreterState_Main();
#else

View File

@ -1,22 +0,0 @@
This file is a simple placeholder for forcing the appveyor build cache
to invalidate itself since appveyor.yml changes more frequently than
the cache needs updating. Note, the versions list here can be
different than what is indicated in appveyor.yml.
To invalidate the cache, update this file and check it into git.
Currently used modules built in the cache:
OpenSSL
Version: 1.1.1l
PostgreSQL
Version: 14.1
NOTE: to zap the cache manually you can also use:
curl -X DELETE -H "Authorization: Bearer $APPVEYOR_TOKEN" -H "Content-Type: application/json" https://ci.appveyor.com/api/projects/psycopg/psycopg2/buildcache
with the token from https://ci.appveyor.com/api-token

View File

@ -1,845 +0,0 @@
#!/usr/bin/env python3
"""
Build steps for the windows binary packages.
The script is designed to be called by appveyor. Subcommands map the steps in
'appveyor.yml'.
"""
import re
import os
import sys
import json
import shutil
import logging
import subprocess as sp
from glob import glob
from pathlib import Path
from zipfile import ZipFile
from argparse import ArgumentParser
from tempfile import NamedTemporaryFile
from urllib.request import urlopen
opt = None
STEP_PREFIX = 'step_'
logger = logging.getLogger()
logging.basicConfig(
level=logging.INFO, format='%(asctime)s %(levelname)s %(message)s'
)
def main():
global opt
opt = parse_cmdline()
logger.setLevel(opt.loglevel)
cmd = globals()[STEP_PREFIX + opt.step]
cmd()
def setup_build_env():
"""
Set the environment variables according to the build environment
"""
setenv('VS_VER', opt.vs_ver)
path = [
str(opt.py_dir),
str(opt.py_dir / 'Scripts'),
r'C:\Strawberry\Perl\bin',
r'C:\Program Files\Git\mingw64\bin',
str(opt.ssl_build_dir / 'bin'),
os.environ['PATH'],
]
setenv('PATH', os.pathsep.join(path))
logger.info("Configuring compiler")
bat_call([opt.vc_dir / "vcvarsall.bat", 'x86' if opt.arch_32 else 'amd64'])
def python_info():
logger.info("Python Information")
run_python(['--version'], stderr=sp.STDOUT)
run_python(
['-c', "import sys; print('64bit: %s' % (sys.maxsize > 2**32))"]
)
def step_install():
python_info()
configure_sdk()
configure_postgres()
if opt.is_wheel:
install_wheel_support()
def install_wheel_support():
"""
Install an up-to-date pip wheel package to build wheels.
"""
run_python("-m pip install --upgrade pip".split())
run_python("-m pip install wheel".split())
def configure_sdk():
# The program rc.exe on 64bit with some versions looks in the wrong path
# location when building postgresql. This cheats by copying the x64 bit
# files to that location.
if opt.arch_64:
for fn in glob(
r'C:\Program Files\Microsoft SDKs\Windows\v7.0\Bin\x64\rc*'
):
copy_file(
fn, r"C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\Bin"
)
def configure_postgres():
"""
Set up PostgreSQL config before the service starts.
"""
logger.info("Configuring Postgres")
with (opt.pg_data_dir / 'postgresql.conf').open('a') as f:
# allow > 1 prepared transactions for test cases
print("max_prepared_transactions = 10", file=f)
print("ssl = on", file=f)
# Create openssl certificate to allow ssl connection
cwd = os.getcwd()
os.chdir(opt.pg_data_dir)
run_openssl(
'req -new -x509 -days 365 -nodes -text '
'-out server.crt -keyout server.key -subj /CN=initd.org'.split()
)
run_openssl(
'req -new -nodes -text -out root.csr -keyout root.key '
'-subj /CN=initd.org'.split()
)
run_openssl(
'x509 -req -in root.csr -text -days 3650 -extensions v3_ca '
'-signkey root.key -out root.crt'.split()
)
run_openssl(
'req -new -nodes -text -out server.csr -keyout server.key '
'-subj /CN=initd.org'.split()
)
run_openssl(
'x509 -req -in server.csr -text -days 365 -CA root.crt '
'-CAkey root.key -CAcreateserial -out server.crt'.split()
)
os.chdir(cwd)
def run_openssl(args):
"""Run the appveyor-installed openssl with some args."""
# https://www.appveyor.com/docs/windows-images-software/
openssl = Path(r"C:\OpenSSL-v111-Win64") / 'bin' / 'openssl'
return run_command([openssl] + args)
def step_build_script():
setup_build_env()
build_openssl()
build_libpq()
build_psycopg()
if opt.is_wheel:
build_binary_packages()
def build_openssl():
top = opt.ssl_build_dir
if (top / 'lib' / 'libssl.lib').exists():
return
logger.info("Building OpenSSL")
# Setup directories for building OpenSSL libraries
ensure_dir(top / 'include' / 'openssl')
ensure_dir(top / 'lib')
# Setup OpenSSL Environment Variables based on processor architecture
if opt.arch_32:
target = 'VC-WIN32'
setenv('VCVARS_PLATFORM', 'x86')
else:
target = 'VC-WIN64A'
setenv('VCVARS_PLATFORM', 'amd64')
setenv('CPU', 'AMD64')
ver = os.environ['OPENSSL_VERSION']
# Download OpenSSL source
zipname = f'OpenSSL_{ver}.zip'
zipfile = opt.cache_dir / zipname
if not zipfile.exists():
download(
f"https://github.com/openssl/openssl/archive/{zipname}", zipfile
)
with ZipFile(zipfile) as z:
z.extractall(path=opt.build_dir)
sslbuild = opt.build_dir / f"openssl-OpenSSL_{ver}"
os.chdir(sslbuild)
run_command(
['perl', 'Configure', target, 'no-asm']
+ ['no-shared', 'no-zlib', f'--prefix={top}', f'--openssldir={top}']
)
run_command("nmake build_libs install_sw".split())
assert (top / 'lib' / 'libssl.lib').exists()
os.chdir(opt.clone_dir)
shutil.rmtree(sslbuild)
def build_libpq():
top = opt.pg_build_dir
if (top / 'lib' / 'libpq.lib').exists():
return
logger.info("Building libpq")
# Setup directories for building PostgreSQL libraries
ensure_dir(top / 'include')
ensure_dir(top / 'lib')
ensure_dir(top / 'bin')
ver = os.environ['POSTGRES_VERSION']
# Download PostgreSQL source
zipname = f'postgres-REL_{ver}.zip'
zipfile = opt.cache_dir / zipname
if not zipfile.exists():
download(
f"https://github.com/postgres/postgres/archive/REL_{ver}.zip",
zipfile,
)
with ZipFile(zipfile) as z:
z.extractall(path=opt.build_dir)
pgbuild = opt.build_dir / f"postgres-REL_{ver}"
os.chdir(pgbuild)
# Setup build config file (config.pl)
os.chdir("src/tools/msvc")
with open("config.pl", 'w') as f:
print(
"""\
$config->{ldap} = 0;
$config->{openssl} = "%s";
1;
"""
% str(opt.ssl_build_dir).replace('\\', '\\\\'),
file=f,
)
# Hack the Mkvcbuild.pm file so we build the lib version of libpq
file_replace('Mkvcbuild.pm', "'libpq', 'dll'", "'libpq', 'lib'")
# Build libpgport, libpgcommon, libpq
run_command([which("build"), "libpgport"])
run_command([which("build"), "libpgcommon"])
run_command([which("build"), "libpq"])
# Install includes
with (pgbuild / "src/backend/parser/gram.h").open("w") as f:
print("", file=f)
# Copy over built libraries
file_replace("Install.pm", "qw(Install)", "qw(Install CopyIncludeFiles)")
run_command(
["perl", "-MInstall=CopyIncludeFiles", "-e"]
+ [f"chdir('../../..'); CopyIncludeFiles('{top}')"]
)
for lib in ('libpgport', 'libpgcommon', 'libpq'):
copy_file(pgbuild / f'Release/{lib}/{lib}.lib', top / 'lib')
# Prepare local include directory for building from
for dir in ('win32', 'win32_msvc'):
merge_dir(pgbuild / f"src/include/port/{dir}", pgbuild / "src/include")
# Build pg_config in place
os.chdir(pgbuild / 'src/bin/pg_config')
run_command(
['cl', 'pg_config.c', '/MT', '/nologo', fr'/I{pgbuild}\src\include']
+ ['/link', fr'/LIBPATH:{top}\lib']
+ ['libpgcommon.lib', 'libpgport.lib', 'advapi32.lib']
+ ['/NODEFAULTLIB:libcmt.lib']
+ [fr'/OUT:{top}\bin\pg_config.exe']
)
assert (top / 'lib' / 'libpq.lib').exists()
assert (top / 'bin' / 'pg_config.exe').exists()
os.chdir(opt.clone_dir)
shutil.rmtree(pgbuild)
def build_psycopg():
os.chdir(opt.package_dir)
patch_package_name()
add_pg_config_path()
run_python(
["setup.py", "build_ext", "--have-ssl"]
+ ["-l", "libpgcommon libpgport"]
+ ["-L", opt.ssl_build_dir / 'lib']
+ ['-I', opt.ssl_build_dir / 'include']
)
run_python(["setup.py", "build_py"])
def patch_package_name():
"""Change the psycopg2 package name in the setup.py if required."""
if opt.package_name == 'psycopg2':
return
logger.info("changing package name to %s", opt.package_name)
with (opt.package_dir / 'setup.py').open() as f:
data = f.read()
# Replace the name of the package with the desired one
rex = re.compile(r"""name=["']psycopg2["']""")
assert len(rex.findall(data)) == 1, rex.findall(data)
data = rex.sub(f'name="{opt.package_name}"', data)
with (opt.package_dir / 'setup.py').open('w') as f:
f.write(data)
def build_binary_packages():
"""Create wheel binary packages."""
os.chdir(opt.package_dir)
add_pg_config_path()
# Build .whl packages
run_python(['setup.py', 'bdist_wheel', "-d", opt.dist_dir])
def step_after_build():
if not opt.is_wheel:
install_built_package()
else:
install_binary_package()
def install_built_package():
"""Install the package just built by setup build."""
os.chdir(opt.package_dir)
# Install the psycopg just built
add_pg_config_path()
run_python(["setup.py", "install"])
shutil.rmtree("psycopg2.egg-info")
def install_binary_package():
"""Install the package from a packaged wheel."""
run_python(
['-m', 'pip', 'install', '--no-index', '-f', opt.dist_dir]
+ [opt.package_name]
)
def add_pg_config_path():
"""Allow finding in the path the pg_config just built."""
pg_path = str(opt.pg_build_dir / 'bin')
if pg_path not in os.environ['PATH'].split(os.pathsep):
setenv('PATH', os.pathsep.join([pg_path, os.environ['PATH']]))
def step_before_test():
print_psycopg2_version()
# Create and setup PostgreSQL database for the tests
run_command([opt.pg_bin_dir / 'createdb', os.environ['PSYCOPG2_TESTDB']])
run_command(
[opt.pg_bin_dir / 'psql', '-d', os.environ['PSYCOPG2_TESTDB']]
+ ['-c', "CREATE EXTENSION hstore"]
)
def print_psycopg2_version():
"""Print psycopg2 and libpq versions installed."""
for expr in (
'psycopg2.__version__',
'psycopg2.__libpq_version__',
'psycopg2.extensions.libpq_version()',
):
out = out_python(['-c', f"import psycopg2; print({expr})"])
logger.info("built %s: %s", expr, out.decode('ascii'))
def step_test_script():
check_libpq_version()
run_test_suite()
def check_libpq_version():
"""
Fail if the package installed is not using the expected libpq version.
"""
want_ver = tuple(map(int, os.environ['POSTGRES_VERSION'].split('_')))
want_ver = "%d%04d" % want_ver
got_ver = (
out_python(
['-c']
+ ["import psycopg2; print(psycopg2.extensions.libpq_version())"]
)
.decode('ascii')
.rstrip()
)
assert want_ver == got_ver, f"libpq version mismatch: {want_ver!r} != {got_ver!r}"
def run_test_suite():
# Remove this var, which would make a badly configured OpenSSL 1.1 work
os.environ.pop('OPENSSL_CONF', None)
# Run the unit test
args = [
'-c',
"import tests; tests.unittest.main(defaultTest='tests.test_suite')",
]
if opt.is_wheel:
os.environ['PSYCOPG2_TEST_FAST'] = '1'
else:
args.append('--verbose')
os.chdir(opt.package_dir)
run_python(args)
def step_on_success():
print_sha1_hashes()
if setup_ssh():
upload_packages()
def print_sha1_hashes():
"""
Print the packages sha1 so their integrity can be checked upon signing.
"""
logger.info("artifacts SHA1 hashes:")
os.chdir(opt.package_dir / 'dist')
run_command([which('sha1sum'), '-b', 'psycopg2-*/*'])
def setup_ssh():
"""
Configure ssh to upload built packages where they can be retrieved.
Return False if we can't configure and the upload should be skipped.
"""
# If we are not on the psycopg AppVeyor account, the environment variable
# REMOTE_KEY will not be decrypted. In that case skip uploading.
if os.environ['APPVEYOR_ACCOUNT_NAME'] != 'psycopg':
logger.warn("skipping artifact upload: you are not psycopg")
return False
pkey = os.environ.get('REMOTE_KEY', None)
if not pkey:
logger.warn("skipping artifact upload: no remote key")
return False
# Write SSH Private Key file from environment variable
pkey = pkey.replace(' ', '\n')
with (opt.clone_dir / 'data/id_rsa-psycopg-upload').open('w') as f:
f.write(
f"""\
-----BEGIN RSA PRIVATE KEY-----
{pkey}
-----END RSA PRIVATE KEY-----
"""
)
# Make a directory to please MinGW's version of ssh
ensure_dir(r"C:\MinGW\msys\1.0\home\appveyor\.ssh")
return True
def upload_packages():
# Upload built artifacts
logger.info("uploading artifacts")
os.chdir(opt.clone_dir)
run_command(
[r"C:\MinGW\msys\1.0\bin\rsync", "-avr"]
+ ["-e", r"C:\MinGW\msys\1.0\bin\ssh -F data/ssh_config"]
+ ["psycopg2/dist/", "upload:"]
)
def download(url, fn):
"""Download a file locally"""
logger.info("downloading %s", url)
with open(fn, 'wb') as fo, urlopen(url) as fi:
while 1:
data = fi.read(8192)
if not data:
break
fo.write(data)
logger.info("file downloaded: %s", fn)
def file_replace(fn, s1, s2):
"""
Replace all the occurrences of the string s1 into s2 in the file fn.
"""
assert os.path.exists(fn)
with open(fn, 'r+') as f:
data = f.read()
f.seek(0)
f.write(data.replace(s1, s2))
f.truncate()
def merge_dir(src, tgt):
"""
Merge the content of the directory src into the directory tgt
Reproduce the semantic of "XCOPY /Y /S src/* tgt"
"""
src = str(src)
for dp, _dns, fns in os.walk(src):
logger.debug("dirpath %s", dp)
if not fns:
continue
assert dp.startswith(src)
subdir = dp[len(src) :].lstrip(os.sep)
tgtdir = ensure_dir(os.path.join(tgt, subdir))
for fn in fns:
copy_file(os.path.join(dp, fn), tgtdir)
def bat_call(cmdline):
"""
Simulate 'CALL' from a batch file
Execute CALL *cmdline* and export the changed environment to the current
environment.
nana-nana-nana-nana...
"""
if not isinstance(cmdline, str):
cmdline = map(str, cmdline)
cmdline = ' '.join(c if ' ' not in c else '"%s"' % c for c in cmdline)
data = f"""\
CALL {cmdline}
{opt.py_exe} -c "import os, sys, json; \
json.dump(dict(os.environ), sys.stdout, indent=2)"
"""
logger.debug("preparing file to batcall:\n\n%s", data)
with NamedTemporaryFile(suffix='.bat') as tmp:
fn = tmp.name
with open(fn, "w") as f:
f.write(data)
try:
out = out_command(fn)
# be vewwy vewwy caweful to print the env var as it might contain
# secwet things like your pwecious pwivate key.
# logger.debug("output of command:\n\n%s", out.decode('utf8', 'replace'))
# The output has some useless crap on stdout, because sure, and json
# indented so the last { on column 1 is where we have to start parsing
m = list(re.finditer(b'^{', out, re.MULTILINE))[-1]
out = out[m.start() :]
env = json.loads(out)
for k, v in env.items():
if os.environ.get(k) != v:
setenv(k, v)
finally:
os.remove(fn)
def ensure_dir(dir):
if not isinstance(dir, Path):
dir = Path(dir)
if not dir.is_dir():
logger.info("creating directory %s", dir)
dir.mkdir(parents=True)
return dir
def run_command(cmdline, **kwargs):
"""Run a command, raise on error."""
if not isinstance(cmdline, str):
cmdline = list(map(str, cmdline))
logger.info("running command: %s", cmdline)
sp.check_call(cmdline, **kwargs)
def out_command(cmdline, **kwargs):
"""Run a command, return its output, raise on error."""
if not isinstance(cmdline, str):
cmdline = list(map(str, cmdline))
logger.info("running command: %s", cmdline)
data = sp.check_output(cmdline, **kwargs)
return data
def run_python(args, **kwargs):
"""
Run a script in the target Python.
"""
return run_command([opt.py_exe] + args, **kwargs)
def out_python(args, **kwargs):
"""
Return the output of a script run in the target Python.
"""
return out_command([opt.py_exe] + args, **kwargs)
def copy_file(src, dst):
logger.info("copying file %s -> %s", src, dst)
shutil.copy(src, dst)
def setenv(k, v):
logger.debug("setting %s=%s", k, v)
os.environ[k] = v
def which(name):
"""
Return the full path of a command found on the path
"""
base, ext = os.path.splitext(name)
if not ext:
exts = ('.com', '.exe', '.bat', '.cmd')
else:
exts = (ext,)
for dir in ['.'] + os.environ['PATH'].split(os.pathsep):
for ext in exts:
fn = os.path.join(dir, base + ext)
if os.path.isfile(fn):
return fn
raise Exception(f"couldn't find program on path: {name}")
class Options:
"""
An object exposing the script configuration from env vars and command line.
"""
@property
def py_ver(self):
"""The Python version to build as 2 digits string."""
rv = os.environ['PY_VER']
assert rv in ('36', '37', '38', '39', '310'), rv
return rv
@property
def py_arch(self):
"""The Python architecture to build, 32 or 64."""
rv = os.environ['PY_ARCH']
assert rv in ('32', '64'), rv
return int(rv)
@property
def arch_32(self):
"""True if the Python architecture to build is 32 bits."""
return self.py_arch == 32
@property
def arch_64(self):
"""True if the Python architecture to build is 64 bits."""
return self.py_arch == 64
@property
def package_name(self):
return os.environ.get('CONFIGURATION', 'psycopg2')
@property
def package_version(self):
"""The psycopg2 version number to build."""
with (self.package_dir / 'setup.py').open() as f:
data = f.read()
m = re.search(
r"""^PSYCOPG_VERSION\s*=\s*['"](.*)['"]""", data, re.MULTILINE
)
return m.group(1)
@property
def is_wheel(self):
"""Are we building the wheel packages or just the extension?"""
workflow = os.environ["WORKFLOW"]
return workflow == "packages"
@property
def py_dir(self):
"""
The path to the target python binary to execute.
"""
dirname = ''.join(
[r"C:\Python", self.py_ver, '-x64' if self.arch_64 else '']
)
return Path(dirname)
@property
def py_exe(self):
"""
The full path of the target python executable.
"""
return self.py_dir / 'python.exe'
@property
def vc_dir(self):
"""
The path of the Visual C compiler.
"""
if self.vs_ver == '16.0':
path = Path(
r"C:\Program Files (x86)\Microsoft Visual Studio\2019"
r"\Community\VC\Auxiliary\Build"
)
else:
path = Path(
r"C:\Program Files (x86)\Microsoft Visual Studio %s\VC"
% self.vs_ver
)
return path
@property
def vs_ver(self):
# https://wiki.python.org/moin/WindowsCompilers
# https://www.appveyor.com/docs/windows-images-software/#python
# Py 3.6--3.8 = VS Ver. 14.0 (VS 2015)
# Py 3.9 = VS Ver. 16.0 (VS 2019)
vsvers = {
'36': '14.0',
'37': '14.0',
'38': '14.0',
'39': '16.0',
'310': '16.0',
}
return vsvers[self.py_ver]
@property
def clone_dir(self):
"""The directory where the repository is cloned."""
return Path(r"C:\Project")
@property
def appveyor_pg_dir(self):
"""The directory of the postgres service made available by Appveyor."""
return Path(os.environ['POSTGRES_DIR'])
@property
def pg_data_dir(self):
"""The data dir of the appveyor postgres service."""
return self.appveyor_pg_dir / 'data'
@property
def pg_bin_dir(self):
"""The bin dir of the appveyor postgres service."""
return self.appveyor_pg_dir / 'bin'
@property
def pg_build_dir(self):
"""The directory where to build the postgres libraries for psycopg."""
return self.cache_arch_dir / 'postgresql'
@property
def ssl_build_dir(self):
"""The directory where to build the openssl libraries for psycopg."""
return self.cache_arch_dir / 'openssl'
@property
def cache_arch_dir(self):
rv = self.cache_dir / str(self.py_arch) / self.vs_ver
return ensure_dir(rv)
@property
def cache_dir(self):
return Path(r"C:\Others")
@property
def build_dir(self):
rv = self.cache_arch_dir / 'Builds'
return ensure_dir(rv)
@property
def package_dir(self):
return self.clone_dir
@property
def dist_dir(self):
"""The directory where to build packages to distribute."""
return (
self.package_dir / 'dist' / (f'psycopg2-{self.package_version}')
)
def parse_cmdline():
parser = ArgumentParser(description=__doc__)
g = parser.add_mutually_exclusive_group()
g.add_argument(
'-q',
'--quiet',
help="Talk less",
dest='loglevel',
action='store_const',
const=logging.WARN,
default=logging.INFO,
)
g.add_argument(
'-v',
'--verbose',
help="Talk more",
dest='loglevel',
action='store_const',
const=logging.DEBUG,
default=logging.INFO,
)
steps = [
n[len(STEP_PREFIX) :]
for n in globals()
if n.startswith(STEP_PREFIX) and callable(globals()[n])
]
parser.add_argument(
'step', choices=steps, help="the appveyor step to execute"
)
opt = parser.parse_args(namespace=Options())
return opt
if __name__ == '__main__':
sys.exit(main())

View File

@ -1,94 +1,216 @@
#!/bin/bash
# Build a modern version of libpq and depending libs from source on Centos 5
# Build a modern version of libpq and depending libs from source on Centos 5, Alpine or macOS
set -euo pipefail
set -x
openssl_version="1.1.1l"
ldap_version="2.4.59"
sasl_version="2.1.27"
postgres_version="14.1"
# Last release: https://www.postgresql.org/ftp/source/
# IMPORTANT! Change the cache key in packages.yml when upgrading libraries
postgres_version="${LIBPQ_VERSION}"
yum install -y zlib-devel krb5-devel pam-devel
# last release: https://www.openssl.org/source/
openssl_version="${OPENSSL_VERSION}"
# last release: https://kerberos.org/dist/
krb5_version="1.21.3"
# Build openssl if needed
openssl_tag="OpenSSL_${openssl_version//./_}"
openssl_dir="openssl-${openssl_tag}"
if [ ! -d "${openssl_dir}" ]; then curl -sL \
https://github.com/openssl/openssl/archive/${openssl_tag}.tar.gz \
| tar xzf -
# last release: https://openldap.org/software/download/
ldap_version="2.6.8"
cd "${openssl_dir}"
# last release: https://github.com/cyrusimap/cyrus-sasl/releases
sasl_version="2.1.28"
./config --prefix=/usr/local/ --openssldir=/usr/local/ \
zlib -fPIC shared
make depend
make
else
cd "${openssl_dir}"
export LIBPQ_BUILD_PREFIX=${LIBPQ_BUILD_PREFIX:-/tmp/libpq.build}
case "$(uname)" in
Darwin)
ID=macos
library_suffix=dylib
;;
Linux)
source /etc/os-release
library_suffix=so
;;
*)
echo "$0: unexpected Operating system: '$(uname)'" >&2
exit 1
;;
esac
if [[ -f "${LIBPQ_BUILD_PREFIX}/lib/libpq.${library_suffix}" ]]; then
echo "libpq already available: build skipped" >&2
exit 0
fi
# Install openssl
make install_sw
cd ..
case "$ID" in
centos)
yum update -y
yum install -y zlib-devel krb5-devel pam-devel
;;
alpine)
apk upgrade
apk add --no-cache zlib-dev krb5-dev linux-pam-dev openldap-dev openssl-dev
;;
macos)
brew install automake m4 libtool
# If available, libpq seemingly insists on linking against homebrew's
# openssl no matter what so remove it. Since homebrew's curl depends on
# it, force use of system curl.
brew uninstall --force --ignore-dependencies openssl gettext curl
if [ -z "${MACOSX_ARCHITECTURE:-}" ]; then
MACOSX_ARCHITECTURE="$(uname -m)"
fi
# Set the deployment target to be <= to that of the oldest supported Python version.
# e.g. https://www.python.org/downloads/release/python-380/
if [ "$MACOSX_ARCHITECTURE" == "x86_64" ]; then
export MACOSX_DEPLOYMENT_TARGET=10.9
else
export MACOSX_DEPLOYMENT_TARGET=11.0
fi
;;
*)
echo "$0: unexpected Linux distribution: '$ID'" >&2
exit 1
;;
esac
# Build libsasl2 if needed
# The system package (cyrus-sasl-devel) causes an amazing error on i686:
# "unsupported version 0 of Verneed record"
# https://github.com/pypa/manylinux/issues/376
sasl_tag="cyrus-sasl-${sasl_version}"
sasl_dir="cyrus-sasl-${sasl_tag}"
if [ ! -d "${sasl_dir}" ]; then
curl -sL \
https://github.com/cyrusimap/cyrus-sasl/archive/${sasl_tag}.tar.gz \
| tar xzf -
cd "${sasl_dir}"
autoreconf -i
./configure
make
if [ "$ID" == "macos" ]; then
make_configure_standard_flags=( \
--prefix=${LIBPQ_BUILD_PREFIX} \
"CPPFLAGS=-I${LIBPQ_BUILD_PREFIX}/include/ -arch $MACOSX_ARCHITECTURE" \
"LDFLAGS=-L${LIBPQ_BUILD_PREFIX}/lib -arch $MACOSX_ARCHITECTURE" \
)
else
cd "${sasl_dir}"
make_configure_standard_flags=( \
--prefix=${LIBPQ_BUILD_PREFIX} \
CPPFLAGS=-I${LIBPQ_BUILD_PREFIX}/include/ \
LDFLAGS=-L${LIBPQ_BUILD_PREFIX}/lib \
)
fi
# Install libsasl2
# requires missing nroff to build
touch saslauthd/saslauthd.8
make install
cd ..
if [ "$ID" == "centos" ] || [ "$ID" == "macos" ]; then
# Build openldap if needed
ldap_tag="${ldap_version}"
ldap_dir="openldap-${ldap_tag}"
if [ ! -d "${ldap_dir}" ]; then
curl -sL \
https://www.openldap.org/software/download/OpenLDAP/openldap-release/openldap-${ldap_tag}.tgz \
| tar xzf -
# Build openssl if needed
openssl_tag="OpenSSL_${openssl_version//./_}"
openssl_dir="openssl-${openssl_tag}"
if [ ! -d "${openssl_dir}" ]; then
curl -sL \
https://github.com/openssl/openssl/archive/${openssl_tag}.tar.gz \
| tar xzf -
cd "${ldap_dir}"
pushd "${openssl_dir}"
options=(--prefix=${LIBPQ_BUILD_PREFIX} --openssldir=${LIBPQ_BUILD_PREFIX} \
zlib -fPIC shared)
if [ -z "${MACOSX_ARCHITECTURE:-}" ]; then
./config $options
else
./configure "darwin64-$MACOSX_ARCHITECTURE-cc" $options
fi
make depend
make
else
pushd "${openssl_dir}"
fi
# Install openssl
make install_sw
popd
./configure --enable-backends=no --enable-null
make depend
make -C libraries/liblutil/
make -C libraries/liblber/
make -C libraries/libldap/
make -C libraries/libldap_r/
else
cd "${ldap_dir}"
fi
# Install openldap
make -C libraries/liblber/ install
make -C libraries/libldap/ install
make -C libraries/libldap_r/ install
make -C include/ install
chmod +x /usr/local/lib/{libldap,liblber}*.so*
cd ..
if [ "$ID" == "macos" ]; then
# Build kerberos if needed
krb5_dir="krb5-${krb5_version}/src"
if [ ! -d "${krb5_dir}" ]; then
curl -sL "https://kerberos.org/dist/krb5/${krb5_version%.*}/krb5-${krb5_version}.tar.gz" \
| tar xzf -
pushd "${krb5_dir}"
./configure "${make_configure_standard_flags[@]}"
make
else
pushd "${krb5_dir}"
fi
make install
popd
fi
if [ "$ID" == "centos" ] || [ "$ID" == "macos" ]; then
# Build libsasl2 if needed
# The system package (cyrus-sasl-devel) causes an amazing error on i686:
# "unsupported version 0 of Verneed record"
# https://github.com/pypa/manylinux/issues/376
sasl_tag="cyrus-sasl-${sasl_version}"
sasl_dir="cyrus-sasl-${sasl_tag}"
if [ ! -d "${sasl_dir}" ]; then
curl -sL \
https://github.com/cyrusimap/cyrus-sasl/archive/${sasl_tag}.tar.gz \
| tar xzf -
pushd "${sasl_dir}"
autoreconf -i
./configure "${make_configure_standard_flags[@]}" --disable-macos-framework
make
else
pushd "${sasl_dir}"
fi
# Install libsasl2
# requires missing nroff to build
touch saslauthd/saslauthd.8
make install
popd
fi
if [ "$ID" == "centos" ] || [ "$ID" == "macos" ]; then
# Build openldap if needed
ldap_tag="${ldap_version}"
ldap_dir="openldap-${ldap_tag}"
if [ ! -d "${ldap_dir}" ]; then
curl -sL \
https://www.openldap.org/software/download/OpenLDAP/openldap-release/openldap-${ldap_tag}.tgz \
| tar xzf -
pushd "${ldap_dir}"
./configure "${make_configure_standard_flags[@]}" --enable-backends=no --enable-null
make depend
make -C libraries/liblutil/
make -C libraries/liblber/
make -C libraries/libldap/
else
pushd "${ldap_dir}"
fi
# Install openldap
make -C libraries/liblber/ install
make -C libraries/libldap/ install
make -C include/ install
chmod +x ${LIBPQ_BUILD_PREFIX}/lib/{libldap,liblber}*.${library_suffix}*
popd
fi
# Build libpq if needed
@ -99,32 +221,33 @@ if [ ! -d "${postgres_dir}" ]; then
https://github.com/postgres/postgres/archive/${postgres_tag}.tar.gz \
| tar xzf -
cd "${postgres_dir}"
pushd "${postgres_dir}"
# Match the default unix socket dir default with what defined on Ubuntu and
# Red Hat, which seems the most common location
sed -i 's|#define DEFAULT_PGSOCKET_DIR .*'\
if [ "$ID" != "macos" ]; then
# Match the default unix socket dir with what is defined on Ubuntu and
# Red Hat, which seems the most common location
sed -i 's|#define DEFAULT_PGSOCKET_DIR .*'\
'|#define DEFAULT_PGSOCKET_DIR "/var/run/postgresql"|' \
src/include/pg_config_manual.h
# Without this, libpq ./configure fails on i686
if [[ "$(uname -m)" == "i686" ]]; then
export LD_LIBRARY_PATH=/usr/local/lib
src/include/pg_config_manual.h
fi
./configure --prefix=/usr/local --without-readline \
--with-gssapi --with-openssl --with-pam --with-ldap
# Often needed, but currently set by the workflow
# export LD_LIBRARY_PATH="${LIBPQ_BUILD_PREFIX}/lib"
./configure "${make_configure_standard_flags[@]}" --sysconfdir=/etc/postgresql-common \
--with-gssapi --with-openssl --with-pam --with-ldap \
--without-readline --without-icu
make -C src/interfaces/libpq
make -C src/bin/pg_config
make -C src/include
else
cd "${postgres_dir}"
pushd "${postgres_dir}"
fi
# Install libpq
make -C src/interfaces/libpq install
make -C src/bin/pg_config install
make -C src/include install
cd ..
popd
find /usr/local/ -name \*.so.\* -type f -exec strip --strip-unneeded {} \;
find ${LIBPQ_BUILD_PREFIX} -name \*.${library_suffix}.\* -type f -exec strip --strip-unneeded {} \;

View File

@ -1,82 +0,0 @@
#!/bin/bash
# Create macOS wheels for psycopg2
#
# Following instructions from https://github.com/MacPython/wiki/wiki/Spinning-wheels
# Cargoculting pieces of implementation from https://github.com/matthew-brett/multibuild
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
brew update
brew install gnu-sed postgresql@14
# Fetch 14.1 if 14.0 is still the default version
brew reinstall postgresql
# Start the database for testing
brew services start postgresql
for i in $(seq 10 -1 0); do
eval pg_isready && break
if [ $i == 0 ]; then
echo "PostgreSQL service not ready, giving up"
exit 1
fi
echo "PostgreSQL service not ready, waiting a bit, attempts left: $i"
sleep 5
done
# Find psycopg version
version=$(grep -e ^PSYCOPG_VERSION "${prjdir}/setup.py" | gsed "s/.*'\(.*\)'/\1/")
# A gratuitous comment to fix broken vim syntax file: '")
distdir="${prjdir}/dist/psycopg2-$version"
mkdir -p "$distdir"
# Install required python packages
pip install -U pip wheel delocate
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
gsed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi
# Build the wheels
wheeldir="${prjdir}/wheels"
pip wheel -w ${wheeldir} .
delocate-listdeps ${wheeldir}/*.whl
# Check where is the libpq. I'm gonna kill it for testing
if [[ -z "${LIBPQ:-}" ]]; then
export LIBPQ=$(delocate-listdeps ${wheeldir}/*.whl | grep libpq)
fi
delocate-wheel ${wheeldir}/*.whl
# https://github.com/MacPython/wiki/wiki/Spinning-wheels#question-will-pip-give-me-a-broken-wheel
delocate-addplat --rm-orig -x 10_9 -x 10_10 ${wheeldir}/*.whl
cp ${wheeldir}/*.whl ${distdir}
# kill the libpq to make sure tests don't depend on it
mv "$LIBPQ" "${LIBPQ}-bye"
# Install and test the built wheel
pip install ${PACKAGE_NAME:-psycopg2} --no-index -f "$distdir"
# Print psycopg and libpq versions
python -c "import psycopg2; print(psycopg2.__version__)"
python -c "import psycopg2; print(psycopg2.__libpq_version__)"
python -c "import psycopg2; print(psycopg2.extensions.libpq_version())"
# fail if we are not using the expected libpq library
# Disabled as we just use what's available on the system on macOS
# if [[ "${WANT_LIBPQ:-}" ]]; then
# python -c "import psycopg2, sys; sys.exit(${WANT_LIBPQ} != psycopg2.extensions.libpq_version())"
# fi
python -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
# just because I'm a boy scout
mv "${LIBPQ}-bye" "$LIBPQ"

View File

@ -0,0 +1,106 @@
#!/bin/bash
# Build psycopg2-binary wheel packages for Apple M1 (cpNNN-macosx_arm64)
#
# This script is designed to run on Scaleway Apple Silicon machines.
#
# The script cannot be run via sudo (installing brew fails), but it requires sudo,
# so in practice it can only be executed by a user with sudo rights.
set -euo pipefail
# set -x
python_versions="3.8.18 3.9.18 3.10.13 3.11.6 3.12.0"
pg_version=16
function log {
echo "$@" >&2
}
# Move to the root of the project
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
cd "${dir}/../../"
# Add /usr/local/bin to the path: it seems not to be there in non-interactive sessions
if ! (echo $PATH | grep -q '/usr/local/bin'); then
export PATH=/usr/local/bin:$PATH
fi
# Install brew, if necessary. Otherwise just make sure it's in the path
if [[ -x /opt/homebrew/bin/brew ]]; then
eval "$(/opt/homebrew/bin/brew shellenv)"
else
log "installing brew"
command -v brew > /dev/null || (
# Not necessary: already installed
# xcode-select --install
NONINTERACTIVE=1 /bin/bash -c "$(curl -fsSL \
https://raw.githubusercontent.com/Homebrew/install/master/install.sh)"
)
eval "$(/opt/homebrew/bin/brew shellenv)"
fi
export PGDATA=/opt/homebrew/var/postgresql@${pg_version}
# Install PostgreSQL, if necessary
command -v pg_config > /dev/null || (
log "installing postgres"
brew install postgresql@${pg_version}
)
# Starting from PostgreSQL 15, the bin path is not in the path.
export PATH="$(ls -d1 /opt/homebrew/Cellar/postgresql@${pg_version}/*/bin):$PATH"
# Make sure the server is running
# Currently not working
# brew services start postgresql@${pg_version}
if ! pg_ctl status; then
log "starting the server"
pg_ctl -l "/opt/homebrew/var/log/postgresql@${pg_version}.log" start
fi
# Install the Python versions we want to build
for ver3 in $python_versions; do
ver2=$(echo $ver3 | sed 's/\([^\.]*\)\(\.[^\.]*\)\(.*\)/\1\2/')
command -v python${ver2} > /dev/null || (
log "installing Python $ver3"
(cd /tmp &&
curl -fsSl -O \
https://www.python.org/ftp/python/${ver3}/python-${ver3}-macos11.pkg)
sudo installer -pkg /tmp/python-${ver3}-macos11.pkg -target /
)
done
# Create a virtualenv where to work
if [[ ! -x .venv/bin/python ]]; then
log "creating a virtualenv"
python3 -m venv .venv
fi
log "installing cibuildwheel"
source .venv/bin/activate
pip install cibuildwheel
log "building wheels"
# Build the binary packages
export CIBW_PLATFORM=macos
export CIBW_ARCHS=arm64
export CIBW_BUILD='cp{38,39,310,311,312}-*'
export CIBW_TEST_COMMAND='python -c "import tests; tests.unittest.main(defaultTest=\"tests.test_suite\")"'
export PSYCOPG2_TESTDB=postgres
export PYTHONPATH=$(pwd)
# For some reason, the cibuildwheel tests say that psycopg2 is already installed,
# refuse to install it, then promptly fail to import it. So, please, seriously,
# install this thing.
export PIP_FORCE_REINSTALL=1
# Replace the package name
sed -i .bak 's/^setup(name="psycopg2"/setup(name="psycopg2-binary"/' setup.py
cibuildwheel

View File

@ -1,76 +0,0 @@
#!/bin/bash
# Create manylinux2014 wheels for psycopg2
#
# manylinux2014 is built on CentOS 7, which packages an old version of the
# libssl, (1.0, which has concurrency problems with the Python libssl). So we
# need to build these libraries from source.
#
# Look at the .github/workflows/packages.yml file for hints about how to use it.
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
# Build all the available versions, or just the ones specified in PYVERS
if [ ! "${PYVERS:-}" ]; then
PYVERS="$(ls /opt/python/)"
fi
# Find psycopg version
version=$(grep -e ^PSYCOPG_VERSION "${prjdir}/setup.py" | sed "s/.*'\(.*\)'/\1/")
# A gratuitous comment to fix broken vim syntax file: '")
distdir="${prjdir}/dist/psycopg2-$version"
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
sed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi
# Build depending libraries
"${dir}/build_libpq.sh" > /dev/null
# Create the wheel packages
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/pip" wheel "${prjdir}" -w "${prjdir}/dist/"
done
# Bundle external shared libraries into the wheels
for whl in "${prjdir}"/dist/*.whl; do
"${dir}/strip_wheel.sh" "$whl"
auditwheel repair "$whl" -w "$distdir"
done
# Make sure the libpq is not in the system
for f in $(find /usr/local/lib -name libpq\*) ; do
mkdir -pv "/libpqbak/$(dirname $f)"
mv -v "$f" "/libpqbak/$(dirname $f)"
done
# Install packages and test
cd "${prjdir}"
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/pip" install ${PACKAGE_NAME:-psycopg2} --no-index -f "$distdir"
# Print psycopg and libpq versions
"${pybin}/python" -c "import psycopg2; print(psycopg2.__version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.__libpq_version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.extensions.libpq_version())"
# Fail if we are not using the expected libpq library
if [[ "${WANT_LIBPQ:-}" ]]; then
"${pybin}/python" -c "import psycopg2, sys; sys.exit(${WANT_LIBPQ} != psycopg2.extensions.libpq_version())"
fi
"${pybin}/python" -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
done
# Restore the libpq packages
for f in $(cd /libpqbak/ && find . -not -type d); do
mv -v "/libpqbak/$f" "/$f"
done

View File

@ -1,76 +0,0 @@
#!/bin/bash
# Create manylinux_2_24 wheels for psycopg2
#
# Look at the .github/workflows/packages.yml file for hints about how to use it.
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
# Build all the available versions, or just the ones specified in PYVERS
if [ ! "${PYVERS:-}" ]; then
PYVERS="$(ls /opt/python/)"
fi
# Find psycopg version
version=$(grep -e ^PSYCOPG_VERSION "${prjdir}/setup.py" | sed "s/.*'\(.*\)'/\1/")
# A gratuitous comment to fix broken vim syntax file: '")
distdir="${prjdir}/dist/psycopg2-$version"
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
sed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi
# Install prerequisite libraries
curl -k -s https://www.postgresql.org/media/keys/ACCC4CF8.asc | apt-key add -
echo "deb http://apt.postgresql.org/pub/repos/apt stretch-pgdg main" \
> /etc/apt/sources.list.d/pgdg.list
apt-get -y update
apt-get install -y libpq-dev
# Create the wheel packages
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/pip" wheel "${prjdir}" -w "${prjdir}/dist/"
done
# Bundle external shared libraries into the wheels
for whl in "${prjdir}"/dist/*.whl; do
"${dir}/strip_wheel.sh" "$whl"
auditwheel repair "$whl" -w "$distdir"
done
# Make sure the libpq is not in the system
for f in $(find /usr/lib /usr/lib64 -name libpq\*) ; do
mkdir -pv "/libpqbak/$(dirname $f)"
mv -v "$f" "/libpqbak/$(dirname $f)"
done
# Install packages and test
cd "${prjdir}"
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/pip" install ${PACKAGE_NAME:-psycopg2} --no-index -f "$distdir"
# Print psycopg and libpq versions
"${pybin}/python" -c "import psycopg2; print(psycopg2.__version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.__libpq_version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.extensions.libpq_version())"
# Fail if we are not using the expected libpq library
if [[ "${WANT_LIBPQ:-}" ]]; then
"${pybin}/python" -c "import psycopg2, sys; sys.exit(${WANT_LIBPQ} != psycopg2.extensions.libpq_version())"
fi
"${pybin}/python" -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
done
# Restore the libpq packages
for f in $(cd /libpqbak/ && find . -not -type d); do
mv -v "/libpqbak/$f" "/$f"
done

View File

@ -1,68 +0,0 @@
#!/bin/bash
# Create musllinux_1_1 wheels for psycopg2
#
# Look at the .github/workflows/packages.yml file for hints about how to use it.
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
# Build all the available versions, or just the ones specified in PYVERS
if [ ! "${PYVERS:-}" ]; then
PYVERS="$(ls /opt/python/)"
fi
# Find psycopg version
version=$(grep -e ^PSYCOPG_VERSION "${prjdir}/setup.py" | sed "s/.*'\(.*\)'/\1/")
# A gratuitous comment to fix broken vim syntax file: '")
distdir="${prjdir}/dist/psycopg2-$version"
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
sed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi
# Install prerequisite libraries
apk update
apk add postgresql-dev
# Add findutils because the Busybox version lacks the `-ls` flag, used by the
# `strip_wheel.sh` script.
apk add findutils
# Create the wheel packages
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/python" -m build -w -o "${prjdir}/dist/" "${prjdir}"
done
# Bundle external shared libraries into the wheels
for whl in "${prjdir}"/dist/*.whl; do
"${dir}/strip_wheel.sh" "$whl"
auditwheel repair "$whl" -w "$distdir"
done
# Make sure the postgresql-dev is not in the system
apk del postgresql-dev
# Install packages and test
cd "${prjdir}"
for pyver in $PYVERS; do
pybin="/opt/python/${pyver}/bin"
"${pybin}/pip" install ${PACKAGE_NAME:-psycopg2} --no-index -f "$distdir"
# Print psycopg and libpq versions
"${pybin}/python" -c "import psycopg2; print(psycopg2.__version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.__libpq_version__)"
"${pybin}/python" -c "import psycopg2; print(psycopg2.extensions.libpq_version())"
# Fail if we are not using the expected libpq library
if [[ "${WANT_LIBPQ:-}" ]]; then
"${pybin}/python" -c "import psycopg2, sys; sys.exit(${WANT_LIBPQ} != psycopg2.extensions.libpq_version())"
fi
"${pybin}/python" -c "import tests; tests.unittest.main(defaultTest='tests.test_suite')"
done

View File

@ -9,7 +9,7 @@ prjdir="$( cd "${dir}/../.." && pwd )"
# Find psycopg version
version=$(grep -e ^PSYCOPG_VERSION setup.py | sed "s/.*'\(.*\)'/\1/")
# A gratuitous comment to fix broken vim syntax file: '")
distdir="${prjdir}/dist/psycopg2-$version"
distdir="${prjdir}/dist"
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then

View File

@ -1,117 +0,0 @@
#!/usr/bin/env python
"""Download packages from appveyor artifacts
"""
import os
import re
import sys
import logging
import datetime as dt
from pathlib import Path
from argparse import ArgumentParser
import requests
logger = logging.getLogger()
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
API_URL = "https://ci.appveyor.com/api"
REPOS = "psycopg/psycopg2"
WORKFLOW_NAME = "Build packages"
class ScriptError(Exception):
"""Controlled exception raised by the script."""
def main():
opt = parse_cmdline()
try:
token = os.environ["APPVEYOR_TOKEN"]
except KeyError:
raise ScriptError("please set a APPVEYOR_TOKEN to download artifacts")
s = requests.Session()
s.headers["Content-Type"] = "application/json"
s.headers["Authorization"] = f"Bearer {token}"
if opt.build:
logger.info("fetching build %s", opt.build)
resp = s.get(f"{API_URL}/projects/{REPOS}/build/{opt.build}")
else:
logger.info("fetching last run")
resp = s.get(f"{API_URL}/projects/{REPOS}")
resp.raise_for_status()
data = resp.json()
updated_at = dt.datetime.fromisoformat(
re.sub(r"\.\d+", "", data["build"]["finished"])
)
now = dt.datetime.now(dt.timezone.utc)
age = now - updated_at
logger.info(
f"found build {data['build']['version']} updated {pretty_interval(age)} ago"
)
if age > dt.timedelta(hours=6):
logger.warning("maybe it's a bit old?")
jobs = data["build"]["jobs"]
for job in jobs:
if job["status"] != "success":
raise ScriptError(f"status for job {job['jobId']} is {job['status']}")
logger.info(f"fetching artifacts info for {job['name']}")
resp = s.get(f"{API_URL}/buildjobs/{job['jobId']}/artifacts/")
resp.raise_for_status()
afs = resp.json()
for af in afs:
fn = af["fileName"]
if fn.startswith("dist/"):
fn = fn.split("/", 1)[1]
dest = Path("packages") / fn
logger.info(f"downloading {dest}")
resp = s.get(
f"{API_URL}/buildjobs/{job['jobId']}/artifacts/{af['fileName']}"
)
resp.raise_for_status()
if not dest.parent.exists():
dest.parent.mkdir(parents=True)
with dest.open("wb") as f:
f.write(resp.content)
logger.info("now you can run: 'twine upload -s packages/*'")
def parse_cmdline():
parser = ArgumentParser(description=__doc__)
parser.add_argument("--build", help="build version to download [default: latest]")
opt = parser.parse_args()
return opt
def pretty_interval(td):
secs = td.total_seconds()
mins, secs = divmod(secs, 60)
hours, mins = divmod(mins, 60)
days, hours = divmod(hours, 24)
if days:
return f"{int(days)} days, {int(hours)} hours, {int(mins)} minutes"
elif hours:
return f"{int(hours)} hours, {int(mins)} minutes"
else:
return f"{int(mins)} minutes"
if __name__ == "__main__":
try:
sys.exit(main())
except ScriptError as e:
logger.error("%s", e)
sys.exit(1)
except KeyboardInterrupt:
logger.info("user interrupt")
sys.exit(1)

View File

@ -1,99 +0,0 @@
#!/usr/bin/env python
"""Download packages from github actions artifacts
"""
import io
import os
import sys
import logging
import datetime as dt
from pathlib import Path
from zipfile import ZipFile
import requests
logger = logging.getLogger()
logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
REPOS = "psycopg/psycopg2"
WORKFLOW_NAME = "Build packages"
class ScriptError(Exception):
"""Controlled exception raised by the script."""
def main():
try:
token = os.environ["GITHUB_TOKEN"]
except KeyError:
raise ScriptError("please set a GITHUB_TOKEN to download artifacts")
s = requests.Session()
s.headers["Accept"] = "application/vnd.github.v3+json"
s.headers["Authorization"] = f"token {token}"
logger.info("looking for recent runs")
resp = s.get(f"https://api.github.com/repos/{REPOS}/actions/runs?per_page=10")
resp.raise_for_status()
for run in resp.json()["workflow_runs"]:
if run["name"] == WORKFLOW_NAME:
break
else:
raise ScriptError(f"couldn't find {WORKFLOW_NAME!r} in recent runs")
if run["status"] != "completed":
raise ScriptError(f"run #{run['run_number']} is in status {run['status']}")
updated_at = dt.datetime.fromisoformat(run["updated_at"].replace("Z", "+00:00"))
now = dt.datetime.now(dt.timezone.utc)
age = now - updated_at
logger.info(f"found run #{run['run_number']} updated {pretty_interval(age)} ago")
if age > dt.timedelta(hours=6):
logger.warning("maybe it's a bit old?")
logger.info(f"looking for run #{run['run_number']} artifacts")
resp = s.get(f"{run['url']}/artifacts")
resp.raise_for_status()
artifacts = resp.json()["artifacts"]
dest = Path("packages")
if not dest.exists():
logger.info(f"creating dir {dest}")
dest.mkdir(parents=True)
for artifact in artifacts:
logger.info(f"downloading {artifact['name']} archive")
zip_url = artifact["archive_download_url"]
resp = s.get(zip_url)
with ZipFile(io.BytesIO(resp.content)) as zf:
logger.info("extracting archive content")
zf.extractall(dest)
logger.info(f"now you can run: 'twine upload -s {dest}/*'")
def pretty_interval(td):
secs = td.total_seconds()
mins, secs = divmod(secs, 60)
hours, mins = divmod(mins, 60)
days, hours = divmod(hours, 24)
if days:
return f"{int(days)} days, {int(hours)} hours, {int(mins)} minutes"
elif hours:
return f"{int(hours)} hours, {int(mins)} minutes"
else:
return f"{int(mins)} minutes"
if __name__ == "__main__":
try:
sys.exit(main())
except ScriptError as e:
logger.error("%s", e)
sys.exit(1)
except KeyboardInterrupt:
logger.info("user interrupt")
sys.exit(1)


@ -0,0 +1,101 @@
#!/usr/bin/env python
"""
We use vcpkg in github actions to build psycopg-binary.
This is a stub that works as `pg_config --libdir` or `pg_config --includedir` to
make the build work with vcpkg.
You will need to install `vcpkg`, set the `VCPKG_ROOT` env var, and run `vcpkg install
libpq:x64-windows-release` before using this script.
"""
import os
import sys
import platform
from pathlib import Path
from argparse import ArgumentParser, Namespace, RawDescriptionHelpFormatter
class ScriptError(Exception):
"""Controlled exception raised by the script."""
def _main() -> None:
# only x64-windows
if not (sys.platform == "win32" and platform.machine() == "AMD64"):
raise ScriptError("this script should only be used in x64-windows")
vcpkg_root = os.environ.get(
"VCPKG_ROOT", os.environ.get("VCPKG_INSTALLATION_ROOT", "")
)
if not vcpkg_root:
raise ScriptError("VCPKG_ROOT/VCPKG_INSTALLATION_ROOT env var not specified")
vcpkg_platform_root = (Path(vcpkg_root) / "installed/x64-windows-release").resolve()
args = parse_cmdline()
if args.libdir:
if not (f := vcpkg_platform_root / "lib/libpq.lib").exists():
raise ScriptError(f"libpq library not found: {f}")
print(vcpkg_platform_root.joinpath("lib"))
elif args.includedir or args.includedir_server:
# NOTE: on linux, the includedir-server dir contains pg_config.h
# which we need because it includes the PG_VERSION_NUM macro.
# In the vcpkg directory this file is in the includedir directory,
# therefore we return the same value.
if not (d := vcpkg_platform_root / "include/libpq").is_dir():
raise ScriptError(f"libpq include directory not found: {d}")
print(vcpkg_platform_root.joinpath("include"))
elif args.cppflags or args.ldflags:
print("")
else:
raise ScriptError("command not handled")
def parse_cmdline() -> Namespace:
parser = ArgumentParser(
description=__doc__, formatter_class=RawDescriptionHelpFormatter
)
g = parser.add_mutually_exclusive_group(required=True)
g.add_argument(
"--libdir",
action="store_true",
help="show location of object code libraries",
)
g.add_argument(
"--includedir",
action="store_true",
help="show location of C header files of the client interfaces",
)
g.add_argument(
"--includedir-server",
action="store_true",
help="show location of C header files for the server",
)
g.add_argument(
"--cppflags",
action="store_true",
help="(dummy) show CPPFLAGS value used when PostgreSQL was built",
)
g.add_argument(
"--ldflags",
action="store_true",
help="(dummy) show LDFLAGS value used when PostgreSQL was built",
)
opt = parser.parse_args()
return opt
def main() -> None:
try:
_main()
except ScriptError as e:
print(f"ERROR: {e}.", file=sys.stderr)
sys.exit(1)
if __name__ == "__main__":
main()
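A minimal sketch of exercising the stub once it is installed as a `pg_config` console script (e.g. via pipx, as in the Windows install step further down), on an x64 Windows runner with `VCPKG_ROOT` set and `libpq:x64-windows-release` installed; the printed paths are hypothetical:

import subprocess

# Query the stub the same way setup.py queries the real pg_config.
for flag in ("--libdir", "--includedir", "--ldflags"):
    out = subprocess.run(
        ["pg_config", flag], capture_output=True, text=True, check=True
    ).stdout.strip()
    print(flag, "->", out or "(empty)")
# --libdir     -> C:\vcpkg\installed\x64-windows-release\lib      (hypothetical)
# --includedir -> C:\vcpkg\installed\x64-windows-release\include  (hypothetical)
# --ldflags    -> (empty)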


@ -0,0 +1,11 @@
[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"
[project]
name = 'pg_config_vcpkg_stub'
version = "0"
description = "see docs string in pg_config_vcpkg_stub for more details"
[project.scripts]
pg_config = 'pg_config_vcpkg_stub:main'


@ -0,0 +1,37 @@
#!/bin/bash
# Take a .so file as input and print the Debian packages and versions of the
# libraries it links.
set -euo pipefail
# set -x
source /etc/os-release
sofile="$1"
case "$ID" in
alpine)
depfiles=$( (ldd "$sofile" 2>/dev/null || true) | grep '=>' | sed 's/.*=> \(.*\) (.*)/\1/')
(for depfile in $depfiles; do
echo "$(basename "$depfile") => $(apk info --who-owns "${depfile}" | awk '{print $(NF)}')"
done) | sort | uniq
;;
debian)
depfiles=$(ldd "$sofile" | grep '=>' | sed 's/.*=> \(.*\) (.*)/\1/')
(for depfile in $depfiles; do
pkgname=$(dpkg -S "${depfile}" | sed 's/\(.*\): .*/\1/')
dpkg -l "${pkgname}" | grep '^ii' | awk '{print $2 " => " $3}'
done) | sort | uniq
;;
centos)
echo "TODO!"
;;
*)
echo "$0: unexpected Linux distribution: '$ID'" >&2
exit 1
;;
esac


@ -0,0 +1,65 @@
#!/bin/bash
# Build psycopg2-binary wheel packages for Apple M1 (cpNNN-macosx_arm64)
#
# This script is designed to run on a local machine: it clones the repo on a
# remote server, runs the `build_macos_arm64.sh` script there, and downloads
# the built packages. A tag to build must be specified.
# The script requires a Scaleway secret key in the SCW_SECRET_KEY env var; it
# uses scaleway_m1.sh to provision an M1 server and connect to it.
set -euo pipefail
# set -x
function log {
echo "$@" >&2
}
function error {
# Print an error message and exit.
log "ERROR: $@"
exit 1
}
tag=${1:-}
if [[ ! "${tag}" ]]; then
error "Usage: $0 REF"
fi
dir="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
server=$("${dir}/scaleway_m1.sh" ensure)
status=$(echo "$server" | jq -r .status)
if [[ "$status" != "ready" ]]; then
error "server status is $status"
fi
# Get user, password, ip from vnc url
tmp=$(echo "$server" | jq -r .vnc_url) # vnc://m1:PASS@1.2.3.4:5900
tmp=${tmp/vnc:\/\//} # m1:PASS@1.2.3.4:5900
user=${tmp%%:*} # m1
tmp=${tmp#*:} # PASS@1.2.3.4:5900
password=${tmp%%@*} # PASS
tmp=${tmp#*@} # 1.2.3.4:5900
host=${tmp%%:*} # 1.2.3.4
ssh="ssh ${user}@${host} -o StrictHostKeyChecking=no"
# Allow the user to sudo without asking for password.
echo "$password" | \
$ssh sh -c "test -f /etc/sudoers.d/${user} \
|| sudo -S --prompt= sh -c \
'echo \"${user} ALL=(ALL) NOPASSWD:ALL\" > /etc/sudoers.d/${user}'"
# Clone the repos
rdir=psycobuild
$ssh rm -rf "${rdir}"
$ssh git clone https://github.com/psycopg/psycopg2.git --branch ${tag} "${rdir}"
# Build the wheel packages
$ssh "${rdir}/scripts/build/build_macos_arm64.sh"
# Transfer the packages locally
scp -r "${user}@${host}:${rdir}/wheelhouse" .

scripts/build/scaleway_m1.sh (new executable file, 119 lines)

@ -0,0 +1,119 @@
#!/bin/bash
# Implement the following commands:
#
# ensure:
#
# Get data about the currently provisioned M1 server on Scaleway; provision
# one if needed.
#
# The script requires the SCW_SECRET_KEY env var set to a valid secret.
#
# If successful, return the response data on stdout. It may look like:
#
# {
# "id": "8b196119-3cea-4a9d-b916-265037a85e60",
# "type": "M1-M",
# "name": "mac-m1-psycopg",
# "project_id": "4cf7a85e-f21e-40d4-b758-21d1f4ad3dfb",
# "organization_id": "4cf7a85e-f21e-40d4-b758-21d1f4ad3dfb",
# "ip": "1.2.3.4",
# "vnc_url": "vnc://m1:PASSWORD@1.2.3.4:5900",
# "status": "starting",
# "created_at": "2023-09-22T18:00:18.754646Z",
# "updated_at": "2023-09-22T18:00:18.754646Z",
# "deletable_at": "2023-09-23T18:00:18.754646Z",
# "zone": "fr-par-3"
# }
#
# delete:
#
# Delete one provisioned server, if available.
#
# See https://www.scaleway.com/en/developers/api/apple-silicon/ for api docs.
set -euo pipefail
# set -x
project_id="4cf7a85e-f21e-40d4-b758-21d1f4ad3dfb"
zone=fr-par-3
servers_url="https://api.scaleway.com/apple-silicon/v1alpha1/zones/${zone}/servers"
function log {
echo "$@" >&2
}
function error {
log "ERROR: $@"
exit 1
}
function req {
method=$1
shift
curl -sSL --fail-with-body -X $method \
-H "Content-Type: application/json" \
-H "X-Auth-Token: ${SCW_SECRET_KEY}" \
"$@"
}
function get {
req GET "$@"
}
function post {
req POST "$@"
}
function delete {
req DELETE "$@"
}
function server_id {
# Return the id of the first server available, else the empty string
servers=$(get $servers_url || error "failed to request servers list")
server_ids=$(echo "$servers" | jq -r ".servers[].id")
for id in $server_ids; do
echo $id
break
done
}
function maybe_jq {
# Process the output via jq if displaying on console, otherwise leave
# it unprocessed.
if [ -t 1 ]; then
jq .
else
cat
fi
}
cmd=${1:-list}
case $cmd in
ensure)
id=$(server_id)
if [[ "$id" ]]; then
log "You have servers."
get "$servers_url/$id" | maybe_jq
else
log "Creating new server."
post $servers_url -d "
{
\"name\": \"mac-m1-psycopg\",
\"project_id\": \"$project_id\",
\"type\": \"M1-M\"
}" | maybe_jq
fi
;;
delete)
id=$(server_id)
if [[ "$id" ]]; then
log "Deleting server $id."
delete "$servers_url/$id" | maybe_jq
else
log "No server found."
fi
;;
list)
get $servers_url | maybe_jq
;;
*)
error "Usage: $(basename $0) [list|ensure|delete]"
esac


@ -14,28 +14,36 @@
# This script is designed to run on a wheel archive before auditwheel.
set -euo pipefail
set -x
# set -x
source /etc/os-release
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
wheel=$(realpath "$1")
shift
# python or python3?
if which python > /dev/null; then
py=python
else
py=python3
fi
tmpdir=$(mktemp -d)
trap "rm -r ${tmpdir}" EXIT
cd "${tmpdir}"
$py -m zipfile -e "${wheel}" .
python -m zipfile -e "${wheel}" .
find . -name *.so -ls -exec strip "$@" {} \;
# Display the size after strip
find . -name *.so -ls
echo "
Libs before:"
# Busybox doesn't have "find -ls"
find . -name \*.so | xargs ls -l
$py -m zipfile -c "${wheel}" *
# On Debian, print the package versions libraries come from
echo "
Dependencies versions of '_psycopg.so' library:"
"${dir}/print_so_versions.sh" "$(find . -name \*_psycopg\*.so)"
find . -name \*.so -exec strip "$@" {} \;
echo "
Libs after:"
find . -name \*.so | xargs ls -l
python -m zipfile -c ${wheel} *
cd -
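The script unpacks and repacks the wheel with the standard library's zipfile command-line interface; a minimal standalone sketch of that round trip (the wheel name is hypothetical):

import os
import subprocess
import sys

wheel = os.path.abspath("psycopg2_binary-2.9.10-cp313-cp313-manylinux_x86_64.whl")  # hypothetical name
# Extract the wheel into ./unpacked (the script strips the .so files at this
# point), then repack it, mirroring the `python -m zipfile -e` / `-c` calls above.
subprocess.run([sys.executable, "-m", "zipfile", "-e", wheel, "unpacked/"], check=True)
subprocess.run(
    [sys.executable, "-m", "zipfile", "-c", wheel, *os.listdir("unpacked")],
    cwd="unpacked",
    check=True,
)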


@ -0,0 +1,53 @@
#!/bin/bash
# Configure the libraries needed to build wheel packages on linux.
# This script is designed to be used by cibuildwheel as CIBW_BEFORE_ALL_LINUX
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
source /etc/os-release
# Install PostgreSQL development files.
case "$ID" in
alpine)
"${dir}/build_libpq.sh" > /dev/null
;;
debian)
# Note that the pgdg doesn't have an aarch64 repository so wheels are
# built with the libpq packaged with Debian 9, which is 9.6.
if [ "$AUDITWHEEL_ARCH" != 'aarch64' ]; then
echo "deb http://apt.postgresql.org/pub/repos/apt $VERSION_CODENAME-pgdg main" \
> /etc/apt/sources.list.d/pgdg.list
# TODO: On 2021-11-09 curl fails on 'ppc64le' with:
# curl: (60) SSL certificate problem: certificate has expired
# Test again later if -k can be removed.
curl -skf https://www.postgresql.org/media/keys/ACCC4CF8.asc \
> /etc/apt/trusted.gpg.d/postgresql.asc
fi
apt-get update
apt-get -y upgrade
apt-get -y install libpq-dev
;;
centos)
"${dir}/build_libpq.sh" > /dev/null
;;
*)
echo "$0: unexpected Linux distribution: '$ID'" >&2
exit 1
;;
esac
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
sed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi


@ -0,0 +1,41 @@
#!/bin/bash
# Configure the environment needed to build wheel packages on Mac OS.
# This script is designed to be used by cibuildwheel as CIBW_BEFORE_ALL_MACOS
#
# The PG_VERSION env var must be set to a Postgres major version (e.g. 16).
set -euo pipefail
set -x
dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
prjdir="$( cd "${dir}/../.." && pwd )"
# Build dependency libraries
"${prjdir}/scripts/build/build_libpq.sh"
# Show dependency tree
otool -L /tmp/libpq.build/lib/*.dylib
brew install gnu-sed postgresql@${PG_VERSION}
brew link --overwrite postgresql@${PG_VERSION}
# Start the database for testing
brew services start postgresql@${PG_VERSION}
# Wait for postgres to come up
for i in $(seq 10 -1 0); do
eval pg_isready && break
if [ $i == 0 ]; then
echo "PostgreSQL service not ready, giving up"
exit 1
fi
echo "PostgreSQL service not ready, waiting a bit, attempts left: $i"
sleep 5
done
# Replace the package name
if [[ "${PACKAGE_NAME:-}" ]]; then
gsed -i "s/^setup(name=\"psycopg2\"/setup(name=\"${PACKAGE_NAME}\"/" \
"${prjdir}/setup.py"
fi


@ -0,0 +1,7 @@
@echo on
pip install delvewheel wheel
vcpkg install libpq:x64-windows-release
pipx install .\scripts\build\pg_config_vcpkg_stub\


@ -19,6 +19,7 @@ The script can be run at a new PostgreSQL release to refresh the module.
import re
import sys
import time
from urllib.request import urlopen
from collections import defaultdict
@ -32,8 +33,7 @@ def main():
file_start = read_base_file(filename)
# If you add a version to the list fix the docs (in errorcodes.rst)
classes, errors = fetch_errors(
['9.1', '9.2', '9.3', '9.4', '9.5', '9.6', '10', '11', '12', '13', '14'])
classes, errors = fetch_errors("11 12 13 14 15 16 17".split())
disambiguate(errors)
@ -90,8 +90,8 @@ def parse_errors_txt(url):
errors_txt_url = \
"http://git.postgresql.org/gitweb/?p=postgresql.git;a=blob_plain;" \
"f=src/backend/utils/errcodes.txt;hb=%s"
"https://raw.githubusercontent.com/postgres/postgres/refs/heads/%s" \
"/src/backend/utils/errcodes.txt"
def fetch_errors(versions):
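For reference, a minimal sketch of fetching one errcodes.txt line through the new raw.githubusercontent.com URL template; the branch name and the parsed line are illustrative, not the script's exact logic:

from urllib.request import urlopen

errors_txt_url = (
    "https://raw.githubusercontent.com/postgres/postgres/refs/heads/%s"
    "/src/backend/utils/errcodes.txt"
)

# The stable branch for PostgreSQL 17 is named REL_17_STABLE.
with urlopen(errors_txt_url % "REL_17_STABLE") as f:
    for line in f.read().decode("ascii").splitlines():
        if line.startswith("23505"):
            print(line.split())
            # ['23505', 'E', 'ERRCODE_UNIQUE_VIOLATION', 'unique_violation']
            break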


@ -29,8 +29,7 @@ def main():
os.path.dirname(__file__), "../psycopg/sqlstate_errors.h")
# If you add a version to the list fix the docs (in errors.rst)
classes, errors = fetch_errors(
['9.1', '9.2', '9.3', '9.4', '9.5', '9.6', '10', '11', '12', '13', '14'])
classes, errors = fetch_errors("11 12 13 14 15 16 17".split())
f = open(filename, "w")
print("/*\n * Autogenerated by 'scripts/make_errors.py'.\n */\n", file=f)
@ -74,8 +73,8 @@ def parse_errors_txt(url):
errors_txt_url = \
"http://git.postgresql.org/gitweb/?p=postgresql.git;a=blob_plain;" \
"f=src/backend/utils/errcodes.txt;hb=%s"
"https://raw.githubusercontent.com/postgres/postgres/refs/heads/%s" \
"/src/backend/utils/errcodes.txt"
def fetch_errors(versions):


@ -19,4 +19,4 @@ static_libpq=0
libraries=
[metadata]
license_file = LICENSE
license_files = LICENSE


@ -29,22 +29,18 @@ for coroutine libraries.
import os
import sys
import re
import subprocess
from setuptools import setup, Extension
from distutils.command.build_ext import build_ext
from distutils.ccompiler import get_default_compiler
from distutils.errors import CompileError
try:
import configparser
except ImportError:
import ConfigParser as configparser
import configparser
# Take a look at https://www.python.org/dev/peps/pep-0440/
# for a consistent versioning pattern.
PSYCOPG_VERSION = '2.9.3'
PSYCOPG_VERSION = '2.9.10'
# note: if you are changing the list of supported Python version please fix
@ -55,11 +51,12 @@ Intended Audience :: Developers
License :: OSI Approved :: GNU Library or Lesser General Public License (LGPL)
Programming Language :: Python
Programming Language :: Python :: 3
Programming Language :: Python :: 3.6
Programming Language :: Python :: 3.7
Programming Language :: Python :: 3.8
Programming Language :: Python :: 3.9
Programming Language :: Python :: 3.10
Programming Language :: Python :: 3.11
Programming Language :: Python :: 3.12
Programming Language :: Python :: 3.13
Programming Language :: Python :: 3 :: Only
Programming Language :: Python :: Implementation :: CPython
Programming Language :: C
@ -104,24 +101,23 @@ For further information please check the 'doc/src/install.rst' file (also at
""")
sys.exit(1)
def query(self, attr_name):
def query(self, attr_name, *, empty_ok=False):
"""Spawn the pg_config executable, querying for the given config
name, and return the printed value, sanitized. """
try:
pg_config_process = subprocess.Popen(
pg_config_process = subprocess.run(
[self.pg_config_exe, "--" + attr_name],
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE)
except OSError:
raise Warning(
f"Unable to find 'pg_config' file in '{self.pg_config_exe}'")
pg_config_process.stdin.close()
result = pg_config_process.stdout.readline().strip()
if not result:
raise Warning(pg_config_process.stderr.readline())
if not isinstance(result, str):
result = result.decode('ascii')
if pg_config_process.returncode:
err = pg_config_process.stderr.decode(errors='backslashreplace')
raise Warning(f"pg_config --{attr_name} failed: {err}")
result = pg_config_process.stdout.decode().strip()
if not result and not empty_ok:
raise Warning(f"pg_config --{attr_name} is empty")
return result
def find_on_path(self, exename, path_directories=None):
@ -161,10 +157,7 @@ For further information please check the 'doc/src/install.rst' file (also at
return None
def _get_pg_config_from_registry(self):
try:
import winreg
except ImportError:
import _winreg as winreg
import winreg
reg = winreg.ConnectRegistry(None, winreg.HKEY_LOCAL_MACHINE)
try:
@ -377,43 +370,19 @@ For further information please check the 'doc/src/install.rst' file (also at
self.include_dirs.append(pg_config_helper.query("includedir"))
self.include_dirs.append(pg_config_helper.query("includedir-server"))
# add includedirs from cppflags, libdirs from ldflags
for token in pg_config_helper.query("ldflags").split():
# if present, add includedirs from cppflags, libdirs from ldflags
tokens = pg_config_helper.query("ldflags", empty_ok=True).split()
for token in tokens:
if token.startswith("-L"):
self.library_dirs.append(token[2:])
for token in pg_config_helper.query("cppflags").split():
tokens = pg_config_helper.query("cppflags", empty_ok=True).split()
for token in tokens:
if token.startswith("-I"):
self.include_dirs.append(token[2:])
pgversion = pg_config_helper.query("version").split()[1]
verre = re.compile(
r"(\d+)(?:\.(\d+))?(?:(?:\.(\d+))|(devel|(?:alpha|beta|rc)\d+))?")
m = verre.match(pgversion)
if m:
pgmajor, pgminor, pgpatch = m.group(1, 2, 3)
# Postgres >= 10 doesn't have pgminor anymore.
pgmajor = int(pgmajor)
if pgmajor >= 10:
pgminor, pgpatch = None, pgminor
if pgminor is None or not pgminor.isdigit():
pgminor = 0
if pgpatch is None or not pgpatch.isdigit():
pgpatch = 0
pgminor = int(pgminor)
pgpatch = int(pgpatch)
else:
sys.stderr.write(
f"Error: could not determine PostgreSQL version from "
f"'{pgversion}'")
sys.exit(1)
define_macros.append(("PG_VERSION_NUM", "%d%02d%02d" %
(pgmajor, pgminor, pgpatch)))
# enable lo64 if libpq >= 9.3 and Python 64 bits
if (pgmajor, pgminor) >= (9, 3) and is_py_64():
# enable lo64 if Python 64 bits
if is_py_64():
define_macros.append(("HAVE_LO64", "1"))
# Inject the flag in the version string already packed up
@ -555,7 +524,7 @@ setup(name="psycopg2",
url="https://psycopg.org/",
license="LGPL with exceptions",
platforms=["any"],
python_requires='>=3.6',
python_requires='>=3.8',
description=readme.split("\n")[0],
long_description="\n".join(readme.split("\n")[2:]).lstrip(),
classifiers=[x for x in classifiers.split("\n") if x],
@ -566,6 +535,7 @@ setup(name="psycopg2",
ext_modules=ext,
project_urls={
'Homepage': 'https://psycopg.org/',
'Changes': 'https://www.psycopg.org/docs/news.html',
'Documentation': 'https://www.psycopg.org/docs/',
'Code': 'https://github.com/psycopg/psycopg2',
'Issue Tracker': 'https://github.com/psycopg/psycopg2/issues',


@ -390,6 +390,7 @@ class AsyncTests(ConnectingTestCase):
# fetching from the correct cursor works
self.assertEquals(cur1.fetchone()[0], 1)
@skip_if_crdb("batch statements", version="< 22.1")
def test_error(self):
cur = self.conn.cursor()
cur.execute("insert into table1 values (%s)", (1, ))
@ -402,9 +403,8 @@ class AsyncTests(ConnectingTestCase):
# this should fail as well (Postgres behaviour)
self.assertRaises(psycopg2.IntegrityError, self.wait, cur)
# but this should work
if crdb_version(self.sync_conn) is None:
cur.execute("insert into table1 values (%s)", (2, ))
self.wait(cur)
cur.execute("insert into table1 values (%s)", (2, ))
self.wait(cur)
# and the cursor should be usable afterwards
cur.execute("insert into table1 values (%s)", (3, ))
self.wait(cur)
@ -504,7 +504,7 @@ class AsyncTests(ConnectingTestCase):
raise Exception("Unexpected result from poll: %r", state)
polls += 1
self.assert_(polls >= 8, polls)
self.assert_(polls >= 5, polls)
def test_poll_noop(self):
self.conn.poll()


@ -1920,7 +1920,12 @@ class TestConnectionInfo(ConnectingTestCase):
self.conn.info.ssl_attribute('wat')
@skip_before_libpq(9, 5)
@skip_after_libpq(16)
def test_ssl_attribute(self):
# Skip this test even if libpq built == 15, runtime == 16 (see #1619)
if ext.libpq_version() >= 160000:
return self.skipTest("libpq runtime version == %s" % ext.libpq_version())
attribs = self.conn.info.ssl_attribute_names
self.assert_(attribs)
if self.conn.info.ssl_in_use:
@ -1928,11 +1933,16 @@ class TestConnectionInfo(ConnectingTestCase):
self.assertIsInstance(self.conn.info.ssl_attribute(attrib), str)
else:
for attrib in attribs:
# Behaviour changed in PostgreSQL 15
if attrib == "library":
continue
self.assertIsNone(self.conn.info.ssl_attribute(attrib))
self.assertIsNone(self.conn.info.ssl_attribute('wat'))
for attrib in attribs:
if attrib == "library":
continue
self.assertIsNone(self.bconn.info.ssl_attribute(attrib))


@ -379,6 +379,12 @@ class CursorTests(ConnectingTestCase):
@skip_before_postgres(8, 2)
def test_rowcount_on_executemany_returning(self):
cur = self.conn.cursor()
try:
cur.execute("drop table execmany")
self.conn.commit()
except psycopg2.DatabaseError:
self.conn.rollback()
cur.execute("create table execmany(id serial primary key, data int)")
cur.executemany(
"insert into execmany (data) values (%s)",
@ -412,7 +418,7 @@ class CursorTests(ConnectingTestCase):
self.assert_(curs.pgresult_ptr is None)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
class NamedCursorTests(ConnectingTestCase):
def test_invalid_name(self):
curs = self.conn.cursor()
@ -436,6 +442,7 @@ class NamedCursorTests(ConnectingTestCase):
curs.execute("insert into withhold values (%s)", (i,))
curs.close()
@skip_if_crdb("cursor with hold")
def test_withhold(self):
self.assertRaises(psycopg2.ProgrammingError, self.conn.cursor,
withhold=True)
@ -460,6 +467,7 @@ class NamedCursorTests(ConnectingTestCase):
curs.execute("drop table withhold")
self.conn.commit()
@skip_if_crdb("cursor with hold")
def test_withhold_no_begin(self):
self._create_withhold_table()
curs = self.conn.cursor("w", withhold=True)
@ -484,6 +492,7 @@ class NamedCursorTests(ConnectingTestCase):
self.assertEqual(self.conn.info.transaction_status,
psycopg2.extensions.TRANSACTION_STATUS_IDLE)
@skip_if_crdb("cursor with hold")
def test_withhold_autocommit(self):
self._create_withhold_table()
self.conn.commit()
@ -506,6 +515,7 @@ class NamedCursorTests(ConnectingTestCase):
self.assertEqual(self.conn.info.transaction_status,
psycopg2.extensions.TRANSACTION_STATUS_IDLE)
@skip_if_crdb("scroll cursor")
def test_scrollable(self):
self.assertRaises(psycopg2.ProgrammingError, self.conn.cursor,
scrollable=True)
@ -679,6 +689,7 @@ class NamedCursorTests(ConnectingTestCase):
self.assertRaises((IndexError, psycopg2.ProgrammingError),
cur.scroll, 1)
@skip_if_crdb("scroll cursor")
@skip_before_postgres(8, 0)
def test_scroll_named(self):
cur = self.conn.cursor('tmp', scrollable=True)


@ -64,7 +64,7 @@ class _DictCursorBase(ConnectingTestCase):
class ExtrasDictCursorTests(_DictCursorBase):
"""Test if DictCursor extension class works."""
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def testDictConnCursorArgs(self):
self.conn.close()
self.conn = self.connect(connection_factory=psycopg2.extras.DictConnection)
@ -132,19 +132,19 @@ class ExtrasDictCursorTests(_DictCursorBase):
return row
self._testWithNamedCursor(getter)
@skip_if_crdb("named cursor")
@skip_if_crdb("greedy cursor")
@skip_before_postgres(8, 2)
def testDictCursorWithNamedCursorNotGreedy(self):
curs = self.conn.cursor('tmp', cursor_factory=psycopg2.extras.DictCursor)
self._testNamedCursorNotGreedy(curs)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
@skip_before_postgres(8, 0)
def testDictCursorWithNamedCursorIterRowNumber(self):
curs = self.conn.cursor('tmp', cursor_factory=psycopg2.extras.DictCursor)
self._testIterRowNumber(curs)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def _testWithNamedCursor(self, getter):
curs = self.conn.cursor('aname', cursor_factory=psycopg2.extras.DictCursor)
curs.execute("SELECT * FROM ExtrasDictCursorTests")
@ -285,19 +285,19 @@ class ExtrasDictCursorRealTests(_DictCursorBase):
return row
self._testWithNamedCursorReal(getter)
@skip_if_crdb("named cursor")
@skip_if_crdb("greedy cursor")
@skip_before_postgres(8, 2)
def testDictCursorRealWithNamedCursorNotGreedy(self):
curs = self.conn.cursor('tmp', cursor_factory=psycopg2.extras.RealDictCursor)
self._testNamedCursorNotGreedy(curs)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
@skip_before_postgres(8, 0)
def testDictCursorRealWithNamedCursorIterRowNumber(self):
curs = self.conn.cursor('tmp', cursor_factory=psycopg2.extras.RealDictCursor)
self._testIterRowNumber(curs)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def _testWithNamedCursorReal(self, getter):
curs = self.conn.cursor('aname',
cursor_factory=psycopg2.extras.RealDictCursor)
@ -376,7 +376,7 @@ class NamedTupleCursorTest(ConnectingTestCase):
curs.execute("INSERT INTO nttest VALUES (3, 'baz')")
self.conn.commit()
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_cursor_args(self):
cur = self.conn.cursor('foo', cursor_factory=psycopg2.extras.DictCursor)
self.assertEqual(cur.name, 'foo')
@ -533,7 +533,7 @@ class NamedTupleCursorTest(ConnectingTestCase):
finally:
NamedTupleCursor._make_nt = f_orig
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
@skip_before_postgres(8, 0)
def test_named(self):
curs = self.conn.cursor('tmp')
@ -544,28 +544,28 @@ class NamedTupleCursorTest(ConnectingTestCase):
recs.extend(curs.fetchall())
self.assertEqual(list(range(10)), [t.i for t in recs])
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_named_fetchone(self):
curs = self.conn.cursor('tmp')
curs.execute("""select 42 as i""")
t = curs.fetchone()
self.assertEqual(t.i, 42)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_named_fetchmany(self):
curs = self.conn.cursor('tmp')
curs.execute("""select 42 as i""")
recs = curs.fetchmany(10)
self.assertEqual(recs[0].i, 42)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_named_fetchall(self):
curs = self.conn.cursor('tmp')
curs.execute("""select 42 as i""")
recs = curs.fetchall()
self.assertEqual(recs[0].i, 42)
@skip_if_crdb("named cursor")
@skip_if_crdb("greedy cursor")
@skip_before_postgres(8, 2)
def test_not_greedy(self):
curs = self.conn.cursor('tmp')
@ -580,7 +580,7 @@ class NamedTupleCursorTest(ConnectingTestCase):
self.assert_(recs[1].ts - recs[0].ts < timedelta(seconds=0.005))
self.assert_(recs[2].ts - recs[1].ts > timedelta(seconds=0.0099))
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
@skip_before_postgres(8, 0)
def test_named_rownumber(self):
curs = self.conn.cursor('tmp')


@ -152,7 +152,7 @@ class GreenTestCase(ConnectingTestCase):
""")
polls = stub.polls.count(POLL_READ)
self.assert_(polls > 8, polls)
self.assert_(polls > 6, polls)
class CallbackErrorTestCase(ConnectingTestCase):
@ -219,7 +219,7 @@ class CallbackErrorTestCase(ConnectingTestCase):
self.fail("you should have had a success or an error by now")
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_errors_named_cursor(self):
for i in range(100):
self.to_error = None


@ -18,6 +18,7 @@
from . import testutils
import unittest
import sys
import psycopg2
import psycopg2.extras
@ -68,7 +69,12 @@ class NetworkingTestCase(testutils.ConnectingTestCase):
self.assertEquals(cur.fetchone()[0], '127.0.0.1/24')
cur.execute("select %s", [ip.ip_interface('::ffff:102:300/128')])
self.assertEquals(cur.fetchone()[0], '::ffff:102:300/128')
# The textual representation of addresses has changed in Python 3.13
if sys.version_info >= (3, 13):
self.assertEquals(cur.fetchone()[0], '::ffff:1.2.3.0/128')
else:
self.assertEquals(cur.fetchone()[0], '::ffff:102:300/128')
@testutils.skip_if_crdb("cidr")
def test_cidr_cast(self):
@ -109,7 +115,12 @@ class NetworkingTestCase(testutils.ConnectingTestCase):
self.assertEquals(cur.fetchone()[0], '127.0.0.0/24')
cur.execute("select %s", [ip.ip_network('::ffff:102:300/128')])
self.assertEquals(cur.fetchone()[0], '::ffff:102:300/128')
# The textual representation of addresses has changed in Python 3.13
if sys.version_info >= (3, 13):
self.assertEquals(cur.fetchone()[0], '::ffff:1.2.3.0/128')
else:
self.assertEquals(cur.fetchone()[0], '::ffff:102:300/128')
def test_suite():
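A standalone sketch of the Python-side change the comments above refer to: from Python 3.13 the IPv4-mapped part of an IPv6 address is rendered in dotted-quad form.

import sys
import ipaddress as ip

addr = ip.ip_interface("::ffff:102:300/128")
# Python <= 3.12 prints '::ffff:102:300/128';
# Python >= 3.13 prints '::ffff:1.2.3.0/128'.
print(sys.version_info[:2], str(addr))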


@ -31,7 +31,7 @@ from subprocess import Popen
from weakref import ref
import unittest
from .testutils import (skip_before_postgres,
from .testutils import (skip_before_postgres, skip_if_windows,
ConnectingTestCase, skip_copy_if_green, skip_if_crdb, slow, StringIO)
import psycopg2
@ -330,6 +330,7 @@ class ExceptionsTestCase(ConnectingTestCase):
class TestExtensionModule(unittest.TestCase):
@slow
@skip_if_windows
def test_import_internal(self):
# check that the internal package can be imported "naked"
# we may break this property if there is a compelling reason to do so,


@ -23,13 +23,15 @@
# FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public
# License for more details.
import os
import unittest
from collections import deque
from functools import partial
import psycopg2
from psycopg2 import extensions
from psycopg2.extensions import Notify
from .testutils import ConnectingTestCase, skip_if_crdb, slow
from .testutils import ConnectingTestCase, skip_if_crdb, skip_if_windows, slow
from .testconfig import dsn
import sys
@ -74,7 +76,9 @@ conn.close()
module=psycopg2.__name__,
dsn=dsn, sec=sec, name=name, payload=payload))
return Popen([sys.executable, '-c', script], stdout=PIPE)
env = os.environ.copy()
env.pop("PSYCOPG_DEBUG", None)
return Popen([sys.executable, '-c', script], stdout=PIPE, env=env)
@slow
def test_notifies_received_on_poll(self):
@ -126,6 +130,52 @@ conn.close()
self.assertEqual(pid, self.conn.notifies[0][0])
self.assertEqual('foo', self.conn.notifies[0][1])
def _test_notifies_received_on_operation(self, operation, execute_query=True):
self.listen('foo')
self.conn.commit()
if execute_query:
self.conn.cursor().execute('select 1;')
pid = int(self.notify('foo').communicate()[0])
self.assertEqual(0, len(self.conn.notifies))
operation()
self.assertEqual(1, len(self.conn.notifies))
self.assertEqual(pid, self.conn.notifies[0][0])
self.assertEqual('foo', self.conn.notifies[0][1])
@slow
@skip_if_windows
def test_notifies_received_on_commit(self):
self._test_notifies_received_on_operation(self.conn.commit)
@slow
@skip_if_windows
def test_notifies_received_on_rollback(self):
self._test_notifies_received_on_operation(self.conn.rollback)
@slow
@skip_if_windows
def test_notifies_received_on_reset(self):
self._test_notifies_received_on_operation(self.conn.reset, execute_query=False)
@slow
@skip_if_windows
def test_notifies_received_on_set_session(self):
self._test_notifies_received_on_operation(
partial(self.conn.set_session, autocommit=True, readonly=True),
execute_query=False,
)
@slow
@skip_if_windows
def test_notifies_received_on_set_client_encoding(self):
self._test_notifies_received_on_operation(
partial(
self.conn.set_client_encoding,
'LATIN1' if self.conn.encoding != 'LATIN1' else 'UTF8'
),
execute_query=False,
)
@slow
def test_notify_object(self):
self.autocommit(self.conn)


@ -272,6 +272,8 @@ class TypesBasicTests(ConnectingTestCase):
]:
curs.execute("select %s::int[]", (a,))
self.assertEqual(curs.fetchone()[0], a)
curs.execute("select array[%s::int[]]", (a,))
self.assertEqual(curs.fetchone()[0], [a])
def testTypeRoundtripBytes(self):
o1 = bytes(range(256))


@ -584,6 +584,68 @@ class AdaptTypeTestCase(ConnectingTestCase):
curs.execute("select (4,8)::typens.typens_ii")
self.assertEqual(curs.fetchone()[0], (4, 8))
@skip_if_no_composite
def test_composite_namespace_path(self):
curs = self.conn.cursor()
curs.execute("""
select nspname from pg_namespace
where nspname = 'typens';
""")
if not curs.fetchone():
curs.execute("create schema typens;")
self.conn.commit()
self._create_type("typens.typensp_ii",
[("a", "integer"), ("b", "integer")])
curs.execute("set search_path=typens,public")
t = psycopg2.extras.register_composite(
"typensp_ii", self.conn)
self.assertEqual(t.schema, 'typens')
curs.execute("select (4,8)::typensp_ii")
self.assertEqual(curs.fetchone()[0], (4, 8))
@skip_if_no_composite
def test_composite_weird_name(self):
curs = self.conn.cursor()
curs.execute("""
select nspname from pg_namespace
where nspname = 'qux.quux';
""")
if not curs.fetchone():
curs.execute('create schema "qux.quux";')
self._create_type('"qux.quux"."foo.bar"',
[("a", "integer"), ("b", "integer")])
t = psycopg2.extras.register_composite(
'"qux.quux"."foo.bar"', self.conn)
self.assertEqual(t.name, 'foo.bar')
self.assertEqual(t.schema, 'qux.quux')
curs.execute('select (4,8)::"qux.quux"."foo.bar"')
self.assertEqual(curs.fetchone()[0], (4, 8))
@skip_if_no_composite
def test_composite_not_found(self):
self.assertRaises(
psycopg2.ProgrammingError, psycopg2.extras.register_composite,
"nosuchtype", self.conn)
self.assertEqual(self.conn.status, ext.STATUS_READY)
cur = self.conn.cursor()
cur.execute("select 1")
self.assertRaises(
psycopg2.ProgrammingError, psycopg2.extras.register_composite,
"nosuchtype", self.conn)
self.assertEqual(self.conn.status, ext.STATUS_IN_TRANSACTION)
self.conn.rollback()
self.conn.autocommit = True
self.assertRaises(
psycopg2.ProgrammingError, psycopg2.extras.register_composite,
"nosuchtype", self.conn)
self.assertEqual(self.conn.status, ext.STATUS_READY)
@skip_if_no_composite
@skip_before_postgres(8, 4)
def test_composite_array(self):
@ -710,22 +772,15 @@ class AdaptTypeTestCase(ConnectingTestCase):
def _create_type(self, name, fields):
curs = self.conn.cursor()
try:
curs.execute("savepoint x")
curs.execute(f"drop type {name} cascade;")
except psycopg2.ProgrammingError:
self.conn.rollback()
curs.execute("rollback to savepoint x")
curs.execute("create type {} as ({});".format(name,
", ".join(["%s %s" % p for p in fields])))
if '.' in name:
schema, name = name.split('.')
else:
schema = 'public'
curs.execute("""\
SELECT t.oid
FROM pg_type t JOIN pg_namespace ns ON typnamespace = ns.oid
WHERE typname = %s and nspname = %s;
""", (name, schema))
curs.execute("SELECT %s::regtype::oid", (name, ))
oid = curs.fetchone()[0]
self.conn.commit()
return oid
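The simplified OID lookup above relies on the `regtype` cast resolving a (possibly schema-qualified, possibly quoted) type name in one step; a minimal sketch with psycopg2, assuming a reachable database configured through the usual PG* environment variables:

import psycopg2

conn = psycopg2.connect("")  # connection parameters taken from the environment
cur = conn.cursor()
cur.execute("SELECT %s::regtype::oid", ("integer",))
print(cur.fetchone()[0])  # 23, the OID of the built-in integer type
conn.close()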
@ -1560,6 +1615,18 @@ class RangeCasterTestCase(ConnectingTestCase):
cur = self.conn.cursor()
self.assertRaises(psycopg2.ProgrammingError,
register_range, 'nosuchrange', 'FailRange', cur)
self.assertEqual(self.conn.status, ext.STATUS_READY)
cur.execute("select 1")
self.assertRaises(psycopg2.ProgrammingError,
register_range, 'nosuchrange', 'FailRange', cur)
self.assertEqual(self.conn.status, ext.STATUS_IN_TRANSACTION)
self.conn.rollback()
self.conn.autocommit = True
self.assertRaises(psycopg2.ProgrammingError,
register_range, 'nosuchrange', 'FailRange', cur)
@restore_types
def test_schema_range(self):
@ -1574,7 +1641,7 @@ class RangeCasterTestCase(ConnectingTestCase):
register_range('r1', 'r1', cur)
ra2 = register_range('r2', 'r2', cur)
rars2 = register_range('rs.r2', 'r2', cur)
register_range('rs.r3', 'r3', cur)
rars3 = register_range('rs.r3', 'r3', cur)
self.assertNotEqual(
ra2.typecaster.values[0],
@ -1588,6 +1655,27 @@ class RangeCasterTestCase(ConnectingTestCase):
register_range, 'rs.r1', 'FailRange', cur)
cur.execute("rollback to savepoint x;")
cur2 = self.conn.cursor()
cur2.execute("set local search_path to rs,public")
ra3 = register_range('r3', 'r3', cur2)
self.assertEqual(ra3.typecaster.values[0], rars3.typecaster.values[0])
@skip_if_no_composite
def test_rang_weird_name(self):
cur = self.conn.cursor()
cur.execute("""
select nspname from pg_namespace
where nspname = 'qux.quux';
""")
if not cur.fetchone():
cur.execute('create schema "qux.quux";')
cur.execute('create type "qux.quux"."foo.range" as range (subtype=text)')
r = psycopg2.extras.register_range(
'"qux.quux"."foo.range"', "foorange", cur)
cur.execute('''select '[a,z]'::"qux.quux"."foo.range"''')
self.assertEqual(cur.fetchone()[0], r.range('a', 'z', '[]'))
def test_suite():
return unittest.TestLoader().loadTestsFromName(__name__)


@ -290,7 +290,7 @@ class WithCursorTestCase(WithTestCase):
self.assert_(curs.closed)
self.assert_(closes)
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
def test_exception_swallow(self):
# bug #262: __exit__ calls cur.close() that hides the exception
# with another error.
@ -304,7 +304,7 @@ class WithCursorTestCase(WithTestCase):
else:
self.fail("where is my exception?")
@skip_if_crdb("named cursor")
@skip_if_crdb("named cursor", version="< 22.1")
@skip_before_postgres(8, 2)
def test_named_with_noop(self):
with self.conn.cursor('named'):


@ -467,11 +467,13 @@ def skip_if_crdb(reason, conn=None, version=None):
crdb_reasons = {
"2-phase commit": 22329,
"backend pid": 35897,
"batch statements": 44803,
"cancel": 41335,
"cast adds tz": 51692,
"cidr": 18846,
"composite": 27792,
"copy": 41608,
"cursor with hold": 77101,
"deferrable": 48307,
"encoding": 35882,
"hstore": 41284,
@ -483,6 +485,7 @@ crdb_reasons = {
"notify": 41522,
"password_encryption": 42519,
"range": 41282,
"scroll cursor": 77102,
"stored procedure": 1751,
}


@ -1,5 +1,5 @@
[tox]
envlist = {3.6,3.7,3.8,3.9,3.10,3.11}
envlist = {3.8,3.9,3.10,3.11,3.12,3.13}
[testenv]
commands = make check