This doesn't work when using C++20, as u8 string literals there produce
values of type "char8_t*", which is different from the expected "char*".
Just assume UTF-8 encoding is always used: we already use the /utf-8 flag
for MSVC, and all Unix compilers should use it anyhow.
Improve error reporting for ODBC, PostgreSQL and SQLite backends, in
particular, provide, or improve, get_error_category() in the exception
classes of all these backends.
See #1235, #1236, #1237.
Try to classify all the possible SQLite result codes using the SOCI
error_category enum.
In particular, this makes it possible to detect connection and permission
errors.
Update the unit test to test the error category too.
Don't use CHECK_THROWS_WITH(): it's convenient but not very flexible,
whereas an explicit try/catch will allow us to add more checks on the
exception object in the upcoming commits.
Use these functions instead of directly using the SQLite3 ones in the test.
This makes no real difference, but allows the implementation of these
functions to be provided in the SOCI SQLite3 backend itself, and so works
even when the SQLite3 library is linked statically into this backend and is
not linked into the test.
Since SQLite doesn't enforce column types, we always have to be prepared
to select huge numbers, even if the declared "type" is e.g. int8.
To do that, override the exchange type (e.g. in the row-based APIs) to
always be (u)int64 for integers, to avoid over-/underflows.
Update the documentation to explain this SQLite peculiarity and add a
test verifying that this behaves as expected.
Fixes #1190.
Closes #1217.
Check that the values of the vector elements that resulted in the error
appear in the output too, at least for the backends that have already been
updated to do it (i.e. none yet, but some of them will be soon).
Check for the exact fragment expected and not just a single word.
Also use REQUIRE_THAT() instead of the CAPTURE() + CHECK() combination;
this is simpler and provides the same information (arguably more clearly).
Check that a partial update does throw an exception, instead of checking
that it may throw one, and check that get_affected_rows() returns the
actual number of affected rows instead of just something non-zero.
The only remaining exclusion is the PostgreSQL ODBC driver, which just
seems to be buggy: notably, its versions < 13.02 inserted valid rows into
the database but still returned a fatal error (later versions don't insert
anything, which is inconsistent with the other drivers and unhelpful, but
not quite as bad).
The unit test relies on failing to insert "a" into an integer column, but
SQLite is perfectly fine with doing this by default, so use an explicit
CHECK constraint to prevent it from succeeding and make the test behave in
the same way as with the other databases.
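The constraint idea, sketched as SQL (the table and column names are
illustrative, not necessarily those used by the test):

```sql
-- Without the CHECK, SQLite happily stores the text 'a' in an INTEGER
-- column; typeof() lets us reject anything that isn't really an integer.
CREATE TABLE soci_test (
    id INTEGER CHECK (typeof(id) = 'integer')
);

-- INSERT INTO soci_test(id) VALUES ('a');  -- now fails the CHECK constraint
-- INSERT INTO soci_test(id) VALUES (1);    -- still succeeds
```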
"Suitable" means that the soci::is_contiguous_resizable_container is
specialized to have true "value" member. This specialization is provided
for std::string and std::vector<T> where sizeof(T) == sizeof(char).
This restores compatibility with the existing code and makes it possible
to read BLOB data into a string again.
Fixes #1173.
Closes #1189.
Work around vector int8_t unit test failure with FreeTDS: due to a bug
in the current versions of FreeTDS ODBC drivers, negative TINYINT values
are stored as positive values in the database and so sorting by them
doesn't work, even if reading them back does work.
The issue (https://github.com/FreeTDS/freetds/issues/627) was fixed in
the latest FreeTDS version, but for now work around it in the SOCI tests
to let them pass even with older versions.
See #1194.
The SQLite documentation states that this can change in the future, and
it's simple enough to turn it off explicitly, so do it.
Also combine two very similar tests into a single one, using two sections
for them instead.
No real changes.
Negative values saved to the database are stored as positive numbers in
at least some versions of the database, so even though they're converted
back to negative values when read back, ordering the result set by them
doesn't work as expected, see #1193.
Until this can be fixed, work around this by sorting the values in the
test itself instead.
We need to read the entire contents of the CLOB in Oracle backend and
not just the number of bytes corresponding to its length in characters
as returned by OCILobGetLength() because this may (and will) be strictly
less than its full size in bytes for any encoding using multiple bytes
per character, such as the de facto standard UTF-8.
Also make reading CLOBs more efficient by doing what Oracle
documentation suggests and using the LOB chunk size for reading.
Finally, add a unit test checking that using non-ASCII strings in UTF-8
(which had to be enabled for the CI) with CLOBs does work.
This commit is best viewed ignoring whitespace-only changes.
This will make it possible to use it for creating functions as well as
procedures (and also procedures with a name other than "soci_test", if
this is ever needed).
Describe the statement once again after calling execute() if we had
failed to describe it before because SQLNumResultCols() returned 0, as
may happen with some complex queries when SQL Server is used via the ODBC
backend.
Closes #1151.
See #1182.
Co-Authored-By: Vadim Zeitlin <vz-soci@zeitlins.org>
Handle wide strings similarly to how normal strings are already handled.
For now, support for them is only available in the ODBC backend.
Also add conversion functions between UTF-{8,16,32} and wchar_t, and
tests for them. Note that some of these functions are not used yet, but
provide the complete set, as they probably will be in the future.
Co-Authored-By: Vadim Zeitlin <vz-soci@zeitlins.org>
Improvements to connection string handling: don't duplicate the parsing
code in the Firebird, Oracle, PostgreSQL and SQLite backends.
Also allow using connection_parameters::set_option() to set an option
instead of having to specify it in the connection string itself.
See #1176.