Only run the imprecision part of the test if json_int_t is long long; otherwise
the imprecision cannot be tested against json_int_t.
Also, convert the double value to json_int_t before checking for the
imprecision, because otherwise the json_int_t is converted to double for the
comparison and the imprecision happens in the "wrong direction", hiding the difference.
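A minimal illustration of the direction issue, using plain long long (the case the test now covers); 2^53 + 1 is the smallest positive integer a double cannot represent exactly:

    #include <stdio.h>

    int main(void)
    {
        long long big = 9007199254740993LL;  /* 2^53 + 1, not exactly representable as a double */
        double rounded = (double)big;        /* rounds to 9007199254740992.0 */

        /* The usual arithmetic conversions turn 'big' into a double here, so the
         * same rounding happens on both sides and the imprecision stays hidden. */
        printf("compared as double:    %s\n", big == rounded ? "equal" : "different");

        /* Converting the double back to long long first keeps the difference visible. */
        printf("compared as long long: %s\n", big == (long long)rounded ? "equal" : "different");
        return 0;
    }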
In the "/* perform the same update again */" area the error
message should be "unable to update an non-empty object"
instead of "unable to update an empty object".
Signed-off-by: Andrei Epure <epure.andrei@gmail.com>
Enable passing settings to json_process via the old environment variables
so that the current autoconf test suite scripts can keep running nicely.
json_process now takes an optional command line parameter --env, which
causes it to use the old method of reading the settings from
environment variables instead of directly from the "env" file.
Also added a --strip command line option; this will be used to
run the stripped tests with CMake as well.
We should use json_int_t instead of int, since its size can differ
between platforms. MSVC also warns that this can cause a "loss of
information", since it converts long long to int.
gcc 4.2.1 warns about a possibly incorrect cast to ssize_t when comparing against refcount, which is of type size_t: a signed vs. unsigned comparison.
Since warnings are treated as errors (as in the autoconf project), this would cause a compile error.
- Moved everything to one CMakeLists.txt
- Added support for the json_process test suites (instead of just the API
tests).
- Changed to use the modified json_process version that does away with the
environment variables (originally written by DanielT).
- Had to exclude "test_memory_funcs" on MSVC, since void pointer
  arithmetic is not allowed, and that is exactly what secure_malloc and
  secure_free do (see the sketch at the end of this message).
- Had to add a check for "ssize_t". This is not available on Windows and
  maybe on some other platforms (used in test_pack.c).
- Result from running ctest (The failure seems unrelated to CMake, it's
just that the expected result is in a different order):
99% tests passed, 1 tests failed out of 121
Total Test time (real) = 1.31 sec
The following tests FAILED:
24 - valid__complex-array (Failed)
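For reference, a minimal sketch of the kind of code MSVC rejects in that test: offsetting a void pointer is a GNU extension, so a standard-C version has to go through char * instead. The bodies below are illustrative approximations, not the test's exact code.

    #include <stdlib.h>
    #include <string.h>

    static void *secure_malloc_sketch(size_t size)
    {
        /* store the allocation size in front of the block handed to the caller */
        void *ptr = malloc(size + 8);
        if (!ptr)
            return NULL;
        *((size_t *)ptr) = size;
        return (char *)ptr + 8;    /* "ptr + 8" directly on the void * is what MSVC refuses */
    }

    static void secure_free_sketch(void *ptr)
    {
        size_t size;

        ptr = (char *)ptr - 8;     /* step back to the stored size */
        size = *((size_t *)ptr);

        memset(ptr, 0, size + 8);  /* wipe the memory before releasing it */
        free(ptr);
    }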
Added multiple CMake-related files to the project.
Supports building the library and the tests.
See CMakeLists.txt for notes on how it works.
I had to adjust 3 existing files in order to disable some configuration
that should be taken care of by cmake/automake anyway.
I also added jansson.def from a future jansson version,
to test cmake's support for .def files (which works fine).
The decimal point '.' is changed to the locale's decimal point
before/after JSON conversion so that the C standard library's
locale-specific string conversion functions work correctly.
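A minimal sketch of the decoding direction, assuming the usual localeconv() approach and a single-byte decimal point (the function name is hypothetical):

    #include <locale.h>
    #include <string.h>

    /* Before handing a JSON number to strtod(), replace '.' with the locale's
     * decimal point so the locale-aware conversion parses the whole value. */
    static void to_locale_sketch(char *buffer)
    {
        const char *point = localeconv()->decimal_point;
        char *pos;

        if (*point == '.')
            return;                /* nothing to do in the "C" locale */

        pos = strchr(buffer, '.');
        if (pos)
            *pos = *point;
    }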
All the tests now call setlocale(LC_ALL, "") on startup to use the
locale set in the environment.
Fixes GH-32.
The keys that are stored temporarily in a hashtable (to make sure that
all object keys are unpacked) were hashed with the object key hashing
function. That function is meant to compute hashes for object_key_t
values, and it works incorrectly for plain strings.
Fixed by introducing suitable functions for hashing and comparing
strings for string-keyed hashtables.
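A minimal sketch of what such string-keyed callbacks look like (hypothetical names), hashing and comparing plain NUL-terminated strings rather than object_key_t values:

    #include <stddef.h>
    #include <string.h>

    static size_t string_hash_sketch(const void *key)
    {
        const char *str = (const char *)key;
        size_t hash = 5381;

        while (*str)
            hash = hash * 33 + (unsigned char)*str++;   /* djb2-style string hash */

        return hash;
    }

    /* returns non-zero when the two keys are equal */
    static int string_equal_sketch(const void *key1, const void *key2)
    {
        return strcmp((const char *)key1, (const char *)key2) == 0;
    }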
* Use "printf" instead of "echo -n" as it's more portable
* Treat a test as skipped if it exits with exit status of 77
* Skip the check_exports test if "nm -D" doesn't work
- Add a new field, position, to the json_error_t structure. This is the
  position in bytes from the beginning of the input (see the usage
  sketch at the end of this message).
- Keep track of line, column and input position at the stream level.
  Previously, only the line was tracked, and that happened at the lexer
  level, so this info was not available for UTF-8 decoding errors.
- While at it, refactor tests so that no separate "stripped" tests are
required. json_process is now able to strip whitespace from its
input, and the "valid" and "invalid" test suites now use this to
test both non-stripped and stripped input.
Closes GH-9.
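A minimal usage sketch, assuming the current json_loads(input, flags, &error) signature; on a decoding error the byte offset is now available alongside line and column:

    #include <stdio.h>
    #include <jansson.h>

    int main(void)
    {
        json_error_t error;
        json_t *value = json_loads("[1, 2,", 0, &error);

        if (!value) {
            fprintf(stderr, "%s at line %d, column %d (byte %d)\n",
                    error.text, error.line, error.column, error.position);
            return 1;
        }

        json_decref(value);
        return 0;
    }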
Thanks to Basile Starynkevitch for the suggestion and initial patch.
Thanks to Jonathan Landis and Deron Meranda for showing how this can
be utilized for implementing secure memory operations.
Expand the pack/unpack API: json_(un)pack is the simple version with
no error output; json_(un)pack_ex has error output and flags;
json_v(un)pack_ex is a va_list version of json_(un)pack_ex.
Implement unpacking flags:
- JSON_UNPACK_ONLY turns off extra validation, i.e. array and object
unpacking doesn't check that all items are unpacked. This is really
just convenience for not adding '*' after each object and array.
- JSON_VALIDATE_ONLY turns off unpacking, i.e. no varargs are expected
and nothing is unpacked.
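A minimal usage sketch, assuming the json_unpack_ex(root, &error, flags, fmt, ...) calling convention described above:

    #include <stdio.h>
    #include <jansson.h>

    int main(void)
    {
        json_error_t error;
        int id;
        const char *name;
        json_t *root = json_pack("{s:i, s:s}", "id", 42, "name", "example");

        /* JSON_VALIDATE_ONLY: check the shape only, no output pointers */
        if (json_unpack_ex(root, &error, JSON_VALIDATE_ONLY, "{s:i, s:s}", "id", "name"))
            fprintf(stderr, "validation failed: %s\n", error.text);

        /* plain unpacking with error reporting */
        if (json_unpack_ex(root, &error, 0, "{s:i, s:s}", "id", &id, "name", &name))
            fprintf(stderr, "unpack failed: %s\n", error.text);
        else
            printf("id=%d name=%s\n", id, name);

        json_decref(root);
        return 0;
    }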
* Implement a "scanner" that reads the format string, maintaining state
* Split json_vnpack() into three separate functions for packing objects,
  arrays and simple values. This makes it clearer what is being
  packed, and the object and array structures become more evident.
* Make the skipping of ignored characters simpler, i.e. skip ':' and
  ',' independently of their context
This patch shaves around 80 lines of code from the original
implementation.
Note that we pass va_list pointers around instead of just va_lists, which
would seem more intuitive. This is necessary since the behaviour of va_lists
passed as function parameters is finicky. Quoth stdarg(3):
If ap is passed to a function that uses va_arg(ap,type) then the value
of ap is undefined after the return of that function.
The pointer-passing strategy is used by Python's Py_BuildValue() for the same
purpose.
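A minimal illustration of the pointer-passing pattern, with hypothetical names and independent of the jansson code itself:

    #include <stdarg.h>

    /* Each helper pulls its arguments through the same va_list pointer, so the
     * position in the argument list is shared across calls. Passing the va_list
     * by value instead would leave it undefined after the helper returns, as
     * stdarg(3) warns. */
    static int next_int(va_list *ap)
    {
        return va_arg(*ap, int);
    }

    static void read_two(int *first, int *second, ...)
    {
        va_list ap;
        va_start(ap, second);

        *first = next_int(&ap);
        *second = next_int(&ap);

        va_end(ap);
    }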
After looking at the new code for a few days, I didn't like it
anymore. To prepare for the future, a few fields will be added to the
json_error_t struct later.
This reverts commit 23dd078c8d. Some
adjustments were needed because of newer commits.
All decoding functions now accept a json_error_t** parameter and set
it to point to a heap-allocated json_error_t structure if an error
occurs. The contents of json_error_t are no longer exposed directly;
a few accessor functions have been added instead. If an error occurs,
the user must free the json_error_t value.
This makes it possible to enhance the error reporting facilities in
the future without breaking ABI compatibility with older versions.
This is a backwards incompatible change.
As of now, the parameter is unused, but may be needed in the future.
I'm adding it now so that both the API and the ABI can remain
backwards compatible for as long as possible.
This is a backwards incompatible change.
json_int_t is typedef'd to long long if it's supported, or long
otherwise. There are also some supporting definitions, like the
JSON_INTEGER_FORMAT macro that expands to the printf() conversion
specifier corresponding to json_int_t's actual type.
This is a backwards incompatible change.
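A minimal usage sketch; JSON_INTEGER_FORMAT carries only the length and conversion characters, so the caller supplies the '%':

    #include <stdio.h>
    #include <jansson.h>

    static void print_integer(json_t *value)
    {
        json_int_t i = json_integer_value(value);
        printf("value: %" JSON_INTEGER_FORMAT "\n", i);
    }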
Replace all occurrences of unsigned int and unsigned long with size_t.
This is a backwards incompatible change, as the signatures of many API
functions change.
When encoding an array or object ended in an error, the visited flag
wasn't zeroed, causing subsequent encoding attempts to fail. This
patch fixes the problem by always zeroing the visited flag.
Encoding an empty array or object worked, but encoding it again
(possibly after adding some items) failed, because the visited flag
(used for detecting circular references) wasn't zeroed.
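A minimal sketch reproducing the symptom described above; before the fix the second json_dumps() call returned NULL, and with the fix both succeed:

    #include <stdio.h>
    #include <stdlib.h>
    #include <jansson.h>

    int main(void)
    {
        json_t *array = json_array();
        json_t *number;
        char *first, *second;

        first = json_dumps(array, 0);    /* "[]" */

        number = json_integer(42);
        json_array_append(array, number);
        json_decref(number);

        second = json_dumps(array, 0);   /* was NULL before this fix */
        printf("first: %s, second: %s\n", first, second ? second : "(null)");

        free(first);
        free(second);
        json_decref(array);
        return 0;
    }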
Initialize their reference counts to (unsigned int)-1 to disable
reference counting on them. It was already meant to work like this,
but the reference counts were just initialized to 1 instead of -1.
Thanks to Andrew Thompson for reporting this issue.