Enable the old environment-variable passing of settings for json_process
so that the current autoconf test suite scripts keep working unchanged.
json_process now takes an optional command line parameter --env which
causes it to use the old method of reading the settings from
environment variables instead of directly from the "env" file.
Also added a --strip command line option; this will be used to
run the stripped tests with CMake as well.
We should use json_int_t instead of int, since its size can differ
depending on the platform. MSVC also warns that converting long long
to int can cause a "loss of information".
gcc 4.2.1 warns about a possibly incorrect cast to ssize_t when comparing
against refcount, which is of type size_t (a signed vs. unsigned comparison).
Since warnings are treated as errors (as in the autoconf project), this would
cause a compile error.
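For illustration, a minimal standalone example of the kind of mixed comparison
involved and the explicit cast that keeps it well defined (the variable names
are made up, not the actual test code):

    #include <stdio.h>
    #include <sys/types.h>   /* ssize_t; not available everywhere, see below */

    int main(void)
    {
        size_t refcount = 1;
        ssize_t expected = 1;

        /* gcc -Wsign-compare (fatal with -Werror) rejects the mixed form:
         *     if (expected == refcount) ...
         * Casting the signed side makes the comparison well defined. */
        if ((size_t)expected == refcount)
            printf("refcount matches\n");
        return 0;
    }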
- Moved everything to one CMakeLists.txt
- Added support for the json_process test suites (instead of just the API
tests).
- Changed to use the modified json_process version that does away with the
environment variables (originally written by DanielT).
- Had to exclude "test_memory_funcs" on MSVC, since the void pointer
  arithmetic done in secure_malloc and secure_free is not allowed
  there.
- Had to add a check for "ssize_t", which is used in test_pack.c but is
  not available on Windows and possibly on some other platforms (a
  possible fallback is sketched after this list).
- Result from running ctest (The failure seems unrelated to CMake, it's
just that the expected result is in a different order):
99% tests passed, 1 tests failed out of 121
Total Test time (real) = 1.31 sec
The following tests FAILED:
24 - valid__complex-array (Failed)
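For reference, a sketch of a possible fallback when the ssize_t check fails;
the HAVE_SSIZE_T macro name is an assumption, not necessarily what the CMake
build actually defines:

    /* Fallback for platforms without ssize_t (e.g. MSVC). HAVE_SSIZE_T is
     * assumed to come from the CMake check described above. */
    #ifndef HAVE_SSIZE_T
    #  ifdef _WIN32
    #    include <BaseTsd.h>
         typedef SSIZE_T ssize_t;
    #  else
         typedef long ssize_t;
    #  endif
    #endif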
Added multiple CMake-related files to project.
Supports building the library and the tests.
See CMakeLists.txt for notes on how it works.
I had to adjust 3 existing files in order to disable some configuration
that should be taken care of by cmake/automake anyway.
I also added jansson.def from a future jansson version,
to test cmake's support for .def files (which works fine).
The decimal point '.' is changed to the locale's decimal point
before/after JSON number conversion so that the C standard library's
locale-specific string conversion functions work correctly.
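A minimal sketch of the conversion in the decoding direction, assuming a
helper named to_locale() for illustration (the reverse direction swaps the
character back after the number has been formatted):

    #include <locale.h>
    #include <string.h>

    /* Replace '.' in a JSON number string with the locale's decimal point
     * so that strtod() parses it correctly. Helper name and the in-place,
     * single-byte strategy are illustrative only. */
    static void to_locale(char *strnumber)
    {
        const char *point = localeconv()->decimal_point;
        char *pos;

        if (*point == '.')
            return;              /* nothing to do in '.'-based locales */

        pos = strchr(strnumber, '.');
        if (pos)
            *pos = *point;
    }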
All the tests now call setlocale(LC_ALL, "") on startup to use the
locale set in the environment.
Fixes GH-32.
The keys that are stored temporarily in a hashtable (to make sure that
all object keys are unpacked) were hashed with the object key hashing
function. That function is meant to compute hashes for object_key_t
values and works incorrectly for plain strings.
Fixed by introducing suitable functions for hashing and comparing
strings for string-keyed hashtables.
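A minimal sketch of what such string-keyed helpers might look like; the
function names and the exact hash are illustrative, not the library's actual
implementation:

    #include <stddef.h>
    #include <string.h>

    /* Hash the whole NUL-terminated string (djb2-style). */
    static size_t string_hash(const void *key)
    {
        const unsigned char *str = (const unsigned char *)key;
        size_t hash = 5381;

        while (*str)
            hash = hash * 33 + *str++;
        return hash;
    }

    /* Equality predicate for plain string keys. */
    static int string_equal(const void *key1, const void *key2)
    {
        return strcmp((const char *)key1, (const char *)key2) == 0;
    }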
* Use "printf" instead of "echo -n" as it's more portable
* Treat a test as skipped if it exits with an exit status of 77
* Skip the check_exports test if "nm -D" doesn't work
- Add a new field position to the json_error_t structure. This is the
position in bytes from the beginning of the input.
- Keep track of line, column and input position at the stream level.
  Previously only the line was tracked, and at the lexer level, so
  this info was not available for UTF-8 decoding errors.
- While at it, refactor tests so that no separate "stripped" tests are
required. json_process is now able to strip whitespace from its
input, and the "valid" and "invalid" test suites now use this to
test both non-stripped and stripped input.
Closes GH-9.
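A usage sketch of the richer error information, assuming the current
json_error_t layout (text, line, column, position members) and the
three-argument json_loads() signature:

    #include <stdio.h>
    #include <jansson.h>

    int main(void)
    {
        json_error_t error;
        json_t *root = json_loads("[\"unterminated", 0, &error);

        if (!root) {
            /* line/column locate the failure; position is the byte offset */
            fprintf(stderr, "%s (line %d, column %d, byte %d)\n",
                    error.text, error.line, error.column, error.position);
            return 1;
        }
        json_decref(root);
        return 0;
    }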
Thanks to Basile Starynkevitch for the suggestion and initial patch.
Thanks to Jonathan Landis and Deron Meranda for showing how this can
be utilized for implementing secure memory operations.
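A hedged sketch of the secure memory idea, assuming jansson's allocator
override hook json_set_alloc_funcs(); the secure_malloc/secure_free bodies are
illustrative and mirror the size-prefix trick mentioned in the CMake notes
above:

    #include <stdlib.h>
    #include <string.h>
    #include <jansson.h>

    /* Store the allocation size in front of the block so the memory can be
     * wiped before it is released (char * arithmetic, so MSVC is happy). */
    static void *secure_malloc(size_t size)
    {
        char *block = malloc(size + sizeof(size_t));
        if (!block)
            return NULL;
        memcpy(block, &size, sizeof(size_t));
        return block + sizeof(size_t);
    }

    static void secure_free(void *ptr)
    {
        size_t size;
        char *block = (char *)ptr - sizeof(size_t);

        memcpy(&size, block, sizeof(size_t));
        memset(block, 0, size + sizeof(size_t));   /* wipe before freeing */
        free(block);
    }

    int main(void)
    {
        json_set_alloc_funcs(secure_malloc, secure_free);
        json_decref(json_string("wiped on free"));
        return 0;
    }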
Expand the pack/unpack API: json_(un)pack is the simple version with
no error output, json_(un)pack_ex has error output and flags,
json_v(un)pack_ex is a va_list version of json_(un)pack_ex.
Implement unpacking flags:
- JSON_UNPACK_ONLY turns off extra validation, i.e. array and object
  unpacking doesn't check that all items are unpacked. This is really
  just a convenience for not having to add '*' after each object and array.
- JSON_VALIDATE_ONLY turns off unpacking, i.e. no varargs are expected
and nothing is unpacked.
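A usage sketch of the flags, assuming the json_unpack_ex(root, &error, flags,
fmt, ...) calling convention (object keys are still passed as varargs even
when nothing is unpacked):

    #include <stdio.h>
    #include <jansson.h>

    static void demo(json_t *root)
    {
        json_error_t error;
        int foo;

        /* Only validate root against the format; no value varargs are
         * consumed and nothing is stored. */
        if (json_unpack_ex(root, &error, JSON_VALIDATE_ONLY, "{s:i}", "foo"))
            fprintf(stderr, "validation failed: %s\n", error.text);

        /* Unpack "foo" without the extra "everything was unpacked" check
         * on containers. */
        if (json_unpack_ex(root, &error, JSON_UNPACK_ONLY, "{s:i}", "foo", &foo))
            fprintf(stderr, "unpack failed: %s\n", error.text);
    }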
* Implement a "scanner" that reads the format string, maintaining state
* Split json_vnpack() to three separate functions for packing objects,
arrays and simple values. This makes it more clear what is being
packed, and the object and array structures become more evident.
* Make the skipping of ignored characters simpler, i.e. skip ':' and
  ',' independently of their context
This patch shaves around 80 lines of code from the original
implementation.
Note that we pass va_list pointers around instead of just va_lists, which
would seem more intuitive. This is necessary since the behaviour of va_lists
passed as function parameters is finicky. Quoth stdarg(3):
If ap is passed to a function that uses va_arg(ap,type) then the value
of ap is undefined after the return of that function.
The pointer-passing strategy is used by Python's Py_BuildValue() for the same
purpose.
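A minimal standalone illustration of the pointer-passing pattern (names are
invented for the example; the library's internal helpers differ):

    #include <stdarg.h>
    #include <stdio.h>

    /* The helper advances the caller's va_list through the pointer, so the
     * caller can keep consuming arguments afterwards - which is not portable
     * if the va_list itself is passed by value. */
    static int next_int(va_list *ap)
    {
        return va_arg(*ap, int);
    }

    static void sum_ints(int count, ...)
    {
        va_list ap;
        int i, sum = 0;

        va_start(ap, count);
        for (i = 0; i < count; i++)
            sum += next_int(&ap);
        va_end(ap);

        printf("sum = %d\n", sum);
    }

    int main(void)
    {
        sum_ints(3, 1, 2, 3);    /* prints "sum = 6" */
        return 0;
    }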
After looking at the new code for a few days, I didn't like it
anymore. To prepare for the future, a few fields will be added to the
json_error_t struct later.
This reverts commit 23dd078c8d. Some
adjustments were needed because of newer commits.
All decoding functions now accept a json_error_t** parameter and set
it to point to a heap-allocated json_error_t structure if an error
occurs. The contents of json_error_t are no longer exposed directly;
accessor functions have been added instead. If an error occurs,
the user must free the json_error_t value.
This makes it possible to enhance the error reporting facilities in
the future without breaking ABI compatibility with older versions.
This is a backwards incompatible change.
As of now, the parameter is unused, but may be needed in the future.
I'm adding it now so that in the future both API and ABI remain
backwards compatible as long as possible.
This is a backwards incompatible change.
json_int_t is typedef'd to long long if it's supported, or long
otherwise. There are also some supporting additions, like the
JSON_INTEGER_FORMAT macro that expands to the printf() conversion
specifier corresponding to json_int_t's actual type.
This is a backwards incompatible change.
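A usage sketch of the new type and the accompanying format macro:

    #include <stdio.h>
    #include <jansson.h>

    int main(void)
    {
        json_t *big = json_integer(4000000000LL);   /* needs more than 32 bits */

        /* JSON_INTEGER_FORMAT expands to the right printf() conversion
         * specifier for whatever json_int_t actually is. */
        printf("value: %" JSON_INTEGER_FORMAT "\n", json_integer_value(big));

        json_decref(big);
        return 0;
    }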
Replace all occurrences of unsigned int and unsigned long with size_t.
This is a backwards incompatible change, as the signature of many API
functions changes.
When encoding an array or object ends in an error, the visited flag
wasn't zeroed, causing subsequent encoding attempts to fail. This
patch fixes the problem by always zeroing the visited flag.
Encoding an empty array or object worked, but encoding it again
(possibly after adding some items) failed, because the visited flag
(used for detecting circular references) wasn't zeroed.
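Roughly the failing sequence (flags left at 0 for brevity):

    #include <stdio.h>
    #include <stdlib.h>
    #include <jansson.h>

    int main(void)
    {
        json_t *array = json_array();
        char *first, *second;

        first = json_dumps(array, 0);                   /* "[]" */
        json_array_append_new(array, json_integer(42));
        second = json_dumps(array, 0);                  /* used to return NULL */

        printf("%s -> %s\n", first, second ? second : "(encoding failed)");

        free(first);
        free(second);
        json_decref(array);
        return 0;
    }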
Initialize their reference counts to (unsigned int)-1 to disable
reference counting on them. It was already meant to work like this,
but the reference counts were just initialized to 1 instead of -1.
Thanks to Andrew Thompson for reporting this issue.
With this encoding flag, the object key-value pairs in output are in
the same order in which they were first inserted into the object.
To make this possible, a key of an object is now a serial number plus
a string. An object keeps an increasing counter which is used to
assign serial numbers to the keys. Hashing, comparison and public API
functions were changed to act only on the string part, i.e. the serial
number is ignored everywhere except in the encoder, where it's used
to order object keys when the JSON_PRESERVE_ORDER flag is used.
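A sketch of what such a composite key might look like internally; the exact
layout is illustrative, not necessarily the library's definition:

    #include <stddef.h>

    /* Each object key carries the string plus the serial number taken from
     * the object's insertion counter. Hashing and comparison look only at
     * key; the encoder sorts by serial when JSON_PRESERVE_ORDER is set. */
    typedef struct {
        size_t serial;   /* insertion order */
        char key[1];     /* string data allocated inline after the struct */
    } object_key_t;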
This is to better catch distribution errors. It's easier to notice
that run-tests fails than to notice that one of many test suites is
silently skipped.
Added functions are:
* json_string_nocheck()
* json_string_set_nocheck()
* json_object_set_nocheck()
* json_object_set_new_nocheck()
These functions don't check that their string argument is valid UTF-8,
but assume that the user has already performed the check.
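A usage sketch; the only difference from the checked variants is that the
UTF-8 validation step is skipped:

    #include <jansson.h>

    /* name is known to be valid UTF-8 already (e.g. it came out of another
     * json_t string), so re-validation can be skipped. */
    static void add_name(json_t *object, const char *name)
    {
        json_object_set_new_nocheck(object, "name", json_string_nocheck(name));
    }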
* Now that JSON_SORT_KEYS is implemented, take it into use with the
valid and valid-strip suites. This is to ensure that the tests
remain valid even if the string hash function is changed in the
future.
* Remove test_dump API test. Instead, implement the same tests more
elegantly in the encoding-flags suite.
- Never append newline to output
- By default, add spaces between array and object items for more
readable output
- Introduce the flag JSON_COMPACT to not add the aforementioned spaces
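A quick illustration of the resulting knobs, assuming the current
json_dumps() flags interface:

    #include <stdio.h>
    #include <stdlib.h>
    #include <jansson.h>

    int main(void)
    {
        json_t *obj = json_pack("{s:i, s:[i,i]}", "b", 1, "a", 2, 3);

        char *readable = json_dumps(obj, 0);             /* spaces, no newline */
        char *compact  = json_dumps(obj, JSON_COMPACT);  /* no extra whitespace */
        char *sorted   = json_dumps(obj, JSON_SORT_KEYS | JSON_INDENT(2));

        printf("%s\n%s\n%s\n", readable, compact, sorted);

        free(readable);
        free(compact);
        free(sorted);
        json_decref(obj);
        return 0;
    }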
Failing to do this has the effect that the error message is not
returned when the input file cannot be opened (e.g. if it doesn't
exist).
Thanks to Martin Vopatek for reporting.
It's now an error to try to add an object or array to itself. The
encoder checks for circular references and fails with an error status
if one is detected.
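For example, assuming the usual -1 error return of the container functions:

    #include <assert.h>
    #include <jansson.h>

    int main(void)
    {
        json_t *array = json_array();
        json_t *object = json_object();

        /* Direct self-insertion is now rejected. */
        assert(json_array_append(array, array) == -1);
        assert(json_object_set(object, "self", object) == -1);

        json_decref(array);
        json_decref(object);
        return 0;
    }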
Added functions:
json_string_set
json_integer_set
json_real_set
While at it, clarify the documentation and parameter naming of
json_{string,integer,real}_value() a bit.
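Usage is the obvious in-place update; like the rest of the API, the setters
return 0 on success and -1 on error:

    #include <jansson.h>

    /* Update existing values in place instead of allocating replacements. */
    static int refresh(json_t *name, json_t *count, json_t *ratio)
    {
        if (json_string_set(name, "updated") ||
            json_integer_set(count, 42) ||
            json_real_set(ratio, 0.5))
            return -1;
        return 0;
    }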
That is, test cases where there's no newline or other whitespace at
the beginning or end of input. This was implemented by adding a
--strip option to split-testfile to strip the input file after writing
it.
The actual test JSON texts are the same as testdata/invalid and
testdata/valid. The expected output of the invalid cases had to be
adjusted a bit: because there's no newline at the end, some of the
line numbers needed to be changed.
All pointer arguments are now tested for NULL. json_string() now also
tests that strdup() succeeds. This is to ensure that no NULL values
end up in data structures.
Also describe the different sources of errors in the documentation.
Before, only the syntax level (parse_*) was able to set the error
string. This patch fixes the situation so that lexical (lex_*) and
stream (stream_*) levels can report detailed error messages.
Also, the stream now returns EOF instead of 0 on error.
It's no longer necessary to load the whole input into a string and
then parse from that string. Instead, the input is read as needed from
a string or a file.
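A sketch of decoding straight from a file, shown with the current
flags-taking json_loadf() signature (the function originally had no flags
argument):

    #include <stdio.h>
    #include <jansson.h>

    int main(void)
    {
        json_error_t error;
        FILE *input = fopen("input.json", "r");
        json_t *root;

        if (!input)
            return 1;

        /* The decoder pulls bytes from the stream as it needs them instead
         * of reading the whole file into one big string first. */
        root = json_loadf(input, 0, &error);
        fclose(input);

        if (!root) {
            fprintf(stderr, "parse error on line %d: %s\n", error.line, error.text);
            return 1;
        }
        json_decref(root);
        return 0;
    }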