More or less a copy of the normal section config test, but now with
properties defined multiple times as well as conflicting options.
Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
[ TL: improve consistency with property-isolation terminology ]
Signed-off-by: Thomas Lamprecht <t.lamprecht@proxmox.com>
to compare nested hashes/lists and scalar values recursively.
Also includes some tests.
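The commit text does not show the helper itself; a minimal sketch of such
a recursive comparison could look like this (the function name is made up
for illustration, not necessarily how the actual implementation works):

    # sketch of a recursive deep-compare (illustrative only)
    sub deep_equals_sketch {
        my ($x, $y) = @_;

        return 0 if defined($x) != defined($y);
        return 1 if !defined($x);               # both undef

        my $ref = ref($x);
        return 0 if $ref ne ref($y);

        if ($ref eq 'HASH') {
            return 0 if scalar(keys %$x) != scalar(keys %$y);
            for my $k (keys %$x) {
                return 0 if !exists($y->{$k});
                return 0 if !deep_equals_sketch($x->{$k}, $y->{$k});
            }
            return 1;
        } elsif ($ref eq 'ARRAY') {
            return 0 if scalar(@$x) != scalar(@$y);
            for my $i (0 .. $#$x) {
                return 0 if !deep_equals_sketch($x->[$i], $y->[$i]);
            }
            return 1;
        }

        return $x eq $y ? 1 : 0;                 # plain scalar values
    }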
Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
Signed-off-by: Thomas Lamprecht <t.lamprecht@proxmox.com>
a few things were missing for it to work:
* on the CLI, we have to get the option as an array if the type is an
  array
* the untainting must be done recursively, otherwise the regex matching
  converts an array reference into a string like 'ARRAY(0x123412341234)'
* JSONSchema::parse_config did not handle array formats specially, but
  we want to allow specifying them multiple times
* the biggest point: in the RESTHandler, to be compatible with the
  current GUI behavior, we have to rewrite two parameter types (see the
  sketch below):
  - when the API defines a '-list' format for a string type, but we get
    a list (because of the changes in http-server), we join the list
    with commas into a string
  - when the API defines an 'array' type, but we get a scalar value, we
    wrap the value in an array (because with www-form-urlencoded you
    cannot send an array with a single value)
Also add tests for this behavior, some of which we want to deprecate and
remove in the future.
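The two rewrites could be sketched roughly like this (hypothetical helper
name and a simplified schema layout, not the actual RESTHandler code):

    # sketch: normalize one parameter value against its (simplified) schema
    sub normalize_param_sketch {
        my ($schema, $value) = @_;

        my $type = $schema->{type} // '';
        my $format = $schema->{format} // '';

        # string type with a '-list' format, but the http-server handed us
        # an array: join the elements with commas into a single string
        if ($type eq 'string' && $format =~ /-list$/ && ref($value) eq 'ARRAY') {
            return join(',', @$value);
        }

        # array type, but we only got a scalar (www-form-urlencoded cannot
        # express a single-element array): wrap the value in an array
        if ($type eq 'array' && !ref($value)) {
            return [ $value ];
        }

        return $value;
    }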
Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
converting 0.5 GB to MB resulted in 0 MB;
with this patch it correctly returns 512 MB.
also add tests and catch more errors
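The commit does not show the code, but the symptom matches truncating
before scaling; a tiny illustration (made-up function names, the real
conversion helper may differ):

    # buggy: truncating to an integer first throws away the fraction
    sub gb_to_mb_buggy { my ($gb) = @_; return int($gb) * 1024; }   # 0.5 -> 0

    # fixed: scale first, then truncate the result
    sub gb_to_mb_fixed { my ($gb) = @_; return int($gb * 1024); }   # 0.5 -> 512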
Signed-off-by: Dominik Csapak <d.csapak@proxmox.com>
Previously an external exception (e.g. caused by a SIGALRM in code
which is already inside a run_with_timeout() call) could happen in
various places where we did not properly handle this situation.
For instance after calling $lock_func() but before reaching the cleanup
code. In this case a lock was leaked.
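A much simplified sketch of that window (hypothetical names, not the
actual lock_file_full() code):

    # sketch: an exception between taking the lock and recording the
    # handle for cleanup leaks the flock
    sub locked_call_sketch {
        my ($lock_func, $lock_handles, $filename, $code) = @_;
        eval {
            my $fh = $lock_func->();   # opens the file and takes the flock
            # an exception raised right here -- e.g. by the SIGALRM handler
            # of an enclosing run_with_timeout() -- unwinds the stack before
            # the handle is recorded below, so nothing ever closes it
            $lock_handles->{$$}->{$filename} = { fh => $fh };
            $code->();
        };
        if (my $err = $@) {
            # the error path has no reference to the new handle -> leak
            die $err;
        }
    }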
Additionally the code was broken in that it used Perl's automatic hash
creation side effect ($a->{x}->{y} implicitly initializing $a->{x} with
an empty hash when it did not exist). The effect was that if our own
timeout was triggered after the initial check for an existing file
handle inside $lock_func() happened (extremely rare, since Perl would
have to be running insanely slow), the cleanup did:
if (my $fh = $lock_handles->{$$}->{$filename}->{fh}) {
This recreated $lock_handles->{$$}->{$filename} as an empty hash.
A subsequent call to lock_file_full() would then think a file handle
already exists, because the check simply used:
if (!$lock_handles->{$$}->{$filename}) {
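The autovivification side effect is easy to demonstrate in isolation
(standalone snippet, not part of the patch):

    my $lock_handles = {};
    my ($pid, $filename) = ($$, '/run/example.lock');

    # merely *reading* the nested element creates the intermediate hashes
    if (my $fh = $lock_handles->{$pid}->{$filename}->{fh}) {
        # never reached, there is no fh
    }

    # ... yet a later truthiness check on the intermediate level succeeds
    print "entry exists\n" if $lock_handles->{$pid}->{$filename};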
While this could have been a one-line fix for this one particular case,
we'd still not be taking external timeouts into account, causing the
first issue described above.