path: root/tests/010
Commit message (Author, Date; Files changed, Lines -removed/+added)
* hash-eql: regression: always returns zero. (Kaz Kylheku, 2024-02-01; 1 file, -0/+6)

  * hash.c (hash_eql): Use hash_traversal_limit for the initial value of the limit rather than zero. Commit 84e9903c27ede099e2361e15b16a05c6aa4dc819 in October 2019 fixed eql_hash to actually make use of the limit, which broke the assumption that we could use zero.

  * tests/010/hash.tl: Add a few tests for hash-equal and hash-eql.
* json: support Lisp comments. (Kaz Kylheku, 2023-12-20; 1 file, -0/+3)

  I've run into situations in which I wanted a comment in a big JSON quasiliteral to explain some embedded piece of code. We support only semicolon comments, and no #; ignore notation.

  * parser.l (grammar): Recognize Lisp comments in the JSON state also. That does it.

  * tests/010/json.tl: One modest little test.

  * txr.1: Documented.

  * lex.yy.c.shipped: Regenerated.
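  A minimal sketch of what this allows (my own example, assuming the usual #J value mappings; not the committed test):

    ;; Lisp ; comments may now appear between tokens of a #J literal.
    (defvarl config #J{"name" : "txr",    ; project name
                       "debug" : false})  ; JSON false maps to nil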
* hash: new function, hash-join. (Kaz Kylheku, 2023-12-18; 1 file, -0/+5)

  * hash.c (hash_join): New function. (hash_init): hash-join intrinsic registered.

  * hash.h (hash_join): Declared.

  * tests/010/hash.tl: New tests.

  * txr.1: Documented.
* hash: test cases and small doc fix. (Kaz Kylheku, 2023-12-18; 1 file, -0/+18)

  * tests/010/hash.tl: Add test cases for the hash set operations.

  * txr.1: Clarify that in hash-uni, the mapping functions are used on all items, not just ones subject to joinfun.
* New functions: nested-vec-of and nested-vec. (Kaz Kylheku, 2023-09-21; 1 file, -0/+21)

  * eval.c (eval_init): Register nested-vec-of and nested-vec intrinsics.

  * lib.[ch] (vec_allocate, vec_own, vec_init): New static functions. (vector, copy_vec): Expressed in terms of the new functions. (nested_vec_of_v, nested_vec_v): New functions.

  * args.[ch] (args_cat_from): New function.

  * tests/010/vec.tl: New tests.

  * txr.1: Documented.
* json: allow integers and lists. (Kaz Kylheku, 2023-09-03; 1 file, -0/+5)

  * lib.c (out_json_rec): Handle NUM and BGNUM cases same as FLNUM so integers get printed. The restriction against integers has been largely unhelpful and bothersome. Handle LCONS together with CONS. Lists that are not special notation fall through to the VEC case, which now uses seq_iter_t iteration to handle vectors and lists.

  * tests/010/json.tl: New tests.

  * txr.1: Documented support for printing integers and lists.
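  A small sketch of what this enables (the output noted in the comments is what I would expect, not quoted from the commit):

    (put-jsonl '(1 2 3))      ;; a list now prints as a JSON array: [1,2,3]
    (put-jsonl #(1 "a" (2)))  ;; integers and nested lists inside vectors also print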
* tree: bug: tree-delete-specific-node doesn't use key fun (Kaz Kylheku, 2023-08-14; 1 file, -1/+7)

  * tree.c (tr_delete_specific): We can't just use key(node) as the search key; we must apply the tree's key function to the node's key field to retrieve the correct search key.

  * tests/010/tree.tl: New test case which fails without this bugfix: a node which is the left subtree of the root node doesn't get deleted, since the search is led astray by the wrong key object.
* del/replace with index-list: fix semantics. (Kaz Kylheku, 2023-07-18; 2 files, -1/+92)

  This commit does two things.

  The replace function, implemented under the hood by four specializations: replace-list, replace-vec, replace-str and replace-buf, will handle the index-list case a little differently. This is needed to fix the ability of the del macro to work on a place designated by an index list, such as:

    (del [sequence '(1 3 5 6)])

  which now deletes elements 1, 3, 5 and 6 from the sequence, and returns a sequence of those items. The underlying implementation uses replace with an index-list, which is now capable of deleting items.

  Previously, replace would stop processing the index list when the replacement-sequence corresponding to the index list ran out of items. Now, when the replacement-sequence runs out of items, the remaining index-list elements specify items to be deleted. For instance, if str holds "abcdefg" then:

    (set [str '(1 3 5)] "xy")

  will change str to "axcyeg". Elements 1 and 3 are replaced by x and y, respectively. Element 5, the letter f, is deleted, because the replacement "xy" has no element corresponding to 5.

  * lib.c (replace_list, replace_str, replace_vec): Implement the new deletion semantics for the case when the replacement sequence runs out of items.

  * buf.c (replace_buf): Likewise.

  * tests/010/seq.txr: Some new test cases here for deletion.

  * tests/010/seq.expected: Updated.

  * txr.1: Documented the new semantics of replace, including a new restriction that if elements are being deleted, the indices should be monotonically increasing regardless of the type of the sequence (not only list). A value of 289 for the -C option is documented, which restores the previous behavior of replace (breaking deletion by index-list, unfortunately: you don't always get to simulate an old version of TXR while using new features).
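  The two examples above, recast as test-style assertions (test is the macro from tests/common.tl; these exact forms are my sketch, not the committed tests):

    (let ((str (copy "abcdefg")))
      (set [str '(1 3 5)] "xy")
      (test str "axcyeg"))          ;; elements 1 and 3 replaced, element 5 deleted

    (let ((seq (copy "abcdefg")))
      (del [seq '(1 3 5 6)])        ;; deletes b, d, f and g, returning them
      (test seq "ace"))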
* New function: hash-map. (Kaz Kylheku, 2023-06-28; 1 file, -0/+3)

  hash-map converts a function mapping over a sequence into a hash table.

  * hash.[ch] (hash_map): New function.

  * tests/010/hash.tl: Test case.

  * genman.txr: The hash-map identifier introduces a hash collision. We have to deal with that somehow now. (colli): We put the conflicting entries into a new hash called colli which maps them to an increment value. (hash-title): Increment the hash code h by the amount indicated in colli, if the title is found there.

  * txr.1: Documented.

  * stdlib/doc-syms.tl: Updated.
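  A sketch of the intended usage, assuming the (hash-map function sequence) argument order:

    (let ((h (hash-map square '(1 2 3))))
      (mtest [h 1] 1
             [h 2] 4
             [h 3] 9))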
* hash: support existing mutation+iteration semantics. (Kaz Kylheku, 2023-06-20; 1 file, -0/+33)

  The original chained hashing scheme makes certain guarantees in the situation when a hash table that is being iterated is also undergoing insertions or deletions. The original scheme meets these requirements simply, by putting a freeze against hash table growth while there are outstanding iterations. Chained hashing gracefully handles load factors above 1. Load factors above 1 are not possible under open addressing (and we don't even want to approach 1), but we would like to preserve the requirements.

  The requirements are:

  1. If an iterator has already visited an item, it will not see that item again, regardless of insertions which cause the table to be reorganized.

  2. It is not specified/required that an iterator will visit newly inserted items. It may visit some of those items, but not others.

  3. If an iterator has not yet visited an item, and that item is deleted, it will not see that item, regardless of any insertions that reorganize the table.

  In this commit, we implement a "table stack" scheme.

  1. When a table is resized due to insertions, and it is being iterated (h->usecount > 0), it pushes the existing small table onto a stack, the h->tblstack (table stack).

  2. Iterators take a hash table's current table and its size, and work with that snapshot of the table. If the original hash table grows, existing iterators work with the original table as it existed just before the reorganization. So after that they do not see any new insertions.

  3. Whenever the delete operation (hash_remove) finds the item and removes it from the current table, it also walks the table stack, searches for the item in every table in the stack and nulls it out. This search is oblivious to nil; it's a blind search that goes around the table starting at the first probe position, looking for the identical cons cell to take out. This algorithm ensures that iterators will not see a deleted item, unless they already visited it before the deletion, of course.

  * hash.h (struct hash_iter): New members, table and mask.

  * hash.c (struct hash): New member, tblstack. (hash_grow): We drop the vec argument and recreate it locally (not essential to this commit). If we find that the usecount is positive, we push the existing table onto the table stack. Otherwise, we take the opportunity to obliterate the table stack. (hash_insert): Drop the restriction that hash_grow is only called when the use count is zero. Adjust calls to hash_grow to drop the vec argument. (hash_remove): When an item is found and deleted, and the table is in use by iterators, walk the table stack and delete it from each previous table. Otherwise, if the table is not in use by iterators, obliterate the table stack. (hash_mark): Exit early also if there is a table stack, and mark that stack. (do_make_hash, make_similar_hash, copy_hash): Initialize table stack in new hash. (hash_iter_mark): Mark the iterator's table. This is likely not necessary since we also mark the hash table, which should have a pointer to that same table. That wouldn't be defensive programming, though. (hash_iter_init, us_hash_iter_init): Initialize table and mask. (hash_iter_next_impl, hash_iter_peek): These functions have to walk the table snapshot taken by the iterator, using the captured mask, and not the current table. (hash_reset): If the target table's use count drops to zero, obliterate its table stack. We add a missing setcheck here; this operation potentially stores a different hash into an existing iterator. It's not being done safely with regard to generational GC.

  * tests/010/hash.tl: New tests.
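  A sketch of requirement 3 in test form, using hash-begin/hash-next iteration (my example, not the committed tests):

    (let* ((h #H(() (1 a) (2 b) (3 c) (4 d)))
           (it (hash-begin h))
           (fst (hash-next it)))               ;; visit exactly one entry
      (each ((k (remql (car fst) '(1 2 3 4))))
        (remhash h k))                         ;; delete every not-yet-visited key
      (test (hash-next it) nil))               ;; deleted entries must not be seen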
* sort: move tests into tests/012. (Kaz Kylheku, 2023-05-02; 1 file, -45/+0)

  * tests/010/sort.tl: File moved to tests/012. The reason is that the tests in 010 run with the --gc-debug torture tests. That test case runs way too long under that test because of the testing of many permutations and whatnot.
* sort: support stable sorting via ssort and snsort. (Kaz Kylheku, 2023-05-02; 1 file, -0/+22)

  For array-like objects, these functions use an array-based merge sort, using an auxiliary array equal in size to the original array.

  To provide the auxiliary array, a new kind of very simple vector-like object is introduced into the gc module: the protected array. This looks like a raw dynamic C array of val type, returned as a val *. Under the hood, there is a heap object there, which makes the array traversable by the garbage collector. The whole point of this exercise is to make the new mergesort function safe even if the caller-supplied functions misbehave in such a way that the auxiliary array holds the only references to heap objects.

  * gc.c (struct prot_array): New struct. (prot_array_cls): New static variable. (gc_late_init): Register COBJ class, retaining in prot_array_cls. (prot_array_mark, prot_array_free): New static functions. (prot_array_ops): New static structure. (prot_array_alloc, prot_array_free): New functions.

  * gc.h (prot_array_alloc, prot_array_free): Declared.

  * lib.c (mergesort, ssort_vec): New static functions. (snsort, ssort): New functions.

  * lib.h (snsort, ssort): Declared.

  * tests/010/sort.tl: Cover ssort.

  * txr.1: Documented.

  * stdlib/doc-syms.tl: Updated.
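  A sketch of the stability guarantee, assuming ssort takes the same (sequence lessfun keyfun) arguments as sort, with : selecting the default less function:

    (test (ssort '((1 a) (2 b) (1 c) (2 d)) : car)
          '((1 a) (1 c) (2 b) (2 d)))   ;; items with equal keys keep their original order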
* sort: larger test. (Kaz Kylheku, 2023-05-01; 1 file, -0/+8)

  * tests/010/sort.tl: Add some test cases with a larger list. The exhaustive permutation tests are good but only go up to a relatively short size, where the median-of-three doesn't even kick in. We also cover choosing an alternative less function.
* sort: replace Lomuto partitioning with Hoare (Kaz Kylheku, 2023-05-01; 1 file, -0/+15)

  I'm seeing numbers showing about the same performance on a sorted vector of integers, and 21% faster on a vector of N random integers in the range [0, N). Also, this original algorithm handles well the case of an array consisting of a repeated value; the code we are replacing degrades to quadratic time.

  * lib.c (med_of_three, middle_pivot): We don't use the return value, so don't calculate and return one. (quicksort): Revise to Hoare: scanning from both ends of the array, exchanging elements.

  * tests/010/sort.tl: New file. We test sort with lists and vectors from length zero to eight, all permutations.
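  For reference, a Lisp-level sketch of the Hoare scheme described above; this is purely illustrative and is not the C code changed in lib.c:

    (defun hoare-partition (vec lo hi)    ;; partition vec[lo..hi], inclusive bounds
      (let ((pivot [vec (trunc (+ lo hi) 2)])
            (i (pred lo))
            (j (succ hi)))
        (while t
          (while (less [vec (inc i)] pivot))  ;; scan rightward from the left end
          (while (less pivot [vec (dec j)]))  ;; scan leftward from the right end
          (if (>= i j)
            (return-from hoare-partition j))  ;; indices crossed: j is the split point
          (swap [vec i] [vec j]))))

  Elements equal to the pivot can land on either side of the split, which is what keeps the repeated-value case from degenerating.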
* hash: new function, hash-props. (Kaz Kylheku, 2023-05-01; 1 file, -0/+7)

  We don't have a function in the hash table module which can create a populated hash table in one step without requiring the caller to create auxiliary lists. This new function fills that gap, albeit with some limitations.

  * hash.c (hash_props): New function. (hash_init): Register hash-props intrinsic.

  * tests/010/hash.tl: New tests.

  * txr.1: Documented.

  * stdlib/doc-syms.tl: Updated.
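  A sketch of the intended usage, assuming alternating key/value arguments:

    (let ((h (hash-props 'x 1 'y 2)))
      (mtest [h 'x] 1
             [h 'y] 2))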
* range/range*: tests. (Kaz Kylheku, 2023-03-27; 1 file, -0/+103)

  * tests/010/range.tl: New file.
* New function: group-map. (Kaz Kylheku, 2022-03-02; 1 file, -2/+2)

  * hash.c (group_map): New function. (hash_init): group-map intrinsic registered.

  * hash.h (group_map): Declared.

  * tests/010/hash.tl: New test case.

  * txr.1: Documented together with group-by. Extra paren removed from group-by example.
* hash: add tests for group-by, group-reduce. (Kaz Kylheku, 2022-03-02; 1 file, -0/+16)

  * tests/010/hash.tl: New tests.
* quasiquote: make new @,expr work in dot position. (Kaz Kylheku, 2022-01-18; 1 file, -0/+12)

  Bugfix: the newly introduced @,expr fails in the dotted position, because ^(a . @,expr) turns into (list 'a 'let ...).

  * eval.c (is_meta_unquote): New static function. (expand_qquote_rec): Replace the existing shape test with is_meta_unquote. We must also use this test in one more place: whenever the cdr of a list has the meta unquote shape, we must treat the result similarly to a dotted atom, by converting to append format.

  * tests/010/qquote.tl: Test cases to cover this.
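  A sketch of the case being fixed (my test, not the committed one):

    (let ((x 'b))
      (test ^(a . @,x) '(a . @b)))   ;; previously expanded to the malformed (list 'a 'let ...)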
* quasiquote: support @,expr hack. (Kaz Kylheku, 2022-01-18; 1 file, -0/+30)

  For better or worse, TXR Lisp has a dichotomy of representation whereby @<atom> produces sys:var syntax, whereas @<compound> produces sys:expr. This can cause an issue in backquoting. Suppose you want to use backquote to generate syntax like (a @b) where the b comes from a variable. The problem is that (let ((x 'b)) ^(a @,x)) doesn't do what you might expect: it produces (sys:expr b) rather than (sys:var b).

  This patch adds a hack into the quasiquote expander which causes it to generate code to do what you expect.

  Old behavior:

    1> (expand '^(a @,x))
    (list 'a (list 'sys:expr x))

  New behavior:

    1> (expand '^(a @,x))
    (list 'a (let ((#:g0012 x))
               (if (atom #:g0012)
                 (list 'sys:var #:g0012)
                 (list 'sys:expr #:g0012))))

  In other words, x will be evaluated, and then based on the type of the object which emerges, either sys:var or sys:expr syntax is generated.

  * eval.c (expand_qquote_rec): Implement the above hack. We are careful to only do this when this exact shape occurs in the syntax: (sys:expr (sys:unquote item)).

  * tests/010/qquote.tl: New file.

  * txr.1: Documented.
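  The same behavior in test form (my sketch; the committed tests are in tests/010/qquote.tl):

    (let ((x 'b) (y '(+ 2 2)))
      (mtest ^(a @,x) '(a @b)            ;; atom: sys:var syntax is generated
             ^(a @,y) '(a @(+ 2 2))))    ;; compound: sys:expr syntax is generated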
* json: add tests with multi-line literals. (Kaz Kylheku, 2022-01-10; 1 file, -0/+10)

  * tests/010/json.tl: New tests. These work. Oddly, I'm seeing an issue whereby typing multi-line #J expressions into the listener does not work.
* txr: do not ignore regex in positive match. (Kaz Kylheku, 2021-12-27; 1 file, -0/+11)

  * match.c (h_var): Refactor the logic here a bit. Without regard for whether the variable has a value, we dispatch the regex, fixed field and function cases. These handle the binding against the existing value. Then before all other cases, we check for the existing value and convert that to a literal text match. The effect of this is that now the regular expression is processed even if the variable has a value.

  * tests/010/span-var.txr: Last two test cases hardened a bit so they cannot fall through to a successful exit, if they invoke the wrong case. This is not related to this change. New test cases for regex span.

  * txr.1: Updated documentation and compatibility notes.
* txr: function span variable must match existing value. (Kaz Kylheku, 2021-12-27; 1 file, -0/+13)

  * match.c (h_var_compat): New function; verbatim copy of existing h_var prior to this commit. (h_var): If a variable has an existing binding, but is a function spanning match, do not substitute it with text. Handle it with the ordinary case, in which we now use dest_bind instead of cons. (v_var): Similarly, here, we must also use dest_bind, rather than always freshly binding the variable. (match_compat_fixup): For 272 compatibility, substitute h_var_compat for h_var in the horizontal directive table.

  * tests/010/span-var.txr: New test cases.

  * txr.1: Documentation updated and also improved overall. The behavior when a variable has an existing value is clarified for the regex and fixed field case. Also update and condense compat notes for 272.
* txr: allow variable to span vertical function. (Kaz Kylheku, 2021-12-26; 1 file, -0/+15)

  * match.c (v_var_compat, v_var): New static functions. (match_files): No longer recognize v_var specially; it is now handled via the vertical table. (dir_tables_init): Register a vertical sys:var directive also via the v_var function. (match_compat_fixup): New function.

  * txr.c (compat): Call match_compat_fixup.

  * tests/010/span-var.txr: New file.

  * txr.1: Documented.
* tree: new functions for priority queue operation. (Kaz Kylheku, 2021-12-18; 1 file, -0/+30)

  * tree.c (tree_min_node, tree_min, tree_del_min_node, tree_del_min): New functions. (tree_init): tree-min-node, tree-min, tree-del-min-node, tree-del-min: New intrinsics registered.

  * tree.h (tree_min_node, tree_min, tree_del_min_node, tree_del_min): Declared.

  * txr.1: Documented.

  * tests/010/tree.tl: New tests.

  * stdlib/doc-syms.tl: Updated.
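  A small sketch of the priority-queue usage these enable, assuming tree-min yields the least key and tree-del-min removes it (my example, not the committed tests):

    (let ((tr (tree '(3 1 2))))
      (test (tree-min tr) 1)
      (tree-del-min tr)
      (test (tree-min tr) 2))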
* tree: bugfix wrong tree-count. (Kaz Kylheku, 2021-12-18; 1 file, -0/+10)

  When duplicate keys are inserted in the default way, with replacement, the tree size must not be incremented.

  * tree.c (tr_insert): Increment tr->size and maintain tr->max_size here. In the case of replacing an existing node, do not touch the count. (tree_insert_node): Remove the unconditional size increment.

  * tests/010/tree.tl: Add test cases covering duplicate insertion and tree-count.
* tree: support for duplicate keys. (Kaz Kylheku, 2021-12-17; 1 file, -0/+33)

  * tree.c (tr_insert): New argument for allowing duplicates. If it is true, it suppresses the case of replacing a node, causing the logic to fall through to traversing right, so the duplicate key effectively looks like it is greater than the existing duplicates, and gets inserted as the rightmost duplicate. (tr_do_delete_specific, tr_delete_specific): New static functions. (tree_insert_node): New parameter, passed to tr_insert. (tree_insert): New parameter, passed to tree_insert_node. (tree_delete_specific_node): New function. (tree): New parameter to allow duplicate keys in the elements sequence. (tree_construct): Pass t to tree to allow duplicate elements. (tree_init): Update registrations of tree, tree-insert and tree-insert-node. Register the tree-delete-specific-node function.

  * tree.h (tree, tree_insert_node, tree_insert): Declarations updated. (tree_delete_specific_node): Declared.

  * lib.c (seq): Pass t argument to tree_insert, allowing duplicates.

  * parser.c (circ_backpatch): Likewise.

  * parser.y (tree): Pass t to the new argument of tree, so duplicates are preserved in the element list of the #T literal.

  * y.tab.c.shipped: Updated.

  * tests/010/tree.tl: Test cases for duplicate keys.

  * txr.1: Documented.

  * stdlib/doc-syms.tl: Updated.
* tree-count: new function. (Kaz Kylheku, 2021-12-17; 1 file, -2/+10)

  * tree.c (tree_count): New function. (tree_init): tree-count intrinsic registered.

  * tree.h (tree_count): Declared.

  * lib.c (length): Support search tree argument via tree_count.

  * tests/010/tree.tl: Test cases for tree-count, indirectly via len.

  * txr.1: Documented.
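  Sketch of the indirect testing through len mentioned above (my example):

    (test (len (tree '(1 2 3))) 3)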
* New function: delcons. (Kaz Kylheku, 2021-09-02; 1 file, -0/+14)

  * eval.c (eval_init): Register delcons intrinsic.

  * lib.[ch] (delcons): New function.

  * tests/010/cons.tl: New file.

  * txr.1: Documented.

  * stdlib/doc-syms.tl: Updated.
* txr: @(eof) takes argument for binding termination status. (Kaz Kylheku, 2021-08-05; 2 files, -0/+5)

  We extend the matching context structures to keep track of the underlying stream from which lines are being taken via the lazy list. Then the implementation of the @(eof) directive, when it hits the eof condition, can use this stream to gain access to the termination status.

  * match.c (match_line_ctx, match_files_ctx): New member, stream. (ml_all): Take stream argument and initialize new member. (h_call, do_match_line): Pass stream argument to h_call. (mf_all, mf_file_data): Take stream argument and initialize new member. (mf_from_ml): Propagate stream from line context to file context. (freeform_prepare, v_next_impl, match_filter, match_fun, extract): Pass stream argument where now needed. (v_eof): Implement termination status binding via the stream stored in the context. (open_data_source): Store stream in match files context.

  * tests/010/eof-status.txr: New file.

  * tests/010/eof-status.expected: New file.

  * Makefile (tst/tests/010/eof-status.ok): -B option for new test.

  * txr.1: Documented eof directive, argument and all.

  * stdlib/doc-syms.tl: Updated.
* parser: allow trailing commas in json, via opt-in flag. (Kaz Kylheku, 2021-07-29; 1 file, -0/+17)

  * parser.c (read_bad_json_s): New symbol variable. (parser_common_init): Propagate value of *read-bad-json* into the read_bad_json flag in the parser structure. (parser_init): Initialize read_bad_json_s and register the *read-bad-json* dynamic variable.

  * parser.h (struct parser): New member, read_bad_json. (read_bad_json_s): Declared.

  * parser.y (json_val): Support an opt_comma symbol just before the closing bracket or brace. (opt_comma): New nonterminal symbol. Recognizes ',' or nothing. Error is flagged if ',' is recognized, and *read-bad-json* is nil.

  * y.tab.c.shipped: Updated.

  * tests/010/json.tl: New tests.

  * txr.1: Documented.
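  Sketch of the opt-in behavior (the JSON text is my own example):

    (test (ignerr (get-json "[1, 2,]")) nil)   ;; trailing comma rejected by default
    (let ((*read-bad-json* t))
      (get-json "[1, 2,]"))                    ;; accepted when the variable is true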
* tests: guix fixes. (Kaz Kylheku, 2021-07-13; 1 file, -1/+1)

  * tests/002/query-1.txr: Skip the test if an executable /bin/sh doesn't exist, rather than for the bogus reasons.

  * tests/010/json.tl: Change the condition for the command-put-json tests: not whether cat is found in the search path but whether /bin/sh exists and is executable.

  * tests/017/realpath.tl: Also quit if /usr/bin doesn't exist.

  * tests/018/path-test.tl: Exit successfully if /bin/sh does not exist. Revert the earlier change.

  * tests/018/process.tl: Quit if no executable /bin/sh exists.
* tests: json: skip tests relying on cat, if missing. (Kaz Kylheku, 2021-07-12; 1 file, -7/+9)

  * tests/010/json.tl: Skip several test cases which rely on the cat utility for testing command-put-json, if the cat utility is not found in the search path. Reported and investigated by Paul A. Patience.
* tests: json: fix accidentally excluded tests. (Kaz Kylheku, 2021-07-12; 1 file, -2/+2)

  * tests/010/json.tl: Fix several tests being excluded from the (mtest ...) form to which they are expected to belong, one of them having an extra quote in the expected value, too.
* read/get-json: reject trailing junk in string input. (Kaz Kylheku, 2021-06-20; 1 file, -1/+21)

  * parser.c (lisp_parse_impl): If parsing from string, check for trailing junk and diagnose. JSON parsing doesn't use lookahead because it doesn't have a.b syntax, so the recent_tok gives the last token that actually went into the syntax, and not a lookahead token. So in the case of JSON, we call yylex to see if there is any trailing token.

  * tests/010/json.tl: Extend get-json tests to more kinds of objects, and then replicate with trailing whitespace and trailing junk to provide coverage for these cases.

  * tests/012/parse.t: Slew of new read tests and iread also.

  * txr.1: Documented.
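  Sketch of the distinction being tested (my example; it assumes the usual mapping of JSON true to t):

    (mtest (get-json "true ") t                  ;; trailing whitespace is fine
           (ignerr (get-json "true junk")) nil)  ;; trailing junk is now diagnosed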
* lib: new function, fill-vec. (Kaz Kylheku, 2021-06-08; 1 file, -0/+32)

  * eval.c (eval_init): Register fill-vec intrinsic.

  * lib.c (fill_vec): New function.

  * lib.h (fill_vec): Declared.

  * tests/010/vec.tl: New file.

  * txr.1: Documented.

  * share/txr/stdlib/doc-syms.tl: Updated.
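  Sketch of the intended usage, assuming a (fill-vec vec item) style interface that mutates the vector:

    (let ((v (vector 3 0)))   ;; #(0 0 0)
      (fill-vec v 7)
      (test v #(7 7 7)))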
* json: improve escaping for script tags. (Kaz Kylheku, 2021-06-03; 1 file, -2/+12)

  * lib.c (out_json_str): Strengthen the test for escaping the forward slash. It has to occur in the sequence </script rather than just </. Recognize <!-- and --> in the string, and encode them.

  * tests/010/json.tl: Cover this area with some tests.

  * txr.1: Documented.
* json: fix two test cases for Windows. (Kaz Kylheku, 2021-06-02; 1 file, -2/+5)

  * tests/010/json.tl: On Windows, characters are limited to the BMP range 0 to #\xFFFF. The character escape \x10437 is out of range, and so throws an error, simply from that syntax being read. The two test cases which use this character are clumped into their own test form, which is executed conditionally on wide characters being more than two bytes. Because the expression is still parsed on Windows, we read the troublesome character from a string at run-time, and interpolate it.
* json: wrap up: test cases, fixes, tweaks. (Kaz Kylheku, 2021-05-31; 1 file, -0/+124)

  * share/txr/stdlib/getput.tl (get-jsons): If the s parameter is a string, convert it to a byte input stream so that. (put-jsons): Add missing t return value. (file-put-json, file-append-json, file-put-jsons, file-append-jsons, command-put-json, command-put-jsons): Add missing object argument to all these functions, and a missing "w" open-file mode to several of them.

  * stream.c (mkstemp_wrap): Calculate the length of suff, the defaulted argument, not the raw suffix argument.

  * tests/010/json.tl: New file, providing tests that touch every area of the new JSON functionality.

  * tests/common.tl (mtest, with-temp-file): New macros.

  * txr.1: Document that get-jsons takes a source which could be a string.
* tree: let tree-iter be iterable via generic iteration. (Kaz Kylheku, 2021-05-12; 1 file, -0/+5)

  * lib.c (seq_iter_init_with_info): Recognize tree_iter object, and treat using tree iterator function.

  * tests/010/tree.tl: Test case for tree subrange iteration with collect-each.

  * txr.1: Updated.
* tree: streamline iteration; provide high limit. (Kaz Kylheku, 2021-05-11; 1 file, -7/+7)

  Getting rid of tree-begin-at and tree-reset-at. Now tree-begin takes two optional parameters, for specifying the low and high range.

  * tree.c (struct tree_diter): New members, tree and highkey. We need tree due to requiring access to the less function. If the iterator has no highkey, the iterator itself is stored in that member to indicate this. (tree_iter_mark): Mark the tree and highkey. (tree_begin): Take optional lowkey and highkey arguments, initializing the iterator accordingly. (tree_begin_at): Function removed. (copy_tree_iter, replace_tree_iter): Copy tree and highkey members. The latter requires special handling due to the funny convention for indicating highkey absence. (tree_reset): Take optional lowkey and highkey arguments, configuring these in the iterator being reset. (tree_reset_at): Function removed. (tree_next, tree_peek): Implement highkey semantics. (sub_tree): Simplified: from and to arguments are just passed through to tree_begin, and there is no need for a separate loop which enforces the upper limit, that now being handled by the iterator itself. (tree_init): Update registrations of tree-begin and tree-reset; remove tree-begin-at and tree-reset-at intrinsics.

  * tree.h (tree_begin_at, tree_reset_at): Declarations removed. (tree_begin, tree_reset): Declarations updated.

  * lib.c (seq_iter_rewind, seq_iter_init_with_info, where, populate_obj_hash): Default the new optional arguments in tree_begin and tree_reset calls.

  * parser.c (circ_backpatch): Likewise.

  * tests/010/tree.tl: Affected cases updated.

  * txr.1: Documentation updated.

  * share/txr/stdlib/doc-syms.tl: Regenerated.
* tree: support indexing and range extraction. (Kaz Kylheku, 2021-05-11; 1 file, -0/+18)

  * lib.c (do_generic_funcall): Support tree object invocation with one or two arguments via sub and ref. (sub): Implement for trees via sub_tree. (ref): Implement for trees via tree_lookup.

  * tree.c (sub_tree): New function. (tree_init): Register sub-tree intrinsic.

  * tree.h (sub_tree): Declared.

  * tests/010/tree.tl: New tests.

  * txr.1: Documented: DWIM bracket syntax on trees, sub and ref support for trees, sub-tree function.

  * share/txr/stdlib/doc-syms.tl: Regenerated.
* tree: replace-tree-iter function. (Kaz Kylheku, 2021-05-11; 1 file, -0/+11)

  * tree.c (replace_tree_iter): New function. (tree_init): Register replace-tree-iter intrinsic.

  * tree.h (replace_tree_iter): Declared.

  * share/txr/stdlib/doc-syms.tl: Updated.

  * txr.1: Documented.

  * tests/010/tree.tl: New test case.
* tree: copy-tree-iter function. (Kaz Kylheku, 2021-05-10; 1 file, -0/+9)

  * lib.c (copy): Handle tree_iter_s via copy_tree_iter.

  * tree.c (copy_tree_iter): New function. (tree_init): copy-tree-iter intrinsic registered.

  * tree.h (copy_tree_iter): Declared.

  * tests/010/tree.tl: New test case.

  * txr.1: Documented.

  * share/txr/stdlib/doc-syms.tl: Updated.
* diff/isec: reset hash/tree iter instead of making new. (Kaz Kylheku, 2021-05-10; 2 files, -0/+11)

  * lib.c (seq_iter_rewind): Use hash_reset and tree_reset to rewind the existing iterator rather than allocating a new one.

  * tests/010/hash.tl: New file, covering uni, diff and isec for hash tables.

  * tests/010/tree.tl: New tests.
* tree: new tree-peek function. (Kaz Kylheku, 2021-05-09; 1 file, -0/+8)

  * tree.c (tn_peek_next): New static function. (tree_peek): New function. (tree_init): Register tree-peek intrinsic.

  * tree.h (tree_peek): Declared.

  * txr.1: Documented.

  * tests/010/tree.tl: Work tree-peek into existing test case.

  * share/txr/stdlib/doc-syms.tl: Updated.
* tree: new make_similar_tree function. (Kaz Kylheku, 2021-05-09; 1 file, -0/+8)

  * tree.c (make_similar_tree): New function. (tree_init): Register make-similar-tree intrinsic.

  * tree.h (make_similar_tree): Declared.

  * tests/010/tree.tl: New tests.

  * txr.1: Documented.
* tree: new functions for resetting iterator. (Kaz Kylheku, 2021-04-30; 1 file, -0/+15)

  * tree.c (tree_reset, tree_reset_at): New functions. (tree_init): tree-reset and tree-reset-at intrinsics registered.

  * tree.h (tree_reset, tree_reset_at): Declared.

  * tests/010/tree.tl: New tests.

  * txr.1: Documented.

  * share/txr/stdlib/doc-syms.tl: Updated.
* tree: use rlist in test case. (Kaz Kylheku, 2021-04-30; 1 file, -1/+1)

  * tests/010/tree.tl: Use rlist to express discontinuous range instead of appending ranges.
* tree: new tree-begin-at function. (Kaz Kylheku, 2021-04-29; 1 file, -1/+28)

  * tree.c (enum tree_iter_state): New iterator state tr_find_low_prepared dedicated to the tree-begin-at traversal. This state indicates that tree-next should visit the starting node that it is given, and then after that, treat anything to the left of it as having been visited. In the other states, tree-next does not visit the node it is given but uses it as the starting point to find the next node. (tn_find_next): Bugfix here: when navigating the right link, the function neglected to add the node to the path. But the logic for backtracking up the path expects this: it checks whether the node from the path is the parent of a right child. Somehow this didn't cause a problem for full traversals with tree-begin; at least the existing test cases don't expose an issue. It caused a problem for tree-begin-at, though. (tn_find_low): New static function. This finds the low-key node in the tree, priming the iterator object with the correct state and path content to continue the traversal from that node on. We need the tr_find_low_prepared state in the iterator in order to visit the low node itself that was found. (tree_begin_at): New function. (tree_init): Register tree-begin-at intrinsic.

  * tree.h (tree_begin_at): Declared.

  * tests/010/tree.tl: New test cases for tree-begin-at.

  * txr.1: Documented.

  * share/txr/stdlib/doc-syms.tl: Updated.