Module docstring
{"# Theorems about List operations.
For each List operation, we would like theorems describing the following, when relevant:
* if it is a \"convenience\" function, a @[simp] lemma reducing it to more basic operations
(e.g. List.partition_eq_filter_filter), and otherwise:
* any special cases of equational lemmas that require additional hypotheses
* lemmas for special cases of the arguments (e.g. List.map_id)
* the length of the result (f L).length
* the i-th element, described via (f L)[i] and/or (f L)[i]? (these should typically be @[simp])
* consequences for f L of the fact x ∈ L or x ∉ L
* conditions characterising x ∈ f L (often but not always @[simp])
* injectivity statements, or congruence statements of the form p L M → f L = f M.
* conditions characterising the result, i.e. of the form f L = M ↔ p M for some predicate p,
along with special cases of M (e.g. List.append_eq_nil : L ++ M = [] ↔ L = [] ∧ M = [])
* negative characterisations are also useful, e.g. List.cons_ne_nil
* interactions with all previously described List operations where possible
(some of these should be @[simp], particularly if the result can be described by a single operation)
* characterising (∀ (i) (_ : i ∈ f L), P i), for some predicate P
Of course for any individual operation, not all of these will be relevant or helpful, so some judgement is required.
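As a concrete illustration, here is the shape this checklist takes for List.map; this is a minimal sketch, assuming the relevant lemmas (length_map, getElem?_map, mem_map) are available as @[simp] lemmas, not an exhaustive list.
```lean
-- A sketch of the checklist above, instantiated for `List.map`.
-- The `simp` calls assume the corresponding library lemmas
-- (`length_map`, `getElem?_map`, `mem_map`) are marked `@[simp]`.
example {α β : Type} (f : α → β) (l : List α) :
    (l.map f).length = l.length := by simp
example {α β : Type} (f : α → β) (l : List α) (i : Nat) :
    (l.map f)[i]? = (l[i]?).map f := by simp
example {α β : Type} (f : α → β) (l : List α) (b : β) :
    b ∈ l.map f ↔ ∃ a, a ∈ l ∧ f a = b := by simp
```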
General principles for simp normal forms for List operations:
* Conversion operations (e.g. toArray or length) should be moved inwards aggressively,
to make the conversion effective.
* Similarly, operations which work on elements should be moved inwards in preference to
\"structural\" operations on the list, e.g. we prefer to simplify
List.map f (L ++ M) ~> (List.map f L) ++ (List.map f M),
List.map f L.reverse ~> (List.map f L).reverse, and
List.map f (L.take n) ~> (List.map f L).take n.
* Arithmetic operations are \"light\", so e.g. we prefer to simplify drop i (drop j L) to drop (i + j) L,
rather than the other way round.
* Function compositions are \"light\", so we prefer to simplify (L.map f).map g to L.map (g ∘ f).
* We try to avoid non-linear left hand sides (i.e. with subexpressions appearing multiple times),
but this is only a weak preference.
* Generally, we prefer that the right hand side does not introduce duplication;
however, duplication of higher order arguments (functions, predicates, etc.) is allowed,
as we expect to be able to compute these once they reach ground terms.
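A minimal sketch of a few of these preferred directions, assuming the corresponding @[simp] lemmas (map_append, map_reverse, map_map) are present:
```lean
-- The preferred simp directions described above, checked by `simp`.
example {α β : Type} (f : α → β) (L M : List α) :
    (L ++ M).map f = L.map f ++ M.map f := by simp
example {α β : Type} (f : α → β) (L : List α) :
    L.reverse.map f = (L.map f).reverse := by simp
example {α β γ : Type} (f : α → β) (g : β → γ) (L : List α) :
    (L.map f).map g = L.map (g ∘ f) := by simp
```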
See also
* Init.Data.List.Attach for definitions and lemmas about List.attach and List.pmap.
* Init.Data.List.Count for lemmas about List.countP and List.count.
* Init.Data.List.Erase for lemmas about List.eraseP and List.erase.
* Init.Data.List.Find for lemmas about List.find?, List.findSome?, List.findIdx,
List.findIdx?, and List.indexOf.
* Init.Data.List.MinMax for lemmas about List.min? and List.max?.
* Init.Data.List.Pairwise for lemmas about List.Pairwise and List.Nodup.
* Init.Data.List.Sublist for lemmas about List.Subset, List.Sublist, List.IsPrefix,
List.IsSuffix, and List.IsInfix.
* Init.Data.List.TakeDrop for additional lemmas about List.take and List.drop.
* Init.Data.List.Zip for lemmas about List.zip, List.zipWith, List.zipWithAll,
and List.unzip.
Further results, which first require developing additional automation around Nat, appear in
* Init.Data.List.Nat.Basic: miscellaneous lemmas
* Init.Data.List.Nat.Range: List.range and List.enum
* Init.Data.List.Nat.TakeDrop: List.take and List.drop
Also
* Init.Data.List.Monadic for additional lemmas about List.mapM and List.forM.
","## Preliminaries ","### nil ","### length ","### cons ","## L[i] and L[i]? ","### get and get?.
We simplify l.get i to l[i.1]'i.2 and l.get? i to l[i]?.
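A minimal sketch of this normalisation (assuming the legacy get/get? API and its @[simp] equations are still available):
```lean
-- `get` and `get?` are simplified to the bracket notations.
example (l : List Nat) (i : Fin l.length) : l.get i = l[i.1]'i.2 := by simp
example (l : List Nat) (i : Nat) : l.get? i = l[i]? := by simp
```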
","### getElem!
We simplify l[i]! to (l[i]?).getD default.
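A minimal sketch, assuming the corresponding @[simp] lemma is present:
```lean
-- `getElem!` is simplified away in favour of `getElem?`.
example (l : List Nat) (i : Nat) : l[i]! = (l[i]?).getD default := by simp
```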
","### getElem? and getElem ","### getD
We simplify away getD, replacing getD l n a with (l[n]?).getD a.
Because of this, there is only minimal API for getD.
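A minimal sketch, assuming the @[simp] lemma described above:
```lean
-- `getD` is simplified away in favour of `getElem?`.
example (l : List Nat) (n a : Nat) : l.getD n a = (l[n]?).getD a := by simp
```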
","### get!
We simplify l.get! i to l[i]!.
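A minimal sketch, assuming the corresponding @[simp] lemma is present:
```lean
-- `get!` is simplified to `getElem!` (which in turn simplifies via `getElem?`).
example (l : List Nat) (i : Nat) : l.get! i = l[i]! := by simp
```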
","### mem ","### isEmpty ","### any / all ","### set ","### BEq ","### isEqv ","### getLast ","### getLast? ","### getLast! ","## Head and tail ","### head ","### headD ","### tailD ","### tail ","## Basic operations ","### map ","### filter ","### filterMap ","### append ","### concat
Note that concat_eq_append is a @[simp] lemma, so concat should usually not appear in goals.
As such there's no need for a thorough set of lemmas describing concat.
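For reference, a minimal sketch of the normalisation (assuming concat_eq_append is @[simp] as stated above):
```lean
-- `concat` is rewritten to an append with a singleton.
example (l : List Nat) (a : Nat) : l.concat a = l ++ [a] := by simp
```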
","### flatten ","### flatMap ","### replicate ","### reverse ","### foldlM and foldrM ","### foldl and foldr ","#### Further results about getLast and getLast? ","## Additional operations ","### leftpad ","## List membership ","### elem / contains ","## Sublists ","### partition
Because we immediately simplify partition into two filters for verification purposes,
we do not separately develop much theory about it.
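For example, the first component of a partition is just a filter; a minimal sketch, assuming partition_eq_filter_filter fires as a @[simp] lemma:
```lean
-- After `simp`, `partition` is replaced by two `filter`s; the second component
-- is analogously the `filter` of the negated predicate.
example (p : Nat → Bool) (l : List Nat) : (l.partition p).1 = l.filter p := by simp
```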
","### dropLast
dropLast is the specification for Array.pop, so theorems about List.dropLast
are often used for theorems about Array.pop.
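A minimal sketch of this relationship, assuming the usual @[simp] lemma relating Array.pop and List.dropLast via toList is available:
```lean
-- `Array.pop` is modelled by `List.dropLast` on the underlying list.
example (a : Array Nat) : a.pop.toList = a.toList.dropLast := by simp
```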
","### splitAt
We don't provide any API for splitAt, beyond the @[simp] lemma
splitAt n l = (l.take n, l.drop n),
which is proved in Init.Data.List.TakeDrop.
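A minimal sketch of what that lemma lets simp discharge:
```lean
-- `splitAt` reduces to `take` and `drop`.
example (n : Nat) (l : List Nat) : l.splitAt n = (l.take n, l.drop n) := by simp
```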
","## Manipulating elements ","### replace ","### insert ","## Logic ","### any / all ","### Legacy lemmas about get, get?, and get!.
Hopefully these should not be needed, in favour of lemmas about xs[i], xs[i]?, and xs[i]!,
to which these operations simplify.
We may consider deprecating or downstreaming these lemmas. ","### Deprecations "}