Commit ed72c382 authored by Dennis Müller

removed duplicate text

parent ce190872
@@ -9,9 +9,6 @@ In addition to the above optimizations in the algorithm, we can normalize the th
The common logical framework used for all the libraries at our disposal -- namely LF and extensions thereof -- makes it easy to systematically normalize theories built on various logical foundations. On the basis of the above features, we currently use the following approaches to normalizing theories:
\begin{itemize}
\item Free variables in a term are replaced by holes.
\item For foundations that use product types, we curry function types $(A_1 \times\ldots\times A_n)\to B$ to $A_1 \to \ldots \to A_n \to B$. We treat lambda-expressions and applications accordingly.
For example: \[f = \lambda (n,m) : \mathbb N \times \mathbb N .\; n + m\text{ becomes } f = \lambda n : \mathbb N.\; \lambda m : \mathbb N .\; n + m\] \[f(\langle a, b\rangle)\text{ becomes }f(a,b).\]
\item Since judgments -- in our imports -- correspond to constants of type $\vdash P$ for some proposition $P$, we can use the Curry-Howard correspondence to equate the type $\vdash (P \Rightarrow Q)$ with the function type $(\vdash P) \to (\vdash Q)$, as well as the judgment $\vdash \forall x : A.\;P$ with the function type $(x:A) \to (\vdash P)$.
Since different systems prefer one of these two styles of formalization over the other, we replace each occurrence of the former by the latter.
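As an illustrative example (with $P$ and $Q$ standing for arbitrary unary predicates on a type $A$), a constant \[c : \vdash \forall x : A.\; (P\,x \Rightarrow Q\,x)\] would thus be normalized to \[c : (x : A) \to (\vdash P\,x) \to (\vdash Q\,x).\]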