There is, however, a more general argument against reverence, whether for the Greeks or for anyone else. In studying a philosopher, the right attitude is neither reverence nor contempt, but first a kind of hypothetical sympathy, until it is possible to know what it feels like to believe in his theories, and only then a revival of the critical attitude, which should resemble, as far as possible, the state of mind of a person abandoning opinions which he has hitherto held. Contempt interferes with the first part of this process, and reverence with the second. Two things are to be remembered: that a man whose opinions and theories are worth studying may be presumed to have had some intelligence, but that no man is likely to have arrived at complete and final truth on any subject whatever. When an intelligent man expresses a view which seems to us obviously absurd, we should not attempt to prove that it is somehow true, but we should try to understand how it ever came to seem true. This exercise of historical and psychological imagination at once enlarges the scope of our thinking, and helps us to realize how foolish many of our own cherished prejudices will seem to an age which has a different temper of mind.
-- Page 39, Chapter 4, "A History of Western Philosophy" by Bertrand Russell
Cogito Ergo Sum
Saturday, April 19, 2014
Weekly Reading: Cosmology in Heraclitus, Parmenides, and Empedocles
Reading through "A History of Western Philosophy", I have reached the part on the ancient Greek philosophers.
Heraclitus (Ἡράκλειτος) talked about the world from the perspective that everything is changing all the time. He viewed strife as a power of change, not necessarily an evil, and thus war "is the father of all and king of all". As for the construction of the cosmos, he believed that fire is the primordial element, out of which everything comes into being.
Parmenides (Παρμενίδης), by contrast, stressed that the cosmos is in essence perpetual and unchanging. He took the words of human language as an instance of this, saying that the name of a person is always associated with him, whether before, during, or after his life. Bertrand Russell, the author of the book, attacks this theory by pointing out that for those who never met the person, what comes to their minds is an image built from what they have read or heard about him, which is conspicuously not identical to what he really is in the minds of his friends. Still, Parmenides' theory is partially echoed by modern physics, in that scientists keep finding the unchanging parts of objects at smaller and smaller scales, from the atom to the proton to the quark. Although this search has not yet found a final, unchangeable kernel, it is widely believed that such ultimate constituents must exist.
Finally, Empedocles (Ἐμπεδοκλῆς) combined the cosmologies of the previous philosophers into a compromise. In his theory, the cosmos is made of four elements -- fire, water, air, and earth. There are also two powers, Love and Strife, responsible for the construction and destruction of things respectively. The two kinds of processes happen in turn and form an infinite cycle. Thus all compound things are temporary and destructible, while only the four elements remain forever. According to Empedocles, there was a golden age when the power of Love totally superseded that of Strife; people of that era worshipped only the Cyprian Aphrodite, the goddess of Love.
Weekly Reading: Term Rewriting, Formal Logic System
My current reading goal is to review and finish the book "Term Rewriting and All That" by Franz Baader and Tobias Nipkow as soon as possible. This week I covered the parts on termination, confluence, and completion. The first two together determine whether a term rewriting system (TRS) is convergent, which is what one means by a well-behaved TRS, while a completion procedure tries to compute a convergent TRS from a given set of identities by applying a number of inference rules, whenever such a TRS exists.
In general, both termination and confluence of an arbitrary TRS are undecidable. So in the chapter on termination, the book introduces the concept of a reduction order -- a strict order that is well-founded, compatible with Σ-operations, and closed under substitutions -- together with several widely used orders of this kind defined via interpretation methods. Note that instead of proving well-foundedness directly, one can show that the order is a simplification order, which is usually much easier, since every simplification order is well-founded. Three classical orders are introduced here: polynomial orders, the lexicographic path order (LPO), and the famous Knuth-Bendix order (KBO). A polynomial order requires that the interpretation of every function symbol in Σ be monotone, in order to satisfy the compatibility property of reduction orders. It is easy to define, but it can only prove termination of TRSs whose derivation lengths stay within primitive recursive bounds, so it fails on systems such as one computing the Ackermann function. Both LPO and KBO can handle such systems; the comparison is decidable, though the associated decision problem is NP-complete. LPO and KBO are both defined recursively, but KBO additionally uses a weight function and the number of occurrences of each variable in the compared terms. A minimal executable sketch of LPO follows.
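To make the recursive definition concrete, here is a small sketch of LPO in Haskell. This is my own illustration, not code from the book; the Term type and the encoding of the precedence as a Boolean function are assumptions of the sketch.

```haskell
-- Terms over a signature: variables and function applications.
data Term = V String | F String [Term]
  deriving (Eq, Show)

-- Does variable x occur in term t?
occurs :: String -> Term -> Bool
occurs x (V y)    = x == y
occurs x (F _ ts) = any (occurs x) ts

-- lpo prec s t decides s >_lpo t, where prec f g means f > g
-- in a strict precedence on function symbols.
lpo :: (String -> String -> Bool) -> Term -> Term -> Bool
lpo _    s (V x) = s /= V x && occurs x s   -- s > x iff x occurs in s and s /= x
lpo _    (V _) _ = False                    -- a variable dominates no non-variable
lpo prec s@(F f ss) t@(F g ts)
  | any (\si -> si == t || lpo prec si t) ss = True   -- some argument s_i >= t
  | prec f g  = all (lpo prec s) ts                   -- f > g and s > every t_j
  | f == g    = all (lpo prec s) ts && lexGt ss ts    -- equal heads: lexicographic step
  | otherwise = False
  where
    lexGt (u:us) (v:vs) | u == v    = lexGt us vs
                        | otherwise = lpo prec u v
    lexGt _ _ = False

-- Example: with precedence f > g we get f(f(x)) >_lpo g(x),
-- so the rule f(f(x)) -> g(x) terminates under this LPO.
main :: IO ()
main = print (lpo (\f g -> f == "f" && g == "g")
                  (F "f" [F "f" [V "x"]])
                  (F "g" [V "x"]))          -- prints True
```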
For confluence, the key notion is the set of critical pairs of the rules R. If the TRS is terminating, then by Newman's Lemma it is confluent iff it is locally confluent; so in this case, if all critical pairs of R are joinable, the TRS is confluent. More generally, for a TRS that is not necessarily terminating, one can instead show that it is left-linear (i.e. no variable occurs twice in the left-hand side l of any rule l -> r in R) and that all its critical pairs are parallel closed; this is Corollary 6.4.9 of the Parallel Closure Theorem (Theorem 6.4.8). A small worked example follows.
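As a concrete illustration (my own example, not one from the book): for R = {f(f(x)) -> g(x)}, the left-hand side overlaps itself at position 1 with mgu {x -> f(y)}, producing the peak f(f(f(y))), which rewrites to g(f(y)) at the root and to f(g(y)) at position 1. The critical pair <g(f(y)), f(g(y))> consists of two distinct normal forms, so it is not joinable, and this TRS is not locally confluent.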
The chapter on completion procedures first introduces a basic completion procedure, which is correct as a semidecision procedure but quite inefficient. An improved completion procedure is then defined by a set of inference rules -- DEDUCE, ORIENT, DELETE, SIMPLIFY-IDENTITY, R-SIMPLIFY-RULE, L-SIMPLIFY-RULE -- many of which are quite similar to the rules in the inference-based procedure for unification problems. The main idea is to move the identities of E, together with the critical pairs computed from R, into R, simplifying them under R and orienting every resulting rule according to the reduction order >. The procedure iterates until E becomes empty, which means that all critical pairs are joinable, and it then terminates SUCCESSFULLY. It terminates with FAILURE when some identity (s, t) cannot be oriented, i.e. neither s -> t nor t -> s is compatible with the reduction order >; if E never becomes empty, the run goes on forever instead. A tool called a proof order is used to prove the correctness of the improved completion procedure under the assumption that the run is fair. After that, the famous Huet's completion procedure is introduced, which works in a similar way and is correct but not fair. A short worked run of completion follows.
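Continuing the example above (again my own toy run, not one from the book): run completion on E = {f(f(x)) ≈ g(x)} with an LPO in which f > g. ORIENT turns the identity into the rule f(f(x)) -> g(x); DEDUCE adds the critical pair <g(f(x)), f(g(x))> to E; ORIENT turns it into the rule f(g(x)) -> g(f(x)), since f(g(x)) >_lpo g(f(x)) when f > g. All remaining critical pairs are joinable -- e.g. f(f(g(y))) rewrites to g(g(y)) directly at the root, and also via f(g(f(y))) -> g(f(f(y))) -> g(g(y)) -- so E becomes empty and the procedure terminates SUCCESSFULLY with the convergent system R = {f(f(x)) -> g(x), f(g(x)) -> g(f(x))}.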
Tuesday, January 28, 2014
Minimal Comprehensive Groebner Bases
Goals:
1. Decide if the transitivity of non-essentiality is necessary;
2. Prove the generated Groebner basis is minimal, faithful and comprehensive;
3. Extend it to general comprehensive Groebner systems;
4. Try to define the Canonical Minimal Comprehensive Groebner basis.
Done:
1. We do need the transitivity of non-essentiality to guarantee the comprehensiveness of the generated Groebner basis;
2. Faithfulness and comprehensiveness are trivial to prove; minimality can be proved as well;
3. Extension: replace "LPP_X is x" by "LPP_X divides x".
Thursday, June 13, 2013
Review of Decision Procedures for Linear Arithmetic
This post reviews topics that de Moura covered at SSFT 2013 (the Summer School on Formal Techniques).
1. Here is my implementation of Sturm's Theorem for the univariate polynomial case:
The language I chose is Haskell.
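A minimal sketch of what such an implementation can look like (my own reconstruction, not necessarily the original snippet): build the Sturm sequence p, p', -rem(p, p'), ... by repeated polynomial remaindering, then count sign changes at the two endpoints.

```haskell
-- Polynomials over Q, coefficients from highest degree to lowest:
-- [1, 0, -2] represents x^2 - 2.
type Poly = [Rational]

-- Drop leading zero coefficients.
norm :: Poly -> Poly
norm = dropWhile (== 0)

-- Formal derivative.
deriv :: Poly -> Poly
deriv p = case norm p of
  [] -> []
  q  -> let n = length q - 1
        in zipWith (\k c -> fromIntegral k * c) [n, n - 1 .. 1] (init q)

-- Remainder of polynomial division (divisor must be nonzero).
polyRem :: Poly -> Poly -> Poly
polyRem p q = go (norm p)
  where
    q'@(lq : _) = norm q
    dq = length q'
    go r
      | length r < dq = r
      | otherwise =
          let c   = head r / lq
              sub = map (* c) q' ++ replicate (length r - dq) 0
          in go (norm (tail (zipWith (-) r sub)))

-- Sturm sequence: p, p', then p_{k+1} = - rem(p_{k-1}, p_k),
-- cut off at the zero polynomial (laziness keeps this safe).
sturm :: Poly -> [Poly]
sturm p = takeWhile (not . null) (chain (norm p) (norm (deriv p)))
  where chain a b = a : chain b (norm (map negate (polyRem a b)))

-- Horner evaluation.
eval :: Poly -> Rational -> Rational
eval p x = foldl (\acc c -> acc * x + c) 0 p

-- Count sign changes in a sequence, ignoring zeros.
signChanges :: [Rational] -> Int
signChanges xs = length (filter id (zipWith (/=) ss (tail ss)))
  where ss = filter (/= 0) (map signum xs)

-- Sturm's theorem: number of distinct real roots of p in (a, b],
-- assuming p(a) /= 0 and p(b) /= 0.
rootsIn :: Poly -> Rational -> Rational -> Int
rootsIn p a b = count a - count b
  where ps      = sturm p
        count x = signChanges (map (`eval` x) ps)

-- Example: x^2 - 2 has two real roots in (-2, 2].
main :: IO ()
main = print (rootsIn [1, 0, -2] (-2) 2)   -- prints 2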
2. Here is my proof of Sturm's Theorem:
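(For reference, the statement being proved: if p is a real polynomial with p(a) /= 0 and p(b) /= 0, and V(x) denotes the number of sign changes in the Sturm sequence p_0 = p, p_1 = p', p_{k+1} = -rem(p_{k-1}, p_k) evaluated at x, then the number of distinct real roots of p in (a, b] is exactly V(a) - V(b).)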
Friday, May 24, 2013
Day V: Summer School on Formal Techniques
Putting Numerical Abstract Domain to Work: A Study of Array-Bound Checking for C Programs (by Arnaud J. Venet, slides):
1. Review of abstract interpretation using the polyhedra abstract domain;
My question: what is the difference between intersecting polyhedra and the narrowing operator?
2. Gauge domain: a modified interval domain that helps scale the operations within loops (specifically, widening and convex-hull operations); see the interval-widening sketch after this list;
3. He uses the New Polka library and XXX's fixed-point algorithm:
a. Issue: the program crashed during widening because of too many dimensions (i.e. program variables). Solution: eliminate unimportant variables;
b. Issue: it still crashed. Solution: directly propagate unchanged constraints rather than recomputing them again and again. Solved.
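Since widening is what makes these loop analyses terminate, here is a tiny sketch of the classic interval widening operator in Haskell. This illustrates the general idea only; it is my own example, not the gauge domain from the talk.

```haskell
-- Interval bounds, possibly infinite. The derived Ord gives
-- NegInf < Fin n < PosInf, with Fin ordered by its value.
data Bound = NegInf | Fin Integer | PosInf
  deriving (Eq, Ord, Show)

-- An abstract value: bottom (unreachable) or an interval [lo, hi].
data Interval = Bot | Iv Bound Bound
  deriving (Eq, Show)

-- Join: least upper bound (convex hull of two intervals).
join :: Interval -> Interval -> Interval
join Bot i = i
join i Bot = i
join (Iv a b) (Iv c d) = Iv (min a c) (max b d)

-- Widening: any bound that grew between iterations jumps to
-- infinity, so the ascending chain stabilizes in finitely many steps.
widen :: Interval -> Interval -> Interval
widen Bot i = i
widen i Bot = i
widen (Iv a b) (Iv c d) =
  Iv (if c < a then NegInf else a) (if d > b then PosInf else b)

-- Example: a loop counter seen as [0,0] then [0,1] widens straight
-- to [0, +inf), after which the iteration is stable.
main :: IO ()
main = print (widen (Iv (Fin 0) (Fin 0)) (Iv (Fin 0) (Fin 1)))
```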
Verified Programming in VCC (by Ernie Cohen, slides):
1. VCC relies heavily on ghost data, which costs zero execution time;
2. VCC performs no invariant check when it applies the ownership operation to objects;
3. Verification of concurrent programs: lock-free algorithms; a lock is modeled as a container.
Program Verification and Synthesis as Horn-like Constraint Solving (by Andrey Rybalchenko):
1. Verification using temporal logic (especially CTL);
2. Abstract interpretation via Horn-like constraint solving (using interpolants); a small example encoding follows.
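As a standard toy illustration of the encoding (my example, not one from the talk): proving the assertion in "x := 0; assume n >= 0; while (x < n) x := x + 1; assert (x == n)" amounts to solving the Horn clauses n >= 0 -> Inv(0, n); Inv(x, n) /\ x < n -> Inv(x + 1, n); Inv(x, n) /\ x >= n /\ x /= n -> false. The interpretation Inv(x, n) := x <= n satisfies all three clauses, and such solutions are exactly the loop invariants the verifier needs.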
Lab (by Ernie Cohen):
1. Write your own binary search program in C;
2. Use VCC (either in Visual Studio or via the online version at http://rise4fun.com/Vcc) to verify the code;
3. The VCC annotations you need to write: function-level (pre-conditions using requires; post-conditions using ensures; termination using decreases) and loop-level (invariant specifications).
4. My original code had 27 bugs, all of which were related to overflow. After fixing it for an hour, I got it bug-free under VCC verification.
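A classic source of such overflow bugs is computing the midpoint as (low + high) / 2, which can wrap around when low and high are both large; the usual fix is low + (high - low) / 2. Here is a small sketch of the repaired search, transcribed into Haskell to match the snippet earlier in this blog (my illustration, not the original C code from the lab):

```haskell
import Data.Array

-- Binary search over a sorted array, returning an index of the key.
-- The midpoint is lo + (hi - lo) `div` 2 rather than (lo + hi) `div` 2;
-- with fixed-width indices (as in the C version verified by VCC),
-- the latter can overflow when lo and hi are both large.
bsearch :: Ord a => Array Int a -> a -> Maybe Int
bsearch arr key = go lo0 hi0
  where
    (lo0, hi0) = bounds arr
    go lo hi
      | lo > hi   = Nothing
      | otherwise =
          let mid = lo + (hi - lo) `div` 2
          in case compare (arr ! mid) key of
               LT -> go (mid + 1) hi
               EQ -> Just mid
               GT -> go lo (mid - 1)

main :: IO ()
main = print (bsearch (listArray (0, 4) [1, 3, 5, 7, 9 :: Int]) 7)  -- Just 3
```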