vlion: cut of the flammarion woodcut, colored (Default)
Feeling rather more mentally perky this week. I think I've finally started to adjust to my job and can resume prodding the thesis. I've identified the logic error my code is making. However, the only fix I can naively see involves actually parsing a machine instruction's encoding and fiddling with it from there. I don't know that I want to go that route yet; it will be tremendously technical, with an enormous propensity for bugs. I shall have to contemplate it tomorrow and contact the company with the concept to see what their solution is. I need to break through this wall.

I am starting to look at Hindley-Milner type systems in a philosophical sense. In particular, it seems to be claimed that System F/HM type systems correspond to (consistent?) intuitionistic second-order logic.
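To ground that correspondence a little: under Curry-Howard, a well-typed term is read as a proof of its type viewed as a proposition. Function composition, for instance, is a proof of the implication (A -> B) -> ((B -> C) -> (A -> C)). A minimal sketch in Python, with type hints standing in for the HM types (the function name and variable names are my own):

```python
from typing import Callable, TypeVar

A = TypeVar("A")
B = TypeVar("B")
C = TypeVar("C")

def compose(f: Callable[[A], B], g: Callable[[B], C]) -> Callable[[A], C]:
    # Read as a proof: given evidence for A -> B and B -> C,
    # construct evidence for A -> C. The term is the proof.
    return lambda a: g(f(a))
```

The body is the only possible (total, non-cheating) inhabitant of that type, which is the flavor of the types-as-propositions claim.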

After spending entirely too much time on Wikipedia, I am phenomenally unconvinced that intuitionistic logic has any meaningful validity outside of being a subset of formalist logic. I see the formalist idea of math as grammatical transformations on strings played out all too exactly in computer science for me to give the "anti-formalist" school significant credence. Brouwer seems to have had fuzzy ideas, from what I can read about him.

Another concept that bothers me is the claim that type systems, in and of themselves, prove programs valid (i.e., the Curry-Howard isomorphism). In my work with type systems in the Java and C++ sense, it's quite trivial to bollix up your work whether the types match or not.
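A toy illustration of that gap: the following function type-checks under any checker, yet computes the wrong thing. The type only guarantees that some function of this shape exists; it says nothing about which one (the example is mine):

```python
def max_of(a: int, b: int) -> int:
    # The signature is satisfied, but the comparison is inverted:
    # this returns the smaller argument, not the larger one.
    return a if a < b else b
```

The usual reconciliation is that Curry-Howard only says the type is the proposition being proved; a Java-style type like (int, int) -> int is a proposition far too weak to pin down the intended behavior, whereas richer (dependent) types can.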

I have an... intuition... that a type system on a language reduces its expressive power below that of an untyped language that maps onto a universal Turing machine (neglecting the question of finite memory, ofc). According to the ever-correct Wikipedia, most type systems are consistent, which, if I'm not mistaken (and I might be), would indicate less than full computational power.

One of the classic issues with formal proofs about computer languages is that the actual computer (memory and I/O) gives formal methods hideous fits. One reason the seL4 kernel from NICTA was such a breakthrough was that it built on work that partitioned pointers and memory into a more manageable model.
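A concrete instance of the memory headache is aliasing: two names for the same mutable cell defeat naive local reasoning, which is part of why verified systems work needs a disciplined pointer model. A toy illustration in Python (mutable lists standing in for memory cells):

```python
# Aliasing: 'b' is another name for the same mutable cell as 'a'.
a = [0]
b = a          # no copy; b aliases a
b[0] = 42      # a write through b...

# ...invalidates any "local" claim that a was untouched.
# A naive Hoare-style argument reasoning only about statements
# mentioning 'a' would conclude a[0] == 0, and be wrong.
assert a[0] == 42
```

Separation-style reasoning (roughly: proving which pointers cannot alias, so heap claims can be localized) is the kind of partitioning that made proofs at seL4's scale tractable.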

I think if I wind up doing research in this area, I would probably start with some sort of imperative semantic approach based on a real machine, inducing semantics from mappings of the real machine to the language, and generally avoid the lambda calculus as the proof mainstay. While the lambda calculus is much more manageable for semantic analysis, it hardly seems practical when applied to more realistic languages and real computers. I suspect that building from the computer toward a unified semantics driven by Turing machines has more applicability than deriving from a lambda calculus semantics toward the computer.
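A toy sketch of that machine-first direction: a two-instruction Minsky-style counter machine whose semantics is nothing but a state-transition function on (program counter, registers). The instruction names and encoding here are my own invention, purely for illustration:

```python
# State = (pc, registers). Semantics = one pure step function.
def step(state, program):
    pc, regs = state
    op, *args = program[pc]
    if op == "inc":                       # increment a register
        (r,) = args
        return pc + 1, dict(regs, **{r: regs[r] + 1})
    if op == "jzdec":                     # jump if zero, else decrement
        r, target = args
        if regs[r] == 0:
            return target, regs
        return pc + 1, dict(regs, **{r: regs[r] - 1})
    raise ValueError(f"unknown opcode {op!r}")

def run(program, regs):
    state = (0, regs)
    while state[0] < len(program):        # falling off the end halts
        state = step(state, program)
    return state[1]
```

For example, with register z pinned at 0 as a scratch for unconditional jumps, the program [("jzdec", "b", 3), ("inc", "a"), ("jzdec", "z", 0)] adds b into a. The appeal is that language semantics could then be induced as mappings onto transition systems like this one, rather than derived from a lambda calculus core.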

Anyway.

I had best roll off to bed so I feel decent in the morning when I get onto the thesis.

Page generated Aug. 21st, 2017 07:28 pm
Powered by Dreamwidth Studios
