Tuesday, December 31, 2013

Summary, 2013

This blog got its start in July of 2007, right before its sister blog (7'oops7). As of today, there have been 199 posts with 16 categories.

The image shows the highest-read posts for the Past 30 days and for All time. Compare this list with last year's.


It's interesting that a 2008 post has been of interest, of late. That same post is 2nd on the All time list. Too, a 2013 post has gone to the top of the All time list.


We'll have to get back to the computability issues this coming year. Someone needs to address things from other than how one can screw the public (legally or otherwise) into handing over their monies. That is, so much of the modern value system deals with excessive accumulations (If you're so smart, ...). Too, the best and brightest have a wider sphere of influence, so we ought to have them under more scrutiny than the average joe. So, why is it that the best and brightest are greedy and manipulative?

Remarks:   Modified: 12/31/2013

12/31/2013 --  

Tuesday, December 3, 2013

48 hours of freedom

Just came back from a period of being unplugged (as in, no use of electronic means to talk to the cloud-based chimera - yes, as if things there are more real than what we exist amongst - but, we'll have to go on about that, almost ad infinitum, in order to characterize the issues) for 48 hours.

Now, there are many things to discuss about this. Why was it done? Many other queries come to mind.

Aside: There was one little nibble, but it was on a public computer. I had to go to espn.com to see what the heck happened to the Jayhawks. Did being on the island relax Self too much? On the same day, the 'Cats took Duke's prominence apart (wasn't long after another such lesson was administered - ah, I digress - 'tis madness, indeed). But, that little thing was only a brief look at a couple of pages.

This unplugging wasn't the first such event. Some claim that one ought to do such on a regular basis. Who agrees?

But, let's tell a story.

About a decade or so ago, those who went around with their noses to a little device were manager types. Okay? I never saw, at the time, any technical person do such. But, then, those whom I knew were dealing with more intricate matters. So, the manager types had to be e'mailed-in (and whatever else those early things did) to their peers in order to keep workers at bay and exploited. Also, what panache they exhibited (ah, we were all to kiss the feet or other).

Were they not so grand in their devotion to their job and so commanding in their presence (as in, aura of importance)? However, at the time, I didn't think much about this activity on the part of these types, as it had a good side, too. How? You see, many bosses would not touch a keyboard. It was beneath them; in fact, many had furniture in their offices that hid such things from visitors. Remember, footprints and profiles were massive in those days.

So, if some little device got them to thinking, would that not be a good thing? Wait, that turns out not to be true, as giving tools to managers has resulted in all sorts of perfidy (most of which is mis-action directed against lowly employees who only are trying to feed themselves and their families -- too, globalization's bane would not have been so harsh without the advances in computation - including networks).

Since that time, all sorts of things have come about: social media, interactive gaming, and more. Now, those who have their noses to devices are, many times, hooked (yes, we can even talk Google's glasses, augmented reality, and more). Too, that cognitive processing which relates to interfacing with these things is directly opposite that which is truly smart (ah, let the little beasties based upon our manipulation of e-m phenomena rule the roost?). But, we'll go on about that more later.


In short, that whole overlay (easily characterized) is diminishing the grandness of reality (or attempting to; it cannot suppress being - another thing/concept supposedly inimical to the modern viewpoint). But, how can that be explained? Give us time, as these things are only a decade or so old. That which really needs attention goes way back (it's in our genes and memes).

Remarks:  Modified: 01/23/2015

12/04/2013 -- 3rd try. If, at first, you don't succeed, ... (one side-effect of getting away? losing touch with the processes). Truth via computation will become increasingly important, yet, truth discernment via our traditional (age-old) means will not be replaced. Our computational prowess can be an assistant (perhaps, at times, a source) to our intuition. It will be interesting to see how the two interplay (yes, very much asserting the existence of intuition'al facilities far beyond what we might have considered - all sorts of decisions the past 10s of years have caused a divergence (chasm) that looks to be beyond span) in the future. We've seen a tremendous amount of change the past 10 years, perhaps, by another 10, the issues will be more clear (for one, that computational constraints apply as much to mathematical systems as they do to those of a logical nature).  

12/17/2013 -- History of the devices, the start - Blackberry.

01/23/2015 -- Software? Well, we are talking more than apps (latest craze). We are dealing with fundamental questions which, then, gives rise to normative issues in mathematics (and, by extension, to the computational).

Monday, November 11, 2013

Big-data daddy

Context: See Tru'eng anew, focus going forward, mathematics.


Big data? Yes, it's all one hears nowadays. If IBM has it in their ad stream, then you know it's about making money. What's next?

No one knows. But, it's nice having look backs that are insightful. Too bad that we don't have 20-20 hindsight (is foresight possible?).

An Atlantic article was a nice diversion from all those big-data dopes who push the (supposedly) smart devices that are really enmeshing agents. Dumb'd down, one might even claim (but, that's for another time). Hofstadter is from a former era. He's not the past, as his questions, and his work, are still relevant. The Atlantic had someone write about him who has a good feel for the issues.

So, reading this article is definitely worth the time.

   James Somers, "The Man Who Would Teach Machines to Think,"
   The Atlantic, November 2013

Mentioning IBM was apropos: Deep Blue and Watson? Both of these got public attention, but they're dumber than a new-born baby, even one who might later be rated as less than average. Why say that? For several reasons. But, let's just say that both of these used brute force and the bullying of the data.

Aside: oh, don't some people act similarly?

You see, the idiots running after big data's worth (essentially, pushing toward the mean - with the large numbers involved) might appear to be obviously smart (yes, as in, the bigger the pockets, the smarter someone is - ignores ill-begotten gains, plus does not recognize the pervasive putrid-ness of near-zero realities). The techniques related to this type of work have been entrapping mankind in ways that do not portend well for the future. Of course, not all of mankind, as some seem to have risen above the fray (which we could characterize several ways).

Like the article says, the approach of using large amounts of data might bring forth little nibbles of knowledge, but trying to get to truth? Won't work (we can, and will, describe how it ought to work).


These topics have all been discussed in this blog, at some time or another. Links may appear, at some point, here to those older posts. Not guaranteed, though. Perhaps, another approach is necessary.

In the meantime, it's nice to be reminded of Hofstadter's ilk (yes, those who are not chasing after that elusive buck (huge accumulation) based truth set).


The same issue has another article of note.

       All Can Be Lost: 
            The Risk of Putting Our Knowledge in the Hands of Machines

It's nice that we can get assists from our tools. And, it is our way to keep making better tools. Yet, to make the tool the focus is morally wrong (yes, golden sacks use the concept in terms of finance, so that gives me leeway); it doesn't take long, when looking around, to see the very deleterious results from the actions/decisions of the past couple of decades.

Aside: Hofstadter's book, GEB, considers the importance of Gödel's work. We've already addressed that, in part, under the guise of discussing computability, but many are not yet inclined to consider the ramifications (why? chasing after buck-based chimeras is too much fun).

Remarks:  Modified: 01/05/2015

11/11/2013 -- Yes, the Fed's data-driven illusion is on the table for discussion. 

12/31/2013 -- A popular post

03/02/2014 -- Analytics follow. Yet, we need qualitative analysis more (my put).

01/05/2015 -- Renewal, see Context line.

Thursday, October 3, 2013

Best and brightest of what?

Context: See Tru'eng anew, focus going forward, mathematics.


We'll be expanding upon this further, as we go along. The context deals with the Streets. Can the Wall ever use integrity without us laughing? Is Main really reliant upon what are the best and brightest of the Wall?

Can they be best and brightest if they don't know virtue from vice? Yes, let's talk that.

See related posts: Fed-aerated and 7oops7.


For now, we're pulling from Facebook a comment about a post at Kid Dynamic's blog. Notice that the post deals with cancelling trades, as in golden sacks was let off while another firm had to eat dirt (a lot of it). KD responded to a comment on his post, as follows: ... as for predatory behavior: markets are predatory. they're not about charity - every trade I do is with the goal of making money by being smarter than the guy on the other side. that's how it works.. that's the OBJECT..

My comment was, as follows:

We know that golden sacks was let off the hook (as in, trades that they did in error were cancelled - yes, kid gloves for these ones). We also know that an earlier incident resulted in another firm eating a big loss. So, here we see a post and discussion of these two.

Too, we know that things like high-frequency gaming, dark pools, and much more go toward a daily extraction into special (oh, so special) pockets to the detriment of the many. In other words, financial engineering, and other things (computational methods), have made things a shambles, despite the occasional shining of the chimera.

Kid writes about all of this stuff. In one comment, Kid (himself) says: "as for predatory behavior: markets are predatory. they're not about charity - every trade I do is with the goal of making money by being smarter than the guy on the other side. that's how it works.. that's the OBJECT..." Yes, 'tis a direct quote, and, right there, the guy brings out the core issue.

He does characterize the thing properly (as have many others of similar views). I ask, from whence this viewpoint (by necessity, jungle, Darwinian flavoring?) which does not a sustainable economy make? We'll get into that. Does it not seem that such types ought to be wrapped in a strong cloak and provided with a playground (sandbox) with which to work out their little immature fantasies?

That is, these ones ought not be the focus for how things will unfold in the future. Where did we get the notion that the talking heads and their graphical displays and all of the other technical marvels which tell us about the activity of a few (comparatively and to wit, DJIA, et al) ought to set the tone for our daily lives? 


That whole notion of predatory necessity is something to address. Look. There was a time that the Vikings, at will, came down and trashed (burned, pillaged, stole) England, time and again. The poor monks could not handle the vicious cultural thing. Thank God for Alfred the Great. But, are the Vikings behaving thusly now? Well, excluding the fact that we could find some types of association on a close look, most descendants of the raiders are nicely civilized (of course, Norsemen had their culture - I'm 99% northern European, by the way).

Just because the wild web, advances in mathematics, and computing prowess allow the proposition and disposition (used to be left to God) does not mean, by necessity, that we, and our unstable financial realities, must revert to 1000 years ago or so (even if Crusaders seem to be still in the offing).

Remarks:  Modified: 01/06/2015

10/04/2013 -- Oh, yes, two posts (Fed-aerated and 7oops7), but no mention of savers being slapped silly. Notice in the savers post that an image says no bullets left. Ah, yes, Ben panicked and used up his ammo. But, has he not shown all of us (and the world) that there was a whole lot of other maneuvering possible? But, too, does he know that he's cowboy'ed us into a corner?

10/27/2013 -- Ben has sacked the savers for years now, slapping them silly. Why? A WSJ article looked at high-class pawnshops a couple of days ago. These fill the need for people who need money but cannot get it from the banks (stupidity there, too). So, they use collateral for a loan and pay high interest. How high? Some pay over 200%, per year. What is Ben paying or having banks pay their savers (customers)? Way less than 1%. That is the best example of being out of whack with economic realities that one could ask for. Yet, does the Fed see? Why is the interest low? To push savers toward higher risk? To appease the gaming crowd (most likely this, as these are big-pocketed folks)? To help people afford housing (on someone else's back?)? ... Janet's take on this is unknown, but she has to know that they're looking like idiots. You know what? Most of those pawn loans are paid, even with the high interest. And, still, Ben slaps the savers (King Alan mentioned saving, of late). We ought to ask the Fed, what happened to prudence, or does it like to reward profligacy?
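To put the remark's two rates side by side (the $1,000 principal and the one-year simple-interest term below are my own illustrative assumptions, not figures from the WSJ article):

```python
# Side-by-side of the rates cited in the remark: ~200% per year on a
# pawn loan versus under 1% per year on savings. The $1,000 principal
# and one-year simple-interest term are illustrative assumptions.
principal = 1_000.0
pawn_rate = 2.00       # 200% per year
savings_rate = 0.01    # a generous 1% per year

pawn_interest = principal * pawn_rate        # what the borrower pays
savings_interest = principal * savings_rate  # what the saver earns

print(f"Pawn borrower pays: ${pawn_interest:,.2f}")
print(f"Saver earns:        ${savings_interest:,.2f}")
print(f"Gap: {pawn_interest / savings_interest:.0f}x")
```

Even before any compounding, the gap is two orders of magnitude.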

12/31/2013 -- A popular post.

01/08/2014 -- We're patiently waiting for Janet to get her feet wet. At some point, she'll get out of Ben's shadow. Hopefully, it will be soon for the savers who are being slapped silly by the day.

01/05/2015 -- Renewal, see Context line.

01/06/2015 -- Best and brightest: 3rd most read (7'oops7), 1st most read (Tru'eng), 7th most read (FEDaerated).

Sunday, September 1, 2013


WDYTYA is a TV series that dates from 2010 in the U.S. It originally aired on NBC; now, it's on TLC. In each show, some personality of note looks at their ancestry (very briefly). The story line then includes the details of their search among their tree plus particulars related to the part of the tree that they follow. 

Aside: In case you aren't familiar with the topic, it might be apropos to put a few words about magnitude when you think of your ancestors. Go back far enough (see later on Cindy Crawford), and you will have a ton of people (of course, wags can argue about the diminished influence from each, and we'll respond to such like this: nonsense. Arguable? Yes. Later. -- Consider: when they were here on their two feet, their being was as great as you might think that yours is. Now, if you don't think much of yourself, that's another issue to discuss.). Two generations from you are the four grandparents. Then, each of these has parents. So, you are talking eight families at the 1st great-grandparents. Stopping there, for a second, consider that each of the four females represents a male line that culminated with her. So, at each node, the mother (these grandparents are referred to as greats beyond the 2nd generation) has a line that is n-1, rather than n (say what? yes, every line extends back to God-only-knows how far - that is, even undocumented people stand on the shoulders of their ancestral giants - it's the height of hubris to think that things written subsume more than a slim margin of the truth -- ah, at last, how this relates to the topic of the blog -- mainly, truth, in part, is based upon humans -- many attributes to be discussed).
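For those who like to see the doubling written out, here is a minimal sketch (the function name is mine; real trees collapse some of these slots through intermarriage, so treat the counts as upper bounds):

```python
# Each generation back doubles the ancestral slots: 2**n of them at
# generation n. Intermarriage collapses slots in real trees, so this
# is an upper bound on distinct ancestors.
def ancestor_slots(generations_back: int) -> int:
    """Upper bound on distinct ancestors n generations back."""
    return 2 ** generations_back

for g in range(1, 11):
    print(f"generation {g:2d}: {ancestor_slots(g):4d} slots")
```

Generation 2 gives the four grandparents; generation 3, the eight great-grandparent families noted above; by a dozen or so generations (roughly the 1600s cohorts), the slots number in the thousands.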


When you think of people in the U.S. with "known" lineages to the first comers (say, those described in the Great Migration Project of the New England Historic Genealogical Society), by the time you count going back, you are talking thousands of families. Most do not have a filled tree, even under the best of circumstances. But, still, for those who do the work, some interesting things pop up. 


Aside: As we see with several seasons of personalities, their discoveries flesh out American history and more. Too, one sees the pain and suffering that went along with events. In fact, that many survived very hard conditions shows the quality of their character (no implication ought to be found here about the Darwinian-associated -- we'll get to that, eventually). Too, though there ought to be some kudos for the views of the editors, one does get more than the sanitized perspective so loved by the historians (abstractionist leaners (yeah, computer types, especially) need to realize that truth is at the core -- yes, it's very smelly -- core? we'll get there). That is, the real person came out of the presentation -- kudos, indeed, to those who created these episodes.

Disclosure: I've enjoyed this series from the beginning (plus the Roots one of Prof. Gates). WDYTYA, essentially, started less than a year after I first got into historic genealogy. On the first episode (Sarah Jessica Parker), I saw familiar names and really liked pointing out cousins. But, then, that gets old, as one finds all sorts of these relationships. Now, it might be fun to put numbers on it (oh, x-cousin, y-removed), but that gets stale, too. Another thing, early on, was the show following ancestral links back across the pond. It was nice to see this under the guidance of experts who, supposedly, would not let things go astray (as, we have seen mischief all along with genealogy - ah, where are the ethics?). So, we saw people go back to gentrified folks (gentrified? think of these four classes - [...] (none ever granted or earned), (...] (forfeiture via rebellion or otherwise), [...) (favored, as have to start somewhere), (...) (blessed - how many of these?)), and to others. Too, the subjects seem to not all go back to one area (as in, covering a lot of Europe and beyond; planned? serendipity?).


BTW, as the seasons progressed, the personality became more the researcher, with helpers in the background. At the beginning, the experts were more prominent. So, we have the recent episode of Cindy Crawford as an example. She started with one of her grandparents and went backward to New England fairly quickly, which then got my interest. After some wandering there, she went back to England.

Aside: I think that Lisa Kudrow has done a good job. In fact, this season has a coverage that is very nice. Say, the Civil War, War of 1812, WWII, and more. And, Christina Applegate's story was heartbreaking; she was brave to let the edit stand. 


Now, Cindy's episode left a bunch of questions about what happened over here. Take Connecticut. Some now celebrate the venture there. Well, it sounds like New Haven (purists - oh Lord, protect us from such) was not heaven on earth (but, then, earthly heaven would be more like Merry Mount -- sorry, comes from hearing Dr. Lucy Worsley talk about Charles II). Cindy's ancestor left his kids there (we didn't find out what happened to them all). 

Then, we hear that the guy supported Cromwell. Well, we all have found such in our trees. My beef with Cromwell, if I had one, would be his treatment of the Irish, but, in that regard, he joins a large set of evil doers. And, I have a lot more to read about all of this stuff. There were not many angels written about (hence, the reference to the undocumented above - except for the likes of Saint Margaret of Scotland, perhaps).

Later, Cindy learns that she is a Charlemagne descendant through her ancestor. After seeing her visit, I did go back to read about where he was buried and a little more.

Aside: One thing that struck me was the remarks by the white-haired guy (I've seen him somewhere) in regard to why everyone goes ga-ga about Charlie. I've wondered, as he did not have influence in that area now seen as Britain, or eastern Europe, or a number of other places. The guy said that Charlie (his descendants are Charlie's minors) was the father of Europe. Perhaps so, as they split up his realm for (some of) his grandsons. Also, we're talking western Europe; history tells us that the resulting littler domains kept quarreling. Probably we would find some relation from this to those cousins bickering during the War of the Roses. BTW, in terms of the classes above, Charlie's ancestry is of the [...) variety. Who can really go back further sufficiently to not raise serious criticism - meaning, by other than wags?


There are many ways for truth engineering to be interested in the past, including genealogy and memes. All here manifest some feat of survival over unknown amounts of time. Some have leaned toward genetic analysis, yet that turns out to not be the silver bullet that one might think. Are we missing something? Essentially, yes. But, we'll get there in our own time (as said, perhaps, other than PTIME).


Remarks:  Modified: 05/28/2015

09/02/2013 -- Given the way the western world works, entry of the class of (...] might have happened due to no issue or only female issue (sorry, ladies). I like the forfeiture part (have run across it enough times to start a sandbox). Ireland? Poor dears were run over countless times. Does that make them all saints? ... By the way, being a favorite does not always pan out (poor Despenser - carved up as Isabella cavorted).

12/31/2013 -- A popular post.

05/28/2015 -- White-haired guy? RCA of NEHGS, of course.

Thursday, August 29, 2013

Genealogy and memes

A recent WDYTYA highlighted part of Cindy Crawford's ancestry and showed her descent from Charlemagne. Someone estimated that 1000 hours of research had gone into this particular episode. These shows, and ads by sites like ancestry.com, have increased the interest in genealogy.

Of those 1000 hours, how many were related to backtracking from a dead-end? Humans have some type of reduction/closure operator that pares graphs in an ex-post-facto fashion so that the details (grunt work? - let's hear it for the Marines and soldiers who have their feet on the ground - and those who get themselves daily into the dirt for the collective well-being) get shuffled out. A look back then sees a glorified (dusted off) reality. Actually, if you look at the state-of-the-art of planning and controlling (it's under a general topic called, by some, earned value), it's too easy to miscalculate what is necessary going forward to accomplish some task (actually, we don't have 20-20 hindsight, either). To wit? Microsoft, Boeing, ... (it's a very long list, folks).
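Since earned value came up, a minimal sketch of its standard formulas may help (all numbers below are made up for illustration; CPI, SPI, and EAC are the usual earned-value measures):

```python
# A minimal earned-value sketch (standard EVM formulas; every number
# here is made up for illustration).
budget_at_completion = 200_000.0  # BAC: total planned budget
planned_value = 100_000.0         # PV: work scheduled to date ($)
earned_value = 80_000.0           # EV: work actually performed ($)
actual_cost = 95_000.0            # AC: what that work actually cost

cpi = earned_value / actual_cost      # cost performance index
spi = earned_value / planned_value    # schedule performance index
eac = budget_at_completion / cpi      # a common estimate at completion

print(f"CPI = {cpi:.2f} (under 1.0: over budget)")
print(f"SPI = {spi:.2f} (under 1.0: behind schedule)")
print(f"EAC = ${eac:,.0f} vs plan of ${budget_at_completion:,.0f}")
```

The point of the paragraph stands: the formulas are easy; feeding them honest numbers, and reading the results without that closure operator kicking in, is the hard part.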


We had a similar increase in genealogical interest here around the turn of the century (the 1900 one). At that time, we had the 300th anniversary of Jamestown and New England. Too, it was a little past the 100th of the American Revolution. Now, we have seen, and are looking forward to, 400th anniversaries.

Aside: Our truth interests are several. Firstly, humans are directly involved with truth. We'll go into that further. Too, memes come into play when considering humans and their progress/regress.


For perspective, at the cohort level where some of Cindy's ancestors took the journey to this side of the pond (1600s), she would have 1000s of ancestors. What we see, or can retrieve, are single threads, sometimes twisted through inter-marriage. That means that there are holes in the tree; normal people don't have a filled-in tree, say as compared to that of Princes William and Harry.

As a means to discriminate against the unworthy, the upper crust had to be vigilant, since they didn't want interlopers. Except, over here, people started to do data collecting, and analysis, early on (there are wonderful collections; yet, fires and other disasters have taken away a lot -- even into the current era - the U.S. Army's records site in Saint Louis, MO had a major fire in 1973 -- as you would guess, with losses).

So, being able to prove a lineage is more uncommon than we would like to consider. Most people have large holes in their trees. Or, some like to think of walls (solid material, like bricks) that keep one from venturing back. Trying to fill these in (or climb over the barrier) motivates a lot of work.

Aside: Any accumulation of genealogical knowledge resulted from prior work. It would be nice if it were like mathematics where later theorems, that are extensions, explicitly reference the former work. Not giving credit where due seems to be a norm in genealogy.

Aside: With Charlemagne, some still argue about the lines. Of course, Medievalists do more than just think of lineage. We need people to think of cohorts of Charlie or the state of the world when he was there (say, year of birth, 742).


Just as we see with genealogy, no problem/solution set is completely filled in. That idea, folks, is a panacea. The usual retort is that there ain't no silver bullet. But, we need to take it further (and will). A few comments, of late here, have involved the need to think of "singularities." Perhaps, that is an overused term (concept), so we'll need to coin something. The issue is partly mathematical, in scope.


Remarks:  Modified: 03/03/2014

08/30/2013 -- I browsed today a recent book on Charles II's ancestry. It has a recent publication date. Well, at about the eighth generation, a hole appeared. Now, this would carry forward. And, one could probably find holes later on. Now, remember, eight generations is a couple of centuries. I don't think that anyone here would have a filled-in tree going back 200 years. If there is someone who is close (say, as close as would be the Princes - see above), I would like to hear about it.

09/01/2013 -- After seeing the Cindy episode in which she visits Charlie's grave, there is a guy (seen him before) who pompously says that Charlie was the father of Europe and Cindy ought to be proud (or something like that) about Charlie being her grandfather. Perhaps he was the father of Europe, as, after his son's death, they divided the kingdom amongst (some of) his grandsons. And, along with fighting the frequent raiders from outside (too numerous to name), there was fighting amongst cousins. Makes one wonder how these Christian countries adopted such practices as the Savior never condoned. Could there have been some type of united Europe back then? Well, the militaristic (aggressive) nature of some peoples, say Muslims, may argue against being too insular and defenseless. Or even the Vikings who found the monks and their property too easy to pilfer. So, are there lessons that we can learn from so long ago, 50+ generations?

09/02/2013 -- Looking forward to future episodes.

03/03/2014 -- Example that will be the focus.

Monday, August 5, 2013

Mathematics I

Context: See Tru'eng anew, focus going forward, mathematics.



Our collective friend, Hitchens, said many things that we ought to think about now and then. And, based upon the websites that are quoting Hitchens, there are many who are doing just that. One of his quotes actually made it to the TED (deals with current, and future, technological issues) site.

In short (paraphrased), assertions without evidence can be dismissed without evidence. When I first saw that quote being reported in some web page from some source (actually, Google brings up several sites - but, my first awareness was several years ago), I felt that I would have responded to him, something like this (had I the quick wit under the spotlight and been in his presence): like axioms? You see, Hitchens was many times pushed beyond the point where he said that he didn't know. Yes, he said "didn't know" many times; but, whether given his inherent talent at showmanship or mere frustration at putting idiots in a bottle or whatever other motive, Hitchens would go off on his erudition-showing response (sometimes, these were rants - however, well considered and stated).


Aside: Some attribute the quote to Richard Dawkins who made a similar statement. Wikipedia points to what is known as Hitchens' Razor with references. And, it's really a re-phrasing of an age-old slogan.


A constructive look:

Hitchens did not deal with mathematics, but he did respect results that were founded upon such. You see, that would be most of modern scientific models ("most" is arguable; however, we're talking 2000 plus and the fact that about every discipline has mathematized its knowledge base - many times to a large degree -- I ask, what discipline with any modicum of the quantitative does not use advanced statistical methods?). So, who of the modern domains has not used mathematics?

It turns out that there may be many who are not so learned (not meant pejoratively). As, this revolves around a central issue related to numeracy, or lack thereof, that has insidious implications for the future. We'll get back to that. Expect though, for me to show that numeracy limits us, many ways (not arguing against the great ideas founded upon numbers and the computational -- rather, consider, can our artifacts subsume Being?).

The basics:

We'll first have to deal with what exactly is this thing called "mathematics?" We can start with a few little pointers here and branch out. Please note, this approach is pseudo-constructive (albeit, not to the level of Bourbaki - yet, effective in its own way, as we will show).
  1. An encyclopedia is a good place to start: Wolfram's MathWorld comes to mind. I have watched this grow (and, even, to a small extent, contributed). As well, the Wikipedia view is phenomenal (actually, what I love about Wiki is that I can look at something (that is well-sourced, usually) without running down myriad papers and books). By studying this type of material, one could get a fairly good idea of what mathematics is, though several issues, such as motivation, would not be as apparent as could be. As Plato thought, arithmetic can be one of the starting points for mathematical education. Geometry, and its use in our world, is the next step. 
  2. We can also take a behavioral view and say this: mathematics is what mathematicians do (this little thing out of St. Andrews is wonderful - please look at the Mathematician of the day page). If we cared to study all that has been done, then we might have some idea of what this field encompasses. But, much of mathematics is mental (and, we don't normally expect people to read minds). One may see symbols and numbers, yet, their use is not always intuitively obvious. On the other hand, we can see what has been labelled under the concept, even if future perturbations, and extensions, would not be seen using this descriptive approach. However, there is one little thing that lurks: is mathematics more than what people do (next bullet)?  
  3. Ah, metaphysics coming in? Let's keep it simple, as we could discuss, for a long time, what might be called the differences between big "M" (Mathematics) and little "m" (mathematics). How can we circumvent that? You see, if you look at MathWorld (or what Mathematica does), you'll see all sorts of operative marvels (basis of STEM) with results that astound (our technological age) when looked at closely. Yet, just ask yourself if the reality behind whatever is being expressed (or computed or manipulated) by our artifact (assuming small m, okay?) is equivalent to what we understand via our use of the artifact? Another way to think of it is to consider the map-territory problem. First, extend "map" to encompass the sum total of our prowess with mathematics. The territory? Yes, we would really like to know (but it's not the former). In short, we tend to get these two merged (map-territory) as there seems to be no real way to do the differentiation (ability to note differences) many times (as in, that particular action is underdetermined - note, please, the poster is very much aware of proof-theoretic powers - yet, the poster sees the importance of being aware of quasi-empiricism and its issues).   
  4. Is mathematics that which can be built from axioms and proofs? We all know that mathematics involves theorems. Some of these are applicable to our lives; others are beyond the comprehension of most (if you listen to von Neumann, he would say beyond the comprehension of any - see first quote). Now, there are many ways to argue proof methods, but logic is involved. Then, that brings in knowledge and how we know. Also, we need to consider Russell's and Whitehead's efforts to put a firm basis under logic. It took hundreds of pages to get to proving (by terse equations) simple addition, which we can teach early on to the human mind. Too, Russell, and later work, brought in computability issues which may or may not be problematic to the future of automation (ah, letting the cat out of the bag - the web is laden, heavily, with pseudo-mathematical pursuits that need our scrutiny).           
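The Principia point (hundreds of pages before simple addition is proved) can be contrasted with what a modern proof assistant does. A minimal sketch, assuming Lean 4, where the same fact falls out of the definitions:

```lean
-- In Lean, 1 + 1 = 2 holds definitionally for the natural numbers:
example : 1 + 1 = 2 := rfl

-- Closer to Principia's spirit, spelled out with successors;
-- Nat.succ 0 + Nat.succ 0 reduces to Nat.succ (Nat.succ 0) by
-- the recursive definition of addition, so rfl still closes it.
example : Nat.succ 0 + Nat.succ 0 = Nat.succ (Nat.succ 0) := rfl
```

This doesn't settle the foundational question, of course; it just shows how much of Russell's and Whitehead's labor is now folded into the definitions that the machine unwinds for us.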
Now, that listing is not trivial. But, the list does include the important things and has a progression that we'll be looking at in depth (basis for truth engineering's necessity and operational look).

  1. What? We know that we have mathematics and its use; such is abundantly evident. Much work in mathematics has been done over the years. The web has allowed an acceleration of means to get access to information about mathematics and provides a common platform for discussion and use. Marvelous techniques exist. The trend seems to be toward automation of some of these. The truth, folks, is that higher-order computation (as in advanced mathematics) requires, generally, human involvement in two, and perhaps more, areas. One is input (see the qualification problem as an example -- numeric processes have oodles of decision states prior to execution - in other words, setting up a solution attempt is a creative task). Then, we have output issues dealing with more than interpretation. For instance, ramification is important. 
  2. Who? We see people doing mathematics. Only some of those are involved with computation. But, certain classes of problems are computationally oriented and will continue to be so as technological progress improves computational artifacts and methods (of a very wide variety). That there is motivation to learn and do mathematics seems obvious, too. We have not seen a decline in the interest. More later. 
  3. Where? As we use mathematics, the larger picture does not go away. But, we'll defer all of that til later. 
  4. How? Ah, there is controversy about hubris and its appearance. It's almost impossible to know this up-front. In retrospect, we see it, or think that we do, many times. Unfortunately, right now, we have the finance community running off after multitudes of seemingly smart approaches; we need to push some type of sand box (or, at least, for now, some notion thereof) into the awareness of these folks; this we would like to demonstrate.
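The "What?" point above -- that a numeric process has oodles of decision states prior to execution -- shows up in even the simplest setup choice. A minimal sketch in plain Python (the values are made up for illustration): how one decides to sum four numbers is itself a decision that changes the floating-point answer.

```python
import math

# The same four numbers, summed two ways. Naive left-to-right
# summation absorbs one of the 1.0s into 1e16 (where the spacing
# between adjacent doubles is 2.0); compensated summation does not.
vals = [1e16, 1.0, -1e16, 1.0]

naive = sum(vals)        # ((1e16 + 1.0) - 1e16) + 1.0
exact = math.fsum(vals)  # compensated (Shewchuk-style) summation

print(naive)  # 1.0 -- one addend was lost to rounding
print(exact)  # 2.0 -- the mathematically correct sum
```

Setting up a solution attempt is, indeed, a creative task: neither result is "the computer being wrong"; each follows from a choice made before any execution.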

As stated at the start, there was some nod to Hitchens in this post albeit we are dealing with mathematics. The content here takes a broad view and tries to identify the major pieces. Getting into all of the nuances will be on-going, albeit the looks will be more broad than deep (for awhile, at least - at some point, we'll have to take on a few of the more important points in a technical manner).  


06/25/2015 -- ACM Communications had an article (Created Computed Universe) that suggests that our computational prowess ought to lead to agnosticism rather than to anything else. Of course, my initial remark: So many modern minds conjure and contort in order to introduce what is not much different than what some knew many millennia ago in the desert.

01/23/2015 -- Software? Well, we are talking more than apps (latest craze). We are dealing with fundamental questions which, then, give rise to normative issues in mathematics (and, by extension, to the computational).

01/05/2015 -- Added context line at top. We're at a renew point.

03/03/2014 -- Acknowledgements, including math pedigree, will be expanded.

08/06/2013 -- Investigative journalism likes its Five Ws. That is fine for their topical views. To get technical, we have to add in, at least, an H (How?). Too, we need to change the order, according to various factors: domain, focal area, and more.

So, the order of the above list is "What?, Who?, Where?, How?."  There are other orders. For instance, "How?" could have come before "Where?" in the list. That, though, would have changed the emphasis (recall, truth engineering is the overarching theme).

We're ignoring Why? (why not?) and When? (ah, history of mathematics -- the future, too?).

08/05/2013 -- In this interview, Hitchens uses "horribly un-reflective" (around 12:00) in response to his Iraq stance's outcome (turmoil amongst those who wondered where he was coming from?) and whether he had, or not, tempered the position that he had taken at the time. So he admitted to being reflective, which we would all have known anyway. As said above, lots of his positions were forced by reacting to idiots and their attempts to master him.

Modified: 06/25/2015

Sunday, July 28, 2013

Biases can be good

One moral could be that if we try to be without biases (and to always deal from first principles), we easily get into states of "analysis paralysis" in which we become Congress-like (do nothingness). On the other hand, we can realize that fore-knowledge (several connotations to consider) has a lot of situational usefulness to folks who perform (or succeed or any number of other things).


There have been many incidents, of late (splashed across the spaces that are perturbed by news people, mainly for the sake of keeping turmoil to the maximum), that bring to the fore the fact that people process, cognitively and under the stress of handling potentially overwhelming input (senses and more), their load (from multitudinous sources) using pre-suppositions that are based upon several things: prior experience, what they've learned (redundancy, I know), superposition by a stronger mind (if you would, the influence of dominating (overbearing) persons -- many types here that we'll get into eventually), and a whole lot more.

That is (and, one can argue the issue all sorts of ways), how people go through their daily lives deals a lot with what can be called "biases" (please, drop the pejorative notions (reactions, except, also be aware of your own biases), for a bit, okay?) which are helpers as much as hinder'ers.

So, let's accept that fact and spend some time looking at how this little trick helps us daily (as in, understanding the usefulness and how best to not let it carry us away).


By the way, the most prominent bias on the planet resides in the minds of the highly educated (we'll get there, too; however, if you look at posts in this, and the related, blogs, you'll see plenty of references to this). And, also, those types are the worst at pointing fingers at the biases of others.

Aside: please note that I said educated, and not smart (yes, the ability to ace tests is not as strongly advanced as many seem to want to think). Of course, autodidactism will be a key issue to look at.

The insidious part of those educated types is that everyone else suffers the consequences of their actions. Some little poor person's sphere of influence is much smaller (thereby, more amenable to the scrutiny of more minds -- those who shield themselves behind chimeras make use of this dynamic; we'll get there, too).

Aside: if you look at the economy, and what might be called criminal activity thereof, mostly, bad behavior dealing with smaller amounts is slapped harder than that of those considered too big to fail (entrappers). We just saw that, up close and personal (still a problem despite the bullish aspect of the chimera's playground). Same goes for biases. Some pre-supposition that is wrapped with silly, abstracted jargon is as much a bias as is a knee-jerk reaction.


Now, in the context of IJCAI (International Joint Conference on Artificial Intelligence), there was a talk titled: Why biased minds make better inferences. In other words, being of a homo heuristicus variety is part of our nature. Also, notice that the talk uses an example related to actually doing something (as opposed to what we see in DC now, with the politicos, who have themselves wrapped so much in their biases that they are essentially ineffectual).

Aside: as opposed to algos. But, this talk is only one of many things to discuss, so don't get hung up on arguing about this one speech. However, notice that the talk does touch upon some modern techniques (about which we have made allusions) being done by those who need to acquire some familiarity with (even respect for) the issues of quasi-empiricism.


This topic was mentioned in an earlier post, about Baruch. But, it needs to be more prominently seen as an important part of what truth engineering is all about. There is a lot of work to do here.

But, for now, the message is for you to exalt in your biases. Embrace them. Thank them for helping you to make it through the day. And, know that progress along the scale of human advancement (I know, all sorts of arguable points here) does not entail that you lose your biases; rather, tuning them is what we need to learn how to do.

Aside: we want to mention Bayes, and his work. Even with this data-driven process, we can have biases (expectation being of essence). Thomas must be rolling over in his grave to see all of the misuse of his ideas that are the basis for the work (actually, raking in the money) of  Zuck, and the like. But, we'll get to that to boot, as money and biases go hand in hand.
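The Bayes aside can be made concrete: in a single application of Bayes' rule, the prior plays exactly the role of the "bias" discussed above, and data merely tunes it. A minimal sketch in Python; the probabilities are hypothetical numbers chosen for illustration, and `bayes_update` is a name of my own invention.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """One step of Bayes' rule:
    P(H|E) = P(E|H) P(H) / (P(E|H) P(H) + P(E|~H) P(~H))."""
    num = p_e_given_h * prior
    den = num + p_e_given_not_h * (1.0 - prior)
    return num / den

# A strong "bias" (prior 0.9) meets evidence that favors not-H
# (likelihoods 0.2 vs 0.8): the belief is tuned down, not erased.
posterior = bayes_update(0.9, 0.2, 0.8)
print(round(posterior, 3))  # 0.692
```

Which matches the closing message of the post: the bias (prior) is not discarded in the face of data; it is updated -- tuning, not losing.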


We'll have to pick up several threads that were left dangling (our Basis, for example).

Also, just as biases can be good, we also know that there is a down side. Unfortunately, it's that latter that gets the attention. How our biases ought to be managed (by ourselves, okay) deals very much with truth and its wonders.


12/31/2013 -- A popular post.

Modified: 12/31/2013

Saturday, July 13, 2013

CMS, again

CMS? Yes, one has to state the context within which the acronym (assuming that it is such) needs to be interpreted. So, we're talking computing, but there are, at least, two ways to view this. Both are important.

Earlier, we started down the content management track since Microsoft decided to remove OfficeLive users from their happy ways. In fact, some had built business processes (okay, wrong choice on their part) upon the OfficeLive stuff. These people had to scramble to find a replacement resource.

For us, OfficeLive offered nice templates and a good front end to building a website. Did I mention that it was free? So, OfficeLive going away forced me to have to get technical.

Aside: I've put together fairly productive sets of software systems on a good laptop using free software. And, the functionality rivaled what a company would pay for. And, "code" (as in what drives computation and decision making) was my focus for decades. As some have argued, it was the reality. However, prior to the content management study, I had consciously refrained from doing code. So, what could have been more interesting than code? Lots of things. Mathematics, definitely, if one doesn't go down the applied path beyond things that can be done on paper (don't laugh, we used to do algorithm testing by hand in the early days). Modeling of all sorts will always be of interest. Why? Map-territory issues will require us to resolve the virtual and the real (foregoing, for the moment, the differences between these two). One reality will be the sensors (almost like QE, sensors now and forever). Wait! Google glass having a potential use beyond mere gadgetry? ... How about History? For instance, there has been a lot of effort put into historic genealogical work (say, NEHGS). There is the History that we read about in books. Then, there's the existential affair of people and details and more. What about Memes in this context? Dawkins, perhaps tongue-in-cheek, proposed the concept to handle the messy things that don't fit well into theory. But, you know what? All this fits and starts with computation, networking, and more? To me it looks like a recapitulation with all sorts of ontological underpinnings.

So, after the initial look (early 2012), I realized that I needed to get up to speed. But, I had neither the time nor the interest (see comments on WordPress, Drupal, Joomla). I set up a few pages using these three but was not happy with the results that I would get without a serious attempt at learning how they worked. So, what did I do? I went back to what I knew (static HTML - after all, I was doing markup prior to HTML; too, I had many websites in place from the Mosaic days onward). The main problem with HTML was finding a WYSIWYG editor, as I didn't want to pay for Front Page or any other commercial tool. Guess what? Sea Monkey's little Composer is nice (not perfect, by far). So, this Site is the basis for all comparisons (OfficeLive's little thing, that worked for years, disappeared).


Wait! What's another meaning of CMS? Configuration management. In both cases, the "s" is for system.


So, of late, I've realized that I let the ball drop. Too, requirements were looming that would force me to either code something (not that I couldn't, but, again time and energy are limited). So, I had to get back to the garden (I had already learned that youtube has more than nice music videos - it is nice to see things from decades ago, ah), so to speak. I had been watching classes for awhile. No structure, just following interest. One wonderful video of a mathematics class at Stanford sold me on the approach of video. As you know, all of the schools have quality items there: Harvard, MIT, etc.

So, I looked for a WordPress video. Fortunately, there was one that showed how you could have a site up in one hour. That video set the standard, one hour. There's no real content to the site that I built as I followed the video. But, it's there. And, I can make comparisons with other approaches.

Not seeing what I wanted in WordPress, I went to Drupal. First off, I couldn't find a video that claimed setting up a site in one hour (Mind you, part of the time is getting the site's name, etc. I was already set up with that.). But, I made an attempt. My reaction was that it was for coders. Everywhere, it looked like I was going to have to do HTML.

So, I looked for Joomla videos. There was a one-hour site claim, using Joomla 3. I had 2.5 installed and installed the newer version. So, I followed the video religiously. Where the guy had images or text, I just made things up. Here is the first result (here is the video - again, there were highly detailed subjects being covered in other videos which I'll get back to -- the requirement was to get something running).

After the first pass, I decided to build a site that I need to maintain. To do this, I followed the video, again, but at a faster clip. Here is the result. As I tried to add new things, I've gone back to the video (have not read a line of documentation yet -- I used to read part of the book (when learning some new language) and then jump in -- or, in class, I would follow for awhile and then just explore).

By the way, part of the decision comes from seeing who is using what. Harvard and Sprint are using Joomla. I'm familiar with both sites. I know that I need to dig deeper in Joomla. Case in point: as modules have side-effects, one has to worry about order and such; these types of nuances are what an expert knows how to handle. For me? I've resolved several of these. Actually, I like the development environment (say, compared to the older ways of using a C++ or other language in a developer's workbench) and the underlying control/interpret scheme.

Aside: Last year, I sort of recoiled from the heavy database use in these approaches. That explains, in part, the fallback to HTML and files. Now, I'm getting a little more comfortable, but I'm also ready to argue the points (we'll get there, truth engineering) of configuration. You see, the package's pitch pushes content. Configuration control support is there. Perhaps, it's covered in the book. We'll see. However, as I've seen with all of this free stuff, you have updates and upgrades always coming at you. How things interplay must be taken seriously if you're in an environment with continual use. Who drives requirements for these things? Is it too much a gadgetry (feature) focus?

After playing with Joomla, I knew that I had to look at other approaches, of which there are several. Concrete5 sort of jumped out. It's a front end for developers, one video said. Here is my little bit (didn't even use a video, as it seemed intuitive). I'll have to spend some time with this. And, also, I'll need to look at others, to be complete.

In the meantime, I'll maintain the current site (static HTML -- blogging all the while), use Joomla as the future, and try to get the Concrete5 example to be like the Joomla. After things become more second nature, then a total redesign might be in order. Right now, I just cut stuff from the static view to the database. Perhaps, thinking about web design issues might be apropos.


01/21/2019 -- If you look at Content Management or CMS at this blog and the one for the Thomas Gardner Society, Inc. (CMS or Configuration), you will see lots of posts. We have a new site where we prove our work: TGSoc.org. Its role is portal but, for now, we introduce changes there, first. See the Discussion page with a link to our devlog. Based upon the direction I hear that Google is going, working this approach for our portal is right on. So, technical will be visible rather than not.

06/20/2016 -- Concrete5 example removed. Broken link in one library (at the ISP) mentioned by http://www.whoishostingthis.com/resources/php/.

05/31/2016 -- Continuation of the theme.

08/02/2014 -- Bit the bullet and updated the site (looks, behavior) using HTML/CSS. Of course, things are still pending, such as membership functions, business, ... We'll get there.

10/19/2013 -- Too many interesting things to get involved in. Is coding for the young? That is, not that the older mind cannot do it; rather, it's fairly mundane compared to other things. To be discussed. Perhaps, winter's onset will push the mind toward the computational.

09/04/2013 -- While poking around more, I'm more inclined to think that the database, as a central feature, is not the great leap forward that some might think. Unless, using such is wedded with structural approaches (to be defined - unless someone can point to an existing discussion). Took a break for awhile, but did an edit in Joomla and in Concrete5, today. The latter keeps the HTML visibly present. So, that might lead one to think that it's coding, perhaps.

07/14/2013 -- I meant to mention other uses of CMS. Wiki has several that are computer related. We can't forget the Navy's CMS-2. Too, software configuration systems were known by CMS. But, there are many non-computer usages, such as Change Management System (related to process management).

07/13/2013 -- Ah, yes. design. Here is someone touting the powers of WordPress.

Modified: 01/21/2019

Monday, July 8, 2013


Entrapment? Yes, we're being overlaid with an insidious veil of supposed intelligence (singularities lurk). Earlier, we mentioned Eric Hoffer, as an example for us (outside the mania of consumerism, for one). Doug, the mouse guy and more, is another (was not like the current ones who are raking in billions daily for getting us further into perdition). We'll be adding to this list (Perelman, Patel, ..., lots of people).


We have people putting computer-based systems into processes without understanding how the two influence one another. Too, in some cases, they don't care as it pulls in the bucks (in the short term, then we have to bail them out). That's one big issue (push out the changes, make your user base adapt -- it's good for them).

Another is a map-territory problem (Google glass is only going to exacerbate the problem). You see, people get to believe that what's in the bits (such as, FB's trivia, etc.) is more real than the being itself. We'll have to go on at length about this type of mismatch. Being will (always does) win, folks, in the end.


We'll be seeing more of this type of thing; when will the populace wake up (suggesting that people with their feet on the ground and hands in the mud know more of this than high-flying execs of these grandiosely motivated organizations).

SSA error
AARP recently wrote of a 93-year-old WWII vet who lost his Medicare coverage when someone in an SSA process updated his birthdate to an incorrect value. Then, they (SSA) had the audacity to tell him that their value was correct AND that he would have to use their value for his birth date.

Naturally, the guy got irritated. As he, and his family, were working with the media (old type, not social - ah, much to say there, too) on a story about the idiocy, SSA discovered their error and apologized (as if that were enough).

Makes one think that we've forgotten GIGO as the "experts" talk about error-correcting processes, robust systems, and such (ah, entrapment, indeed).


All around, people have been making billions by spawning off half-systems (loaded with silly features - started early, yes Gates?) where errors, and losses, are the users' responsibility (has anyone actually quantified the bucks - trillions? - of accumulative losses from these half-witted types?). And, this witless (actually, it's a ploy to entrap) SOP has grown to proportions that are staggering.

Now, the solution going forward ought to be to have bucks devoted to 24/7 watch-dogging databases (say, watching the Equifaxes of the world) with the proviso that an error that can be certified by a human (even by phone) can be corrected (with appropriate notes and traces) quickly (with proper documentation, things can be looked at, and audited, at will - this is more than the financial folks can claim) and much more. There would be continual monitoring (many ways, not just NSA's little paranoia) by the people. Talk about jobs (yes, and this would be trainable -- look, please -- those who argue to bring in talent are looking for several things: no conscience, even less cultural attachment, reduced focus - some call this tunnel vision - so as to maximize pushing out crap, and more - yes, even though there are things like quality certification). There would be many, many jobs (see next for where the money would come from).

That is, all these billionaires are such because proper costs were not considered (near zero) in whatever the situation in which the bucks accumulated (yes, investors are problematic, to boot). In terms of cost, the populace bears the brunt in order for some to be luxuriously entertained (say, workers being screwed - too many ways to enumerate).


Take Equifax, for one. They could very well prevent the computational hell that is being imposed upon us. We'll get into that further, to boot.


07/30/2013 -- The future: economy and technology.

07/12/2013 -- Will wonders never cease? Jon knocks early-lookers?

07/12/2013 -- Comment on FB: Just read in the WSJ of a growing presence of a shadow "Supreme Court" (details to be discussed via blog). Think of it: police, judge, jury, executioner, mortician (and all the other roles) all rolled up into one secretive group(s). Kings and feudal lords (slave owners) come to mind; but, do we have those nowadays? 

Then, we have shadow banking (its size is probably multiples of what gets caught under accounting's purview - Ben knows this). We have banks using hackers to stress their systems (what shadow activities can arise from that? - the Internet did not have to unfold like it did).

We have shadow government (supposedly handled by open-door policies). And, shadow business is cloaked under those leg-irons called non-disclosure agreements that are required for employment (makes working for oneself attractive, even if such causes one's life to border on poverty). The market addicts (those who go gaga (apologies to the Lady G) when Ben goes goo-goo) want their dark pools, and more.

Shadows are everywhere (no conspiracy paranoia intended, or unintended). USA Today talked about the shadow credit ratings (yes, the silly number that we get to see is not it, folks - that is, those who decide your fate look at their own little bit of stuff). The Big-3, and such ilk, need more scrutiny.

Well, shadows are real, to the extent of their natural properties. We have to deal with them. But, the rise of the computer (and its offspring - web, etc.) is raising the bar (for the bad guys - lowering it for we, the people) by changing the context of shadows (poorly understood at this time) in ways only hinted at by early concepts, such as undecidability.

Happy Friday thoughts.

07/11/2013 -- In a sense, Ben is trapped into the expectations of the addicts. Yes, there are several factors involved with this. Poor guy. Yet, for the rest of us, like the savers who are being tortured, we'll have to endure this idiocy for whatever time it takes. Yet, entrapment is making things worse day by day.

07/09/2013 -- How will we get the proper mind set to counter the web that binds? Is it trainable or existential (study dedicated to GEK III)? Some ask if the lack of accountability, or the observed absence of notions of responsibility, are major factors. Yes, there are aspects to this problem that come out of our humanness, worked over and over as generations pile up on life's shore. However, the larger problem is that the presumption behind many worldviews dealing with technology is that we understand more than we actually do. That latter overlaps partly with the concepts behind quasi-empiricism. ... Money, as truth, muddies the water, to boot.

Modified: 07/30/2013

Monday, May 20, 2013

Closer to Truth

In the last post, I mentioned an autodidact (Eric). Those types like to learn (which is not life-long learning, folks) in a self-directed manner. I am going to use him across all of the blogs (for instance, Eric lived simply; he did not have a mansion, large car, etc. -- yet, he is of the type that I've talked about where we could run finance better than it is now). That task is still pending.


Let me, first, re-introduce an important concept (mentioned here many times): quasi-empirical arguments. Wikipedia has a nice write up on the subject, for starters. See this 7'oops7 post on motivation.


even big T' aspects
I ran across "Closer to truth" today and wanted to get it brought to the fore. In one of the videos that I saw, Wolfram talked about his ideas on an oft-debated mathematical topic: is it invented or discovered?

You see, if the former, how could one be other than quasi-empirically oriented? From my own experience and from many comments that I've read, some practitioners (of the teaching ilk) seem to assume the latter which is accessible to special types of minds (which may be partly so - another topic of interest, as Truth is available to all).

One has to pop out to about 7:45 on the video for the quasi-empirical portion (or listen to that point to hear the preparatory matters). I've referenced Wolfram before (other contexts); it's nice that he understands the limits (perhaps, his computational focus brings this to the fore).

I'll be getting back to this topic, in the context of computability in the world, and more.


This is one example video; I expect that there will be more to use (and big "T" Truth is covered, to boot, by the series).


05/21/2013 -- Penrose has a good take on the question. Mathematics, in so far as it's involved with nature, was there long before humans came around. The humans are those who are the inventors of mathematics? Sounds silly, right? Of course, one has to note the different connotations of "mathematics". Humans worked out the language aspect (how the essence is conveyed and shared), no doubt, over a long period of time.

Modified: 05/21/2013

Thursday, May 9, 2013

Eric Hoffer

The longshoreman philosopher. He is on the list of notable autodidacts (a category to which I would like to belong).


This post is going to be put on the three related blogs (Truth Engineering, 7'oops7, FEDaerated [already mentioned in the context of autodidactism]), each post tailored to the context of the blog's theme, indicating a change of tone and of focus. I was in San Francisco during the time (1967-70) when Eric walked the streets, pondering matters, and ran across him from time to time. His most popular book was a nice read but did not resonate with me (those T-issues [Anselm as an example] that cause so many problems; these are best skirted around many times (mind you, did not say ignore entirely)), however, I have always respected the guy and his work.


This itinerant citizen saw more of the American experience than did 99.999% of his cohorts and those of subsequent times.  And, he was 1st generation. It is probably fortunate that he ended up in San Francisco (where the song writer left his heart). I would bet that Eric did not associate much (if at all) with the beat view (which, if you look at it closely, is of the fortunate sons [thanks, CCR] - wait, there I go again [by character, cannot guarantee descending into the mocking stance which has become so prevalent -- why is that?]).

We'll go on about the travels of Eric from time to time. For instance, he spent several years on skid row in Los Angeles. I worked at the Union Rescue Mission (early 1960s; recently, I looked at the website and was saddened to hear that now they're dealing with families - not older, derelict men) for a couple of years.


If the USA is going to mean something, it won't be via military might (if I add, not by that alone, will you keep off the reflexive hero worship thing that forces many to rant interminably?). Rather, this country has many ways to continue (or to try to attempt) to be the "city on the hill" (if only John Winthrop knew). People like Eric [and many other types, but not financial pirates] will be key.

Too, his self-learning was important. How does one test the knowledge of such types? Well, truth engineering will deal with that. To boot, the independent view that we all need [the inverse of this view is the politician whose saliva kicks in at the sight of (or the thought of) a buck (or any other money) -- there I go again, except I'm waiting to see how Elizabeth W. pans out (a note arrived today with the subject that she needs a partner)] was one of Eric's many qualities.


This is only a start. Let's not forget Eric (Jul 1902 to May 1983) and his kind.


07/09/2013 -- Was there a time when father knew, whether all or most of the time? Many sons railed against that, GEK III, for instance. Some sons had absent fathers, who were no more than some ideal without any material substance. Some sons even followed their fathers. All sorts of positions along an axis. However, there is something new, now. An insidious overlay is threatening us; its origins come from advances in prowess that are less understood than those who practice think is so. Have we left serfdom to a feudal lord behind in order to be wrapped in a more dense veil? The key notion is quasi-empiricism and its being ignored.

05/14/2013 -- Capitalism magazine published, 10 years ago, Thomas Sowell's The Legacy of Eric Hoffer.  Praise be the Internet!

Modified: 07/09/2013

Thursday, May 2, 2013


I ran across him today while browsing through Wessex history. Anselm was involved with the discussions related to Edith marrying Henry.

You see, Edith, in this case, is the daughter of Malcolm III (King of Scotland) and his wife, St. Margaret. Edith purportedly had taken religious vows (in a convent) earlier. Could she marry without incurring some type of retributive reaction by the Church?

At the time, Anselm was the Archbishop of Canterbury which is a fairly high-ranking position in England, both then and now. So, he had to weigh in.

Well, Edith argued that her being in the convent was her mother's ruse to keep little Edith from falling into the hands of the Normans (those lusty ones from the continent). So, Edith won her argument, changed her name to Matilda, and married Henry.

Henry? Yes, Henry I of England who was the son of William I and, thereby, partly Norman.


But, that was a little aside (not a Who am I?) and introduction. This post is a start at re-establishing Anselm's importance, which we'll be looking at more fully. One motivation is his upcoming 1000th anniversary. He was born circa 1033. So, we have 20 years to get ready for this.

By then, we'll have had a lot of experience with all of the celebrations coming up that are 400th in flavor (numerous American cities and states), 800th (Magna Charta), and more.

Anselm's "ontological" argument has been scorned by some modern intellectuals who, for the most part, use mathematics as their ruse. And, these modern arguments use a sleight-of-hand to direct our attention away from the less-than-solid basis of their argument.

One of our tasks is to illustrate this problem and to discuss what it means to our future. There has already been a slight start: nods to the quasi-empirical views (several posts).


Does this post open up the big "T" issues? Not exactly. There is plenty of territory (hint, here) to cover prior to that.


10/13/2015 -- "God" as name or predicate? How about Being?

05/21/2013 -- The big "T" issues can come to fore. By the way, "truh" in the file name? Mis-type, of course. But, interesting enough to not correct. Will have to start using the "Preview" feature more regularly (note the statement that these posts use no fancy editor; iterative write/review/edit steps are the key; just like the old days when people wrote on paper). The process pops up the page in a fully formatted manner (nice, blogger.com) that allows the necessary pre-edits (some of which I used to do after the fact - incremental improvement, evolution, ... -- it bothers me when someone shows a final product as if it were obtained by magic (or Deus ex machina); let me see the details and intermediate steps, please).

05/03/2013 -- Either/Or.

Modified: 10/13/2015

Saturday, March 9, 2013

Genetics and truth

It's a nice diversion to consider lessons from the past (Zeno from the far past). We all know how involvement with the present seems to erase the past (or so some think) and give us a better view. Markov (I ought to have identified Andrey earlier; his contribution was to allow us to forget how we got to where we are [all sorts of ways to discuss this] and just use the current state for making our choices) would be appalled to see how his ideas have been mis-appropriated (we could talk about this long and far).
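Andrey's property -- that the next choice depends only on the current state, not the path that led there -- can be shown in a few lines. Here is a toy chain (the "weather" states and transition probabilities are invented purely for illustration, not anything from Markov's own work):

```python
import random

# Toy Markov chain: the next state depends only on the current state.
# States and transition probabilities are invented for illustration.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state, rng):
    """Choose the next state from the current one alone (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights)[0]

def walk(start, n, seed=0):
    """Take n steps; note that no history beyond the last state is consulted."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n):
        path.append(step(path[-1], rng))
    return path

print(walk("sunny", 5))
```

The point of the sketch is exactly the "forgetting" mentioned above: `step` never sees the path, only `path[-1]`.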


But, some progress, that has only been alluded to before, needs to be brought to the fore in order to receive proper attention. We're talking about biology and computing in all of their possible permutations. We have, many times, mentioned that engineers go up against nature, which, then, can lessen the hubris-leaning tendency we find in humans. Or not.

For now, let's just look at genetics and at how it has motivated computational solutions, has hinted at future trends and uses, and has provided a new way to think of some things, including how to handle massive systems of equations more efficiently. All of this is apropos.
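One concrete case of genetics motivating computational solutions is the genetic algorithm: keep a population of candidates, score them, and let the fittest breed the next generation. A toy version follows; the target word, population size, and mutation rate are all illustrative assumptions, not drawn from any post here:

```python
import random

# Toy genetic algorithm: evolve a random string toward a target word.
# The target, rates, and sizes below are illustrative assumptions.
TARGET = "truth"
ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def fitness(candidate):
    """Count positions that already match the target."""
    return sum(a == b for a, b in zip(candidate, TARGET))

def mutate(candidate, rng, rate=0.2):
    """Randomly rewrite each character with probability `rate`."""
    return "".join(rng.choice(ALPHABET) if rng.random() < rate else ch
                   for ch in candidate)

def evolve(pop_size=50, generations=300, seed=1):
    rng = random.Random(seed)
    population = ["".join(rng.choice(ALPHABET) for _ in TARGET)
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        if population[0] == TARGET:
            break
        parents = population[: pop_size // 5]  # the fittest fifth breed
        # Keep the current best as-is (elitism); fill the rest with mutants.
        population = [population[0]] + [mutate(rng.choice(parents), rng)
                                        for _ in range(pop_size - 1)]
    return max(population, key=fitness)

print(evolve())
```

The design choice worth noting is the elitism line: without carrying the best candidate forward unchanged, mutation can undo progress, which is its own little lesson about forgetting how we got here.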

The following are links to material that is relevant and that we'll get to.
--- added 05/03/2013 ---

To finish this off, let me add a brief comment about something read recently. You see, someone did the math to figure out when life would have begun, using known information about rates of state change. Think of the result in terms of how far back in the past it lands. Well, the amount of time was beyond earth's age. The net effect? It makes us think of seeding from space as being tied to ourselves, and more. We'll get back to this, as truth has a biological basis, in part.
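The back-of-the-envelope referred to above runs roughly like this: assume complexity doubles on a fixed interval, then extrapolate backward from today's value to when complexity would have been minimal. A sketch with placeholder numbers (the doubling time and complexity figures here are invented for illustration, not the published ones):

```python
import math

# Back-extrapolation under assumed exponential growth of complexity.
# All numeric values below are illustrative placeholders.
doubling_time_my = 350.0   # assumed doubling interval, in millions of years
complexity_now = 1e8       # assumed complexity measure today
complexity_start = 1.0     # minimal "seed" complexity

# Number of doublings needed to get from the seed to today's value,
# then the total elapsed time that implies.
doublings = math.log2(complexity_now / complexity_start)
origin_my_ago = doublings * doubling_time_my

print(f"{doublings:.1f} doublings -> origin ~{origin_my_ago / 1000:.1f} billion years ago")
```

Even with these made-up inputs, the extrapolated origin lands well before earth's roughly 4.5-billion-year age, which is the shape of the argument the post alludes to.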


05/03/2013 -- Added in a link to Andrey's Wiki page, plus a little coloration. Yes, we have much to thank the guy for; but, there are side-effects to consider, too. For one? I was reading in the WSJ about suppliers bemoaning how some modern thinking has pushed problems upstream (that is the pure truth). Of course, those who are doing the modeling behind this stuff think they're geniuses. Too, think of all those who suffer (yes, iPad slaves -- need to find the link for this) because we push the bad stuff off to other people (off-shoring, okay?).

Modified: 05/03/2013

Sunday, January 20, 2013

Zeno, in the modern context

It's common knowledge that the modern world knows things through computers. This is true from the most recent phenomena (noses pressed to smart phones parading as intelligent behavior) to the wide expanses of cosmology's modeling of the heavens and its exploration of multi-universes as an explanation, of sorts. In between, we have the IRS's use plus business computing, such as design, planning, and a number of other things.

So, where does one go to look at issues related to such knowledge? The ACM is a good start. Say, their Communications of the ACM. Then, we have a whole lot of other folks, such as IEEE, IJCAI, and such.


The following is motivated by a viewpoint, expressed by Phillip G. Armour, in the ACM. His article is titled: The Business of Software: How We Build Things (in paper; slightly different on-line). There are two things to mention here, though the article ought to be read.

He uses Zeno in talking about what I had called Earned Value. I only used Zeno once (on Fedaerated) across several posts on three blogs. Why? I had talked about this with my colleagues on many occasions. It seemed that referencing the guy was more useful in person, as then one could get off on the peripatetic issues.
Zeno, Veritas et Falsitas

Why Zeno? He's the guy of the arrow. Or, as the joke goes, the mathematician who doesn't get the girl. So, Phillip asks: why do people guess that they're 95% (or some such number) complete on a task, as if they're monotonically approaching completion with no end in sight?

Phillip laughs it off. I don't, as it was a regular occurrence when we tried to assess completion of a project with lots of people and oodles of modules. Nowadays, it's not an issue (say, with Zuck's stuff), as they can just push out system changes (with a recovery method, hopefully, to use if things go bad) without regard to testing status. This is not true for other parts of the business world, say, the 787 (even a most-specified test plan will still leave room for judgment calls -- we'll get to that).
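The Zeno flavor of the perpetual "95% complete" report can be sketched numerically: if each status period closes only a fixed fraction of the remaining work, reported completion creeps toward, but never reaches, 100%. A minimal illustration (the halving rate is an assumption for the sketch, not anything taken from Armour's article):

```python
# Zeno-style progress: each report closes half of the remaining work,
# so the reported percentage approaches 100 but never arrives.
def zeno_progress(periods, rate=0.5):
    """Return the reported completion fraction after each period."""
    done = 0.0
    history = []
    for _ in range(periods):
        done += (1.0 - done) * rate  # close a fixed share of what's left
        history.append(done)
    return history

reports = zeno_progress(10)
print([round(r * 100, 1) for r in reports])
```

After a handful of periods the report parks itself in the high nineties, which is exactly the pattern of those status meetings.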


Phillip's use of Zeno gets me thinking that I need to bring the topic forward again.

First, though, a useful exercise would be to gather all of the posts, for each of the blogs, that dealt with the subject of earned value. For each blog, I have a list of posts that include the term. Below are the counts of posts with the term, followed by a few of the important posts.

        Fedaerated (18)        7'oops7 (41)        Truth Engineering (20)

Now, for this blog, all such posts were in 2009 and before. That sort of indicates the shift to looking at finance. Engineering worries about things like this. Finance seems to have a short-term view: the content of the current day's pocket. That is idiotic, pure and simple.

So, we'll bring this subject up to date and relate it, as it ought to be, to fair value.

A sampling of posts follows:
  • Minsky anew (Apr 17, 2009) -- There are many types of speculation, including projections of when something might be done. This applies more to fair versus earned, sometimes. 
  • Value and truth (Jan 12, 2009) -- Value and its determination seems to have been given short attention as if it's a resolved issue (like the downplaying of risk management's complications?). 
  • Effort and truth (Oct 10, 2007) -- Moving along a value line takes effort; yet, modern bookkeeping and modeling seem to suggest otherwise. That is, cook the books to get what you want to see. 
  • Measuring progress (Sep 19, 2007) -- The blog started with an engineering focus, but, by the time of this post, there were murmurings of financial idiocy which burst out a little later. So, there is a flavor of both earned and fair in terms of value.  
  • Complicated or difficult (Sep 3, 2007) -- We have issues of depth and breadth. The former can seem boundless as they are more mathematical than not. The latter seem easy as we throw database technology and computational power at them. Both contribute to the problems. 
There are a lot more posts to look at. We'll get back to that.


Phillip used some mathematics to show the problems related to knowing where you were with a project (the managers, like kids, say: are we there, yet?). Nice article.


03/03/2014 -- Acknowledgements, including math pedigree, will be expanded. -- 

Modified: 03/03/2014