Friday, September 28, 2012

Hint, heuristic, algorithm

Context: See Tru'eng, a new focus going forward, mathematics.


We were wallowing in the muck (it's an election year, so give us a break) wondering what could pull us out of the morass (we do need to get our tops up and spinning). Well, those with oodles of bucks, unless they are extremely careless (read: stupid), find it easier to remove themselves (not exactly, but that's a big T 'truth' issue) from the mess (they leave the crap to others). And, big bucks now are wandering after 'apps' which are oriented, for the most part, toward personal devices (mobile ubiquity).

Aside: Most don't enjoy that opportunity. Someone used baseball to characterize those with large pockets, in the following sense. They categorized the person's success by their starting position. Ellison, of the relational thing, was from the batter's box. Some start from first base. Others already had a home run at birth. You know, most are not even in the ball park. Z of FB was on one of the bases.

So, big bucks HOPE to make more by chasing after dreams (illusions -- as in, chimera). There may be technical aspects to the process that brings things forth, yet one would think that appealing features are more the focus than outright computational prowess. Calling something an algorithm does not make it so (below).

Sidebar: other issues that we'll discuss: A. 3 years with Vista and no BSOD -- only 2 weeks with this newer thing - Windows 7 - and I already have one; B. using the recent version of a program from a famous company, I kept seeing the thing go back to the initial screen. Well, it turns out that this is the default error mode (more friendly than the BSOD?), but it does the unwind without leaving any hint (see below) of what was wrong. It's like going back to square one and starting over (ever heard of Sisyphus?). But, getting a top to spin is not trivial. C. ...


So, let's look at this by using the three concepts from the subject line. They can be thought to have a progressive association somewhat like the following describes.

  • hint -- We can view this several ways but consider a math problem. Many times, there may be a hint about how to approach a solution. One could say that a hint is sufficient to the wise. Unfortunately, computers are not wise and need explicit instructions. That, folks, is both the boon and the bane. It's the former since we can (try to) know what went into some computation (only under certain restrictive assumptions dealing with things like side-effects, black-boxes, etc.). Another way to think of a hint is that it lays down markers, within a landscape, whose presence helps keep things moving toward an end. Let's face it. Smart folks do not like to follow explicit instructions more than once (why then do we put people into the position where they are, for the most part, automatons? --- stupid, from several sides). You know what? We cannot give a hint to a computer in the same way that we can smart folks (forget the so-called smart devices - dumb-ass'd things, people). Except, if a person is in-the-loop during a computational event of complicated nature, the person can nudge the system away from divergent tendencies (yes, we'll expand at large about this being one way out of the quagmire -- suggesting, of course, human talents not yet developed).  
  • heuristic -- We could say rule-of-thumb, but it's to mean more than that limiting view. A heuristic view can be fairly powerful, approaching, somewhat, the 'nudge' mentioned in the prior bullet. Why don't we hear people talking about this, rather than the loud pronouncements about their algorithms? If anything, any decision logic would need to be heuristically based (using real data and analysis thereof) in a real world situation. Developments after Hilbert's question about a program to do all mathematics (Mathematica, and its ilk, notwithstanding) suggest such. 
  • algorithm -- There is no agreed-upon definition. But, one strong notion is that an algorithm has (ought to have) more rigor behind it than not. Now, looking at the various characterizations (thanks to Wikipedia editors) can be interesting. Knuth's view probably best fits my experience. Namely, we have this: finiteness, definiteness, input, output, effectiveness. Let's look at those. Finiteness. Some argue that we have this by the very nature of our domains. Not so. I've seen too many things loop (every day, operators get lost, for various reasons, on the web, which requires redo). Combinations of non-finite spaces can look extremely large from certain viewpoints. Definiteness. Unless there are explicit user requirements, there'll always be a problem here. Convergence, through interest and use, may be the modus operandi, yet it approximates only. Input: thankfully, menu interfaces inhibit problem emergence. But, just a few days ago, someone was trying to show off his new device. And, he started his dialog as if a 'smart' person was on the other side of his Shannon experience. Hah! The thing barf'd. I've seen this too many times, folks. Output: Crap, just look at mangled messages out of supposed smart fixup and fat fingering. It's almost epochal. I've seen too much effort by people to demonstrate that their smart device is such. We dumb ourselves down, folks, in order for this to happen (many, many ways). Effectiveness: As measured by what? Showing off to others? I'll be impressed when we see some type of verification process in place (no, not just testing). However, this is a hard thing to do. 
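To make the contrast in those bullets concrete, here is a minimal sketch (my own illustration, not anyone's production method): an exhaustive subset-sum search that meets Knuth's criteria of finiteness, definiteness, and clear input/output, set against a greedy rule-of-thumb that is fast but offers no guarantee.

```python
from itertools import combinations

def exact_subset_sum(weights, target):
    """Exhaustive search -- an algorithm in Knuth's sense: finite
    (at most 2^n subsets), definite (every step specified), with
    well-defined input and output."""
    for r in range(len(weights) + 1):
        for combo in combinations(weights, r):
            if sum(combo) == target:
                return list(combo)
    return None  # provably no subset sums to the target

def greedy_subset_sum(weights, target):
    """Heuristic -- a rule-of-thumb: grab the largest weight that
    still fits. Fast, but it can miss a solution that the
    exhaustive search is guaranteed to find."""
    picked, total = [], 0
    for w in sorted(weights, reverse=True):
        if total + w <= target:
            picked.append(w)
            total += w
    return picked if total == target else None

weights = [5, 4, 3, 3]
print(exact_subset_sum(weights, 6))   # finds [3, 3]
print(greedy_subset_sum(weights, 6))  # grabs 5 first, then fails: None
```

The greedy version illustrates the point about heuristics: it commits early to the 5 and can never recover, while the exhaustive version pays a combinatorial price (that "extremely large" space, above) for its certainty.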

Reminder. See my remarks about Mr. Jobs' little demos from the earlier days. My thought still stands, despite IBM's success on Jeopardy. 

Aside: Don't get me wrong. I love that the software frameworks have evolved immensely. It's nice to open the covers and see the progressive build upon what one hopes is a sound foundation. But, it took oodles of time, effort, and caring work to get here. How do we try to show the extreme tedium, and angst, that has been the case from the beginning? You know what happened, folks? The tide was turned so that computation can be faulty without any user kickback except for not buying. Or complaining? And, the vendor saying: sorry. HOW many millions (billions) of dollars of work have been lost through errors that might be of an egregious type if we could get under the kimono to know for sure? Well, losses are not finite for several reasons (cannot be counted, or accounted for, and the secondary effects are not measurable). 


Oh, who with money (yeah, venture people, you) is even interested in the longer term problems? 


01/23/2015 -- Software? Well, we are talking more than apps (latest craze). We are dealing with fundamental questions which, then, give rise to normative issues in mathematics (and, by extension, to the computational).

01/05/2015 -- See header.

03/23/2014 -- SAT solvers as an example of large class of heuristics.
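Expanding on that remark with a sketch (my illustration of the general idea, not any particular solver): a toy WalkSAT-style local search shows the heuristic character plainly -- it may stumble onto a satisfying assignment quickly, but when it gives up, it has proved nothing about unsatisfiability.

```python
import random

def walksat(clauses, n_vars, max_flips=10000, p=0.5, seed=0):
    """Toy WalkSAT-style local search. A clause is a list of ints:
    literal k means variable |k| is true if k > 0, false if k < 0."""
    rng = random.Random(seed)
    assign = {v: rng.choice([True, False]) for v in range(1, n_vars + 1)}

    def satisfied(clause):
        return any((lit > 0) == assign[abs(lit)] for lit in clause)

    for _ in range(max_flips):
        unsat = [c for c in clauses if not satisfied(c)]
        if not unsat:
            return assign  # found a model
        clause = rng.choice(unsat)
        if rng.random() < p:
            # random walk step: flip any variable in the clause
            var = abs(rng.choice(clause))
        else:
            # greedy step: flip whichever variable leaves the
            # fewest clauses unsatisfied
            def cost(v):
                assign[v] = not assign[v]
                bad = sum(not satisfied(c) for c in clauses)
                assign[v] = not assign[v]
                return bad
            var = min((abs(lit) for lit in clause), key=cost)
        assign[var] = not assign[var]
    return None  # gave up -- no proof of unsatisfiability

# (x1 or x2) and (not x1 or x3) and (not x2 or not x3)
model = walksat([[1, 2], [-1, 3], [-2, -3]], 3)
```

Note the asymmetry: a returned model can be checked, but `None` just means the heuristic ran out of flips.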

02/25/2014 -- Put in a context link, up top.

07/20/2013 -- So, today, found out that a new handle for algorithms is algos. Okay.

06/14/2013 -- Moshe's editorial at the Communications of the ACM discusses algorithms. Comments, and subsequent letters responding to Moshe's thoughts, indicate the wide range of opinion in the matter.

05/02/2013 -- This has been a popular post (second most popular), of late. Perhaps, it's the growing awareness of the ever-increasing gap between what we think that we know and what we actually know. Well, the theme of the blog is to look at these types of problems. Perhaps, we can bring something forward from the past (such as, our not learning Anselm's message). I mentioned it somewhere (I'll track it down), but the motivation for this post was in part hearing young guys brag about their algorithms in a public place (of course, weekend situation, so some latitude is allowed). That concept is bandied about these days without people giving the notion its proper respect.

10/05/2012 -- Related posts: Zen of computing (in part), Avoiding oops.

10/05/2012 -- IJCAI has a newer flavor. It, at one time, had a theoretic thrust, mainly. Need to re-acquaint myself with the conference (attended: 1987, Milan, Italy; 1991, Sydney, Australia; 1993, Chambery, France; 1995, Montreal, Canada). This talk from IJCAI-2011 relates to the theme of this post: Homo heuristicus.

Modified: 01/23/2015

Tuesday, September 18, 2012

Celebrating our frailties

In another context, while discussing Ben's largess to business (gamblers) but not to labor (or savers) -- though he claims his rolling out the dough is to create jobs -- the notion of someone being perfect came up. Essentially, to err is human. The corollary is true, too: to be human is to err (even for those high-falutin' types that claim to be divine, et al).

To try to be perfect is not human; it's machinations in action. As in, someone doing something 'perfectly' does not mean that the person is perfect. No, that sort of thing is not much different from equating the map with the territory (a real problem nowadays, with ubiquitous computationally motivated intrusions in our lives).

So, does this mean that we cannot be better, etc. (or strive to be so)? As in, practice makes perfect? No, as systems (and role playing - with effects, of course) are the things that we can have approach perfection. People, while in a system, can get better, to the extent to which they can overcome human limitations. As in, roles and essence are two different things.

But, do we expect that systems are perfect? Well, we do trust them a whole lot more than we ought to. See the comment on the main page about computability, and problems thereof.

However, we can change the context to that dealing with expectations and realizations. As in, we can get close to fault-tolerant states, even if that comes about from error-fixup techniques that are, undoubtedly, clever.


Why do you think that the borg (to wit, Star Trek's type, et al) idea is so strong? People sense that we can have a symbiotic relationship (with what?) that is uplifting. But, too, we can descend into slavery (ah, financial indebtedness is just that).


The corollary thought is that 'labor' pertains to a class of people. Whereas, our progress rests upon us seeing that 'labor' relates to roles which are attempted (or fulfilled) by people. No matter that most who descend to this type of activity do so with little choice. Not all people who work with their hands are incapable of more advanced challenges. For some, it's a matter of choice.

Perhaps, that choice is seldom made nowadays. Not true, though, from my experience. One has to listen with the right ear to hear intelligence covered with the detritus from having mostly gross experiences from life. Some never get to shine themselves up. Yet, their potential is there.


As an aside, to the young mucks. The thing of having succeeding waves of people entering the market with the latest knowledge running things is part of our problem. Actually, a very large part. That sort of thing is a recent phenomenon, brought on by advances in technology and computation. So, its analysis can be done now, since we have had several downturns in recent times for people to recognize the problem, if it is expressed so as to be obvious (the stench goes all the way to heaven).

Expect some more attention to this theme; too, proposed ways to handle things better will be in the offing. Why not?


09/28/2012 -- One type of hope.

09/21/2012 -- This discussion ties into 7oops7's bailiwick (see Remarks 09/21/2012). We know that we cannot, too much, celebrate our faults. Why? Descent into the quagmire. Oh, wait! Being concerned with the faults of others has the same downward spiral. I have to admit that we need to find a way out of the mess (will it happen in the U.S. any time soon, given how screwed up election dynamics have become, partly due to the Court's claiming the fiction that a corporation is a person and that its money is free speech?). Virtues come to mind as essential. Yet, many argue not. From whence, then, we have to ask them, come the motivators (lifters)? Hope is one of the virtues. We hope that we do not get pulled into some type of downturn. We just went through one that was caused by those (see prior Remarks) who could run amok yet have not learned their lessons. Ben saw to that with his largess. Well, it's Friday; we'll get back to this later.

09/20/2012 -- It's imperative that I mention the best and brightest. Yes, we've talked of them before. Here is another definition. They might be considered as those whose arrogance causes them to see themselves as perfect. The real tragedy is that these types have always mucked up, and seem to continue to trash, the common world shared by all of us (now, and our progeny in the future). Too, those whose attributes are less stellar (by what is their ascendancy measured anyway? ..., very short term looks? ...) have to clean up after these types whose crap falls on the world from leaks in their diapers.

09/19/2012 -- Rick starts out: perfect memory.

09/19/2012 -- Let's say, for now, that perfection, like beauty, is in the eyes of the beholder (Wiki has a nice historical view of the concept's breadth of use). See the 09/19/2012 Remark (Federated).

09/19/2012 -- Any sense of 'perfection' would require some way to judge whether something has attained the state, or just is such. All sorts of issues lurk about. So, let's look at the role aspect, for now. We'll get back to the need for 'ideals' in this matter. That, too, raises considerations for us to look at. In terms of sensors and actions, one might think about adaptability as being an indicator. Yet, we all know about the pejoratives cast'd at the person who is good at playing the chameleon. Thinking of faults, are they not abundantly clear? Oh wait. Much effort goes to covering these up, not unlike our clothes hiding all sorts of imperfections (for the most part). So, we're not far from the core issue of truth engineering. How do we know? Even if we can (are allowed to) see under the kimonos, do we know what exactly we're expected to see or even how to do other than react? One approach might be to look at what is considered perfect by category. And, then identify things that are almost perfect in that case, to wit ballplayer, surgeon, and much more. Some might even be 'perfect' in several roles. Others may have few such qualities. Ah, that is nudging up against an important subject, to be discussed later (per usual).

Modified: 09/28/2012