Wednesday, January 15, 2014


As we said in an earlier post, Big Daddy Data, everyone is hot on analytics. These are framed computationally, as you all know, teasing little gremlins out of the tracings that we leave of ourselves. Okay?

Much ado about nothing, to boot. The commercial analytics are the clearest way to perdition, people.


So, the other day, in a conversation on a Sunday (which gives me the right to be philosophical; actually, I don't need any permission, motivation, or whatever - it's always there), a conversant noted that it's interesting how I can relate "analytics" (of the IT milieu, of course) to philosophy. Now, in the guy's favor, I was broaching the topic of truth engineering.

Say what? I thought. Then, I said, you know, philosophy was into analysis way before the latest craze, which is the result of oodles of years of work by many people, all culminating now in idiots misusing knowledge (say it isn't so - MBAs are the worst of the bunch).


Earlier, I wrote about algorithms in the context of apps (a big load of stuff to weed through and, potentially, a big source of errancy - in terms of opening oneself up to manipulation). The term is used loosely now. Of course, within the analytics framework, it does have a mathematical basis.

But, and it's a big "but," tell me, analytic'ers, what say you about the quasi-empirical (to be discussed) issues? Well, that will be one focus henceforth. First, we bring up the topic in this discussion so that there is recognition of its reality. Then, we look at the issues and at how they're important to our computational future.

Lord knows that the term has been used continuously since the beginning here (44 of 200 posts) and in other blogs (FED-aerated - 25 of 211, 7oops7 - 38 of 248). Mind you, it was my oversight not to go into it further, just as I assumed too much about near-zero's recognition (still need to address that in depth). At least, these topics are sufficiently complicated to keep me busy for a while.

Remarks:   Modified: 02/10/2014

01/16/2014 -- The recent Communications of the ACM had several stories on big data. Their claim is that the loads of data collected within the past five years or so are a sufficient set from which to make claims. Well, actually, the idea is to generate predictions, thereby getting a slap on the back from science. However, all sorts of things come to mind which I'll get into. First of all, that parallel universe of data that comes along with internet traffic and just plain use tells us what? No matter its size, or the duration over which it is collected, the stuff by no means describes a person except for some small aspect of that person. It does not subsume the being. And, even if someone is wrapped by the collection and analysis of this secondary data, it's not real. But, more on this. ... And, mathematics does come into play, misused (ah, the worldview of the MBAs gone mad).

01/16/2014 -- After starting the above example, complications started to lurk that we could ignore for a while, but, at some point, these would have to be addressed. Say, after a few items were sold by the one who had them first (an arbitrary boundary situation, here), those who bought would look at how they could, perhaps, make more than illusory money by selling at a higher price. The value determination would then become some type of functional problem, bringing in difficulties that have long been the domain of the mathematicians (the ultimate abstraction'ists, somewhat; but analytics would be involved). While looking at pedagogic material that would be of interest, I ran across this web site (Intuitive guide to exponential functions). I have not read it yet, but the number of comments that have ensued, plus who commented, got my attention. Too, John von Neumann said that we don't understand mathematics (of the higher-order types); no, we get used to it. However, there must be an intuitive aspect if we are going to appeal to truth and people's place in its determination. So, that usage resonated with my thinking. Too, though, we have a class'ist split that is happening under our noses. In one sense, it relates to numeracy. But, the more insidious part deals with overlays (computationally enhanced) becoming more real than the reality itself (we'll get into that ad infinitum).
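The exponential character of that repeated-resale scenario can be sketched in a few lines. This is purely illustrative (the function name, the markup rate, and the prices are my own assumed values, not taken from the guide mentioned above): if each resale adds a fixed fractional markup, the price is an exponential function of the number of resales.

```python
def resale_price(initial_price: float, markup: float, resales: int) -> float:
    """Price after a number of resales, each adding a fixed fractional markup.

    Illustrative only: price_n = price_0 * (1 + markup) ** n.
    """
    return initial_price * (1 + markup) ** resales

# Assumed example: an item first sold at 100, resold 5 times
# with a 10% markup at each step.
price = resale_price(100.0, 0.10, 5)
print(round(price, 2))  # 161.05
```

The point of the toy model is the intuition: the growth compounds on itself, so the "value" quickly detaches from anything grounded in the original transaction, which is where the illusory-money worry comes in.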

02/10/2014 -- Put a comment at this post: of-g-o-d-and-god-concepts.html. We said that we would raise meta issues (as in, big T) at some point. It's been two years since we had the first of a sibling collection depart.
