A certain amount of debate has already swirled around the issues raised - and the details not given - in the position paper, The emperor's new clothes -- an outsider's perspective, that I have submitted to the OOPSLA'96 Workshop "Business Object Design and Implementation II", chaired by Jeff Sutherland.
I thank everyone who has been involved in various ways so far, and especially Jeff for his initiative, open-mindedness and hard work, as well as Ralph Johnson of the University of Illinois at Urbana-Champaign and Andrew Watson of the OMG for our long and sometimes quite technical email exchanges. Thank you, Andrew, also for your coda explaining how the OMA was inevitable at the time of its evolution.
This rather selective "FAQ" gives some provisional answers to some of the questions as I have rephrased them from the above and other sources. Any further questions or comments are of course welcome and invited.
I am stirring controversy and expect to be spattered in return. So far I have personally found it all most usefully instructive, though I hasten to add that my architectural viewpoint remains firm:
Since we all place practicality at the top of our list of priorities, let me repeat this snatch from the paper: "... considering the unreadiness of the whole OMG architectured toolset (I've no doubt that, d.v., Metaset/MACK will easily beat them to it!), it's surely not too late for the OMG Titanic to contemplate a change of course, difficult though such moves always are."
As for the credibility of my claims about Metaset and MACK, well, yes, of course, it's difficult at this stage! But please judge that aspect after you have read this FAQ.
I apologize if the nature of my approach seems ill-suited to the bureaucratic niceties and constraints in which Standards inevitably "live, move and have their being". But as an outsider, and trying to be a head-down programmer at this stage, I can't afford such luxuries right now. As a businessman I have chosen a different short-term tactic, which is to go over the OMG's heads and appeal to you, the reader, as a representative of the market.
To summarize: I am hereby inviting entrepreneurs, managers and technicians to put together a plan to buy into my existing project, help speed up the implementation of Metaset, promote the adoption of MACK and - together, hopefully, with the OMG - launch its evolution into a fully standard ACK. (See question 11.)
In my answers below, your concentration will frequently be tripped up by my rushing off on apparent tangents, often autobiographical tangents at that! I blow a few personal trumpets, but how else is one to sell a hidden product, if not with some reference to its background? So please bear with me.
Those reflections are all relevant threads in the interwoven fabric of a very extensively coherent story. Please persist in trying to see that coherence. In the end it is quite cogent. (My paper is also a not entirely bad - even though extremely dense - example of such interlocking coherence, and I would recommend another reading of it after you have studied this FAQ.)
Since Metaset is not ready for launch, and I don't want to release all the details of MACK just yet, I can - at this stage - only sell the project on the basis of that broad coherence, including the coherence between the enormous market needs and the way this proposed architecture has evolved over many years.
Further credits are given in context below, but don't worry: MACK bears little resemblance to the usual product of a committee! On the other hand - fortunately too! - others will contribute further in future (see questions 8, 9 and 11), even before the open market takes it further.
If I haven't succeeded in conveying those coherences to you, and that broad drift of my strategy does not come across at all convincingly, don't despair ... either of me or yourself! However and whenever it comes to fruition, Metaset as an information medium will make it much easier for others as well as myself to learn from you and share with you fully. It was, after all, designed to help people simplify complexity together -- that being the main and recurrent theme in the paper and in the questions below.
Meanwhile, I permit myself the luxury of perhaps preaching only to the converted: I know that The Mainstream will in the end seem like old hat to a good number of people out there (See question 3). It is they whom I want to link up with at this stage (See question 11).
They might take it as a compliment! I almost totally ignored other related de jure or de facto standards bodies such as the ISO, X/Open, ANSI or Microsoft, which in my relative ignorance I regard as being more securely ensconced on their respective branches of the Charybdian figtree (For more on Scylla and Charybdis, and further to the paper, see item (e) at the end of question 2).
The OMG is more susceptible to pressure from stakeholders such as business, software designers and end-users (or so it seems to me, though I'm very open to correction). They appear more open about their standards process, and better organized in their dialogue with their stakeholders. Considering the short period of time they have been operating, they seem to have achieved far more than the others.
Certainly, I have personally found them to be surprisingly accessible, open and cooperative ... albeit with some understandable hesitations when confronted with my unconventional ways!
Anyway, I have now embarked on rather roughly attempting to get them to turf their OMA out and adopt MACK instead. At least, my paper was the beginning of my undermining of public confidence in the whole CORBA scene, and the little start to my attempts to re-orientate the thrust of the IT industry's Business Object and other generalized marketplace-facilitating standards expenditure.
The present background is most propitious for such a titanic exercise. The OMG's whole standards-definition process is still so glaringly incomplete, and the OMG's own eyes are so open (See, for example, their document 960806.txt, which is the minutes of a recent Object Model Sub-Committee meeting), that I believe it should be quite clear to all concerned that they are so far behind MACK (as I claim it to be...) that they cannot help but seriously consider it as an alternative and far more complete, powerful yet simple architecture.
* * *
In the meantime, there is too much unquestioning faith in OMA/CORBA, the glorious underclothes of the otherwise naked emperor. Since we in IT tend to pride ourselves on being rational people, I shall provisionally overlook Medawar's Dictum (and many other similar wisdoms, e.g. as mentioned in item (a) under question 2), and urge the OMG's stakeholders to push them to take up the challenge in the paper under the heading "Some innocent questions [...]". The OMG architects should be able to point in a clear technological direction through the complexities I mention.
And then the market must demand to see the rest of the attire! (See the third bullet in the Introduction to this FAQ.) I am sure it will be found that the CORBA "foundation garments" are totally incompatible with an appropriate full styling.
Here's another angle from which to view them. In my paper (under the heading "MACK helping the OMG get back on track") I only briefly alluded to the technicalities behind this paragraph:
"With the benefit of MACK-based hindsight, one can see how both OMA/CORBA and COM/OLE are irretrievably misshapen by their common history: conventional procedural programming spawning RPC hence "interface inheritance", thoroughly confusing algorithm time and real time."
Without getting into the detail of it (some of which is touched on under questions 4 to 6), try this as a provisional mental exercise: Object behaviour as defined by a call interface can only have meaning in algorithm time. Workflow, now coming to the fore so strongly, is inextricably part of real time. How can the two possibly meet cleanly (for meet they must, despite the occasional valiant attempt to separate them)? Behaviour as conceived-of during design for then-conceivable realtime scenarios is prima facie incapable in general of providing for all possible realtime scenarios in a complex and changing world. One might try refining the algorithm time granularity of behaviour to below that of real user time. But what would that do to the notion of a "Business Object" or component that must have a far higher-granularity "business meaning", and in a transparent and flexible way, as a basis for "plug and play"?
(Incidentally, that exercise helps one understand why RPC (which includes CORBA/IDL-based operations) is really only good - as many observers have already noted - for the construction of relatively hard-coded client-server applications developed under one developer roof, and not for the multi-vendor component and product market that we're all aiming to facilitate in a plug-and-play way. And why COM/OLE works okay, sort of, for desktop controls but not for large applications ... Microsoft at present standing out for their apparent desktop-bound inability to see beyond the "naïve realtime" mentioned in my paper and under question 3 (Find "Anaheim" below).)
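To make that mental exercise a little more concrete, here is a deliberately crude sketch in Python - entirely my own invention, with hypothetical names throughout, and making no claim to represent CORBA, COM or MACK themselves - of the contrast between behaviour frozen into a call interface (algorithm time) and behaviour as response to events arriving in world time (real time):

```python
# Hypothetical illustration only: "AccountRPC" and "AccountWorkflow" are
# invented names, not part of any OMA/CORBA, COM/OLE or MACK interface.

# "Algorithm time": behaviour is a call interface. The caller blocks, and
# the whole interaction must complete within one procedure activation.
class AccountRPC:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        # Every scenario must be foreseen at design time: anything not
        # expressible as a single call/return is simply absent.
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return self.balance

# "Real time": behaviour is a response to events in world time. A request
# the world cannot yet satisfy is parked, not failed; new kinds of events
# can later resolve it without reshaping a frozen call signature.
class AccountWorkflow:
    def __init__(self, balance):
        self.balance = balance
        self.pending = []   # requests awaiting a changed world
        self.log = []

    def on_event(self, event):
        if event["kind"] == "withdraw-requested":
            if event["amount"] > self.balance:
                self.pending.append(event)   # park it; real time flows on
                self.log.append("awaiting-funds")
            else:
                self.balance -= event["amount"]
                self.log.append("withdrawn")
        elif event["kind"] == "deposit":
            self.balance += event["amount"]
            self.log.append("deposited")
            # Revisit parked requests now that the world has changed.
            still_pending = []
            for req in self.pending:
                if req["amount"] <= self.balance:
                    self.balance -= req["amount"]
                    self.log.append("withdrawn-late")
                else:
                    still_pending.append(req)
            self.pending = still_pending
```

The RPC-style object can only fail (or block) when the world is not ready at the instant of the call; the event-driven object lets real time flow past it and catches up when circumstances allow. That, in caricature, is the qualitative time-mismatch I am pointing at.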
There surely is some OMA answer to such an abstract exercise ... such is the wealth of logical possibility in an edifice of the OMA's generality. But I don't see it, and I also have a persistent abstract-aesthetic feeling that there can never be a good one. Over to you. Remember, it is an issue that is absolutely crucial to the very possibility of the OMA's eventual success (and COM/OLE's too). A hint for OMA-proponents: you will also encounter this qualitative time-mismatch problem if you dare to address the "innocent questions" recalled above.
If on the other hand you don't see that problem at all and then still stick with the OMA's and COM's interface inheritance, then - in my biased opinion - you are going blindly into the future.
Is it better to go blindly with MACK? Yes, it is better, and no, it isn't blindly, for MACK has a simple answer: it knows only real time, and any algorithm time is totally subservient to it. See also question 6 on how MACK's realtime is realtime with a difference.
My tactic was to create an immediate impression that this was a paper with a difference. And to inject some levity into it, what with all my heavy words and thoughts. I also wanted to take the mickey out of the OMG, to show that I was unimpressed by their heavy claims too. We need an irreverent and sceptical attitude all round, a willingness to start again ("evolution well needs a good rout"), despite all the usual hazards of attempted so-called revolution. (The Mainstream and MACK have helped me see that it's safe in this case!)
We will continue to need such an attitude (See question 8).
More solidly, there is always more to fairy tales and nursery rhymes than meets the eye. Here is some explanation, and details for those unfamiliar with the traditions of my own childhood:
(a) "The emperor's new clothes"
The title alludes to the story of the emperor showing off his new robes and being congratulated by his fawning courtiers, until a child observes innocently: "But where are his robes?"
"... it is sales that count in the fashion world." (from the end of the first paragraph): Decision-making under uncertainty, which is unavoidable when a complex situation forces one to take action, tends to fall back on superficial or emotive judgment. Then, as usual - for we are generally practical people - we tend to accept the consequences of our choices and live within them, however ridiculous the robes or limiting the system.
But refutation is difficult (cf. Medawar's Dictum again, or the lengthy advice to Odysseus not to even try to take up arms against Scylla, or Jesus' advice to his disciples to leave cities where they are not welcome). We have many psychological defences against refutation of our choices. We are often imprisoned in our own words, being unwilling to look at their other possible meanings, and certainly unable to consider all their implications.
Remember mathematician Lewis Carroll's tight explanation, in Through the Looking-Glass, of the old rhyme: "Humpty Dumpty sat on a wall, / Humpty Dumpty had a great fall; / All the King's horses, / And all the King's men, / Couldn't put Humpty together again." Why did that beautifully egg-architected creature have a great fall? " 'There's glory for you!' 'I don't know what you mean by "glory",' Alice objected. 'I meant, "there's a nice knock-down argument for you!"' 'But "glory" doesn't mean "a nice knock-down argument",' Alice objected. 'When I use a word,' Humpty Dumpty said in a rather scornful tone, 'it means just what I choose it to mean, - neither more nor less.'"
Ah, the classic fragility of abstract edifices such as OMA, COM and MACK! Especially when perched on a wall, far above the earth of experiment.
Then there are further and more "rational" reasons protecting oversimplifications: complex situations are not easily repeatable, so the experimentation part of scientific method can often only be applied within time horizons that are too short to be very convincing, especially as extrapolation in time is generally known to go haywire because of hidden oversimplifying assumptions. But it is better to try nonetheless, so that brings us to the next point:
(b) "... where angels fear to tread"
The first heading borrows from the proverb "Fools rush in where angels fear to tread." Hence the "fool" in the first line of the limerick. It has to be a crazy venture to nail such theses on the cathedral door. (... if I may presume to sympathize with Martin Luther and his cheek!) Nothing venture, nothing gain. That's experimentation. We'll see what the market will say about OMA vs MACK. At least the challenge to OMA has been issued (as the "innocent questions" and in the second part of the reply to question 1).
(c) "pull a plum out"
The last line of the limerick harks back to the English nursery rhyme: Little Jack Horner / sat in a corner / eating his pudding and pie. / He put in his thumb / and pulled out a plum, / and said "What a good boy am I!"
The origin of this rhyme is also relevant (and little known, what a pity!). It relates an incident in that classic Church/State struggle in which Henry the Eighth, King of England, began confiscating the properties of the Church. A Bishop tried to cut his losses by giving the King some of the Church's properties secretly: he baked their Title Deeds into a pie which he sent with the kitchen boy, Jack Horner, to the Palace as a present. Jack extracted one and kept it. To this day that property is still in the Horner family.
(Those are the facts as I remember hearing them. Maybe some reader can give better precision? And is it the King's surprise described in that other English nursery rhyme: "Four-and-twenty blackbirds baked in a pie / and when the pie was opened / they all began to sing. / Now isn't that a merry dish to place before a king!"?)
The incident was a classic win-win: the Bishop got to keep his other properties, the King didn't have to lift a finger for the ones he received, neither knew about the missing one. So in an updated way we may identify with Little Jack Horner: the current suppliers of information products - represented by the Bishop - might be more secure after some major adjustments, the customer - who is King - will be much better off without really trying, and we - by being instrumental in a small but key way in the transaction - will make something out of it.
(d) "No one has all qualities"
The "one" in the third line of the limerick refers most obviously to the "authorities" in the fourth line, but it also refers to any one individual. Certainly, I know I can't do it alone! Others will evolve MACK into ACK (See question 8), and maybe also help with Metaset (See question 11).
(e) Scylla and Charybdis
Am I not trying to squeeze too much out of that episode from Homer's innocent fable? No, not at all. His Iliad and Odyssey were together the repository of the wisdom of his age. In those pre-Socratic days, deep experience was still experimenting with sharable distillations. That was before most of the horrible obfuscations by the conceptual frameworks that history has knocked together. It is fair to assume that they could then still more easily "see the wood for the trees" ... the Charybdian figtrees of later systems. It is no coincidence that Homer's millennium saw the great flowering of the world's religions, then still in their infant simplicity, free from their later dogmatic oppositions and other accretions.
For a detailed and extensive interpretation of the allegory I can only refer you to my own 1986 book, Beyond Apartheid. Its Appendix A even consists of lengthy extracts of the most relevant parts of the Odyssey, reprinted with permission from the publishers (Penguin Books) of that translation (by E.V.Rieu), from which my quotations here and in the paper have come. Though "Scylla and Charybdis" is well known as an image for two extremes to steer between, I am not aware of anyone else's having made the interpretation of "oversimplification and overcomplication", though a classical scholar has confirmed to me that it is fully plausible. Certainly, the more specific interpretability goes down to an uncanny degree of detail!
May I grasp this opportunity to recommend that book (If you've read this far then you may well be interested...) also for much more on the philosophical as well as the political and IT background to Metaset and MACK? It is available from me for US$50 (postage and packaging included), a price - for a mere 128-page A5-size book - which represents a very presumptuous attempted tax on you, and thereby also some further project-related market research by me.
Good question... So many good people with my best interests at heart advise me to cut it out. But that would be to cut out the real heart of MACK! So I shall now try to indicate why I regard philosophy as essential if one is to argue in favour of such a radically new approach as MACK.
It is also essential if any reader is really to appreciate MACK in the absence of Metaset or another realization of MACK.
I can certainly vouch that I have found it indispensable in MACK's design. Since 1992, when I started coming into contact with Object Standards efforts, I have found a remarkable convergence of all my previous work - as related below - with those standards objectives.
If nothing else, it at least bodes well that here might be some fresh genes for the rather inbred standards line!
But if the philosophy is so central does that not then make MACK virtually impossible to sell? Not at all, for the problem is only temporary: Metaset in action will immediately be compelling at almost every level, and where relevant it will sell the philosophical aspects too (See again the last paragraphs of the Introduction, and consider also the last paragraph of the "Thirdly" angle below).
At this stage, then, I shall merely present - in almost as dense a way as in my paper (for which I apologize again!) - some kinds of argument in favour of the alleged central relevance of philosophy to Information Technology Standards. Fortunately for you, I make no attempt to argue them "fully" (whatever that might mean).
But I do try to bring out their relevance to the question: Why should anyone in the standards world be interested in MACK, especially before its details are known? (See further under question 11.) Hence also the personal history related.
Firstly, the impending penetration of IT into the very fabric of everyone's lives is already insistently asking some very philosophical yet practical questions, such as: What is distinctly human and worthy of preserving in the face of computerization? What is distinctly human about us and worth cultivating with a future PDA (Personal Digital Assistant)? What is "ideal" governance? The Internet clearly favours a more Direct Democracy, but how are we to cope with the burgeoning Babel? What will the role of human managers be? (or: What are the responsibilities and rights of Supply?) Where will the remaining jobs be? (or: What are the rights and responsibilities of Demand?) What are the limits of genetic engineering and how should it be managed? What are we free to undertake (in all the senses of those words)?
Now it is true that the relevant standards people have already largely abstracted their current brief out from all the irrelevant complexities in the above. Hence the resulting focus - which I share - on reusability and interoperability in a plug-and-play way, all of which is already - as in Metaset/MACK - seen very much in a market context. So what's new in all my wordiness?
Many of those other complexities are not so easily brushed aside. There are many further thrusts in "penetrating computerization" that are usefully faced at the same time, such as the nature of "profound congeniality" in the user interface, the many aspects of the interface with the "learning organization", the dynamics of privacy, what should not be computerized, and what we haven't yet dared to ask for (e.g. many people have so burnt their fingers with AI, so how would questions about Artificial Creativity be received anywhere other than on the sensationalist fringes?). There is much philosophy in all such issues, and standards and practices must keep up with it as closely as they can.
But such demands appear to be far from the preoccupations of those presently working in the standards world. Meanwhile Metaset has grown up whilst facing them all with a most urgent focus, as I shall now partially indicate. Personally, I thank the Scylla and Charybdis allegory for having helped me consider all those issues together. ("For better or for worse..." you may well add!)
While not always having had an IT-standards thrust, Metaset may also appear to have had a far too parochial intent: cf. the title of my book, "Beyond Apartheid", introduced at the end of my answers to question 2. But Part III of that book was entitled "Philosophy", while especially Chapter 6, Enabling a fuller democracy, was an early example of a recurrent Information Superhighway theme.
A propos those three titles and their present relevance: In South Africa we have long had this greater need to try to build from "first principles". Then, considering our frontier history of shooting first and asking questions later, developing through "can-do" farming traditions into hideous bureaucratic outgrowths, what a feast that has been for both Scylla and Charybdis!
Is that not a problem for Metaset/MACK: a beginning perhaps irrelevant to global standards and a minor setting unpropitious for growth? Far from it! On the one hand, where would have been better? After all, Europe with the smug self-satisfaction of its various elites is one great big Charybdian figtree jungle, while the US in its collective ignorance and arrogance is still being devoured by Scylla's many heads, and - quoting Homer again - "I have never had to witness a more pitiable sight than that." Of course, those are my own summary judgments, doubtless far too Scyllan for many purposes, but being practical I find it quite helpful to assess my own business risks by the imperfect light of such perspectives. What about Asia? I have too much respect for Asia to categorize them quite so simply. They are an enormous and fascinating mixture of ancient wisdom and immobility leavened with youthful enthusiasm and impetuosity. None of the above seems a very propitious ground for controlled experimentation.
South Africa, on the other hand, though we are all a bit shell-shocked by tumultuous changes, is small enough and varied enough and agile enough - and almost unanimously eager for more change - for it to be the best social laboratory for the world ... as indeed we have long been used, especially by Asian manufacturers, to market-test high-technology products.
Meanwhile, feeling closest to the American tendency, here I am, rushing in where the European angels fear to tread. Nothing venture, nothing gain. Patior ut potiar - No pain, no gain - is even my family motto. I've needed it: "The Mainstream", like "The Internet" or a standard "Architecture for Common Knowledge", is international and global, else it is nothing. Hence my approaches to this Workshop.
If it is true that the above IT/Philosophy questions seem very far from the preoccupations of standards organizations such as the OMG, then that would only show how far they lag behind MACK in real experience. (I shall now start blowing other people's trumpets as well as my own:)
One major area where MACK is ahead is in its integration of the whole market concept, with the market as a "philosophical instrument in the fullest sense" (as I said in my paper). That development has been going on for some time: it was the explicit aim of Beyond Apartheid (published June 1986) to seed and promote the application of electronic networks to boost the discovery of the complex democratic reality of Demand through its simplification by Supply. By April 1988 (with great credit due to my then colleague Stephen Davies, and with as great forbearance by my long-time software-house partner, Robert Gibson) we even had an ER-based prototype running on Beltel, the South African national Prestel-protocol public-access network.
That phase of MACK's evolution in the area of relevant standards came on top of its first CASE phase, which followed shortly after my own entrepreneurial beginnings. The latter were based on much indispensable experience with distributed computing between 1972 and 1975, doing successful pioneering work starting well before the "distributed" word was current in IT. That experience was largely thanks to my boss, Tom Bezuidenhout, as well as his boss, Anton Rupert, South Africa's greatest international entrepreneur, whose concept of "Partnership in Industry" is still an example to all other multinationals.
"IDIOM" (which I trademarked in 1976) stood for "Interpretable Design for Integrated Operation and Management". As that expansion implies, it was a proto-CASE system. That was of course long before "CASE" had been invented. But it was also CASE integrated with an "Application Operating System". That full description still well describes Metaset (which also still relies on an underlying host operating system)..
As Robert Gibson observed immediately on learning of the name, "And that leads naturally to Interpretable Design for Integrated Operation and Tears..." Now that's experience ... and repartee! That inevitable obstacle notwithstanding, I persisted with the name on account of its usual meaning, "the characteristic language of a sub-culture", which was most appropriate both to the multi-user IT world it catered for and the diverse and multi-location cultural setting it grew from. After a long and tortuous history (many more thanks being due to Robert Gibson and Stephen Davies, as well as Ivan Socher and Computer Automation Inc, Irvine, California), I eventually put the project on ice so that - other longstanding priorities having been raised - I could concentrate on what became Beyond Apartheid in 1986.
Relevant practical lessons (other than the CASE experience) are illustrated by the fact that the training for a later IDIOM subset started with a morning's workshop on "Errors: their prevention, detection and management". That angle developed into what a later colleague, Guy Bullen-Smith, dubbed "the Persistent Application", which we will encounter again under question 6. (I am indebted to Guy for much else in Metaset, including its very name!)
Further relevance to MACK may be noted from the fact that "the time component of data" aspect (which question 6 discusses) was first observed in IDIOM in 1976 (Thank you Aubrey Cohen!) and was more fully identified and developed in a paper I presented in Anaheim, California, in 1984. That paper introduced the naïve/relativistic realtime distinction, which because of its sociological aspects had so much relevance to Beyond Apartheid that it was included in that book as an appendix (Reflect on that seemingly odd juxtaposition!).
So we have had a long time - under appropriately varied circumstances - in which to explore the difficult business and politics of federated activity with end-user control, within and outside the IT domain. The immediate practical relevance of all this philosophy to IT standards has been very real to me, and should contribute to the present apparent solidity and future real stability of the development.
Any open-eyed entrepreneur worth his or her salt would bank on that!
Secondly, a comment which I tag onto the above IT/Philosophy item as a largely rhetorical flourish at this stage, but also to set the scene for the "Fourthly", "Fifthly" and other items below:
Since "philosophy" means "love of wisdom", and wisdom must surely be grounded in knowledge, an "Architecture for Common Knowledge" surely should be able to accommodate and even facilitate productive discussion on any of the deepest questions that philosophers have tried to ask.
But such high-level comments are seldom helpful. Like "God is love", it isn't easy to see what they mean in practice.
Does that last comment make me a philosophical Pragmatist or Instrumentalist? Yes it does, in a sense, but in that case I could also lay claim, in a sense, to most other alleged alternatives and related philosophical labels, such as - "but not limited to" - Realist, Idealist, Empiricist, Rationalist, Romantic, Materialist, Naturalist, Economist, Positivist, Atheist, Theist, Agnostic, Phenomenalist, Existentialist, Structuralist, Deconstructionist, Postmodern!
When or if confronted by them (There's a warning to you not to do so!), I enjoy showing - at least simplistically - how their post-argument essences needn't be seen as mutually exclusive. One might even consider coopting them all into The Mainstream! The only labels I can think of right now that really have to be left out are "Pessimist" and "Nihilist".
The Mainstream takes account of all the philosophical positions mentioned, and - do you hear my own drum-roll? - it represents the largest attempted core that is at once coherent and consistent with them all. Or at least, The Mainstream can accommodate its own representations of all those positions...
On the other hand it is often better to avoid them all. The Mainstream is really so simple and everyday. Metaset will help bring that out in more detail - and far more accessibly to almost everyone - than I am capable of here and now.
But let's not persist with apparent attempts at a ridiculous SuperHegelian synthesis! See rather question 8 as well as the more constructive points below.
It is however worth noting here that that quick excursion through the extravagances of European philosophical history has the same "common factor" intent as the following points.
Thirdly, as Bertrand Russell said of the problems of philosophy: the most difficult problem is to see that the problems are difficult. Thus "Philosophy and Education" will always go together, in a long and never-ending haul (Plato's Republic refers again, and I have just quoted the title of a 10-lecture course I gave in 1975 at the University of Cape Town's Extramural Studies Department). We have difficulty facing complexity precisely because oversimplifications are so pervasive, so difficult to combat, and quite impossible to banish forever!
Fortunately, in the standards arena it is common cause that we must be especially vigilant against the Scylla of Procrustean standards and the Charybdis of everyone's apparent impotence to do anything about them. Still, the problems are difficult, and the temptations acute.
The allegory of Scylla and Charybdis helps us face and deal with such problems (At least, that has been my own experience, and very strongly so). Odysseus was given such "meta-" advice too: "Call on Cratais, Scylla's mother, who brought her into the world to prey on men. She will prevent her from making a second sally." (Homer had already dwelt in great detail on the inevitability and tragedies of the first sally of those multiple heads.) It helps protect us to know why we oversimplify. But what does that mean: "know why"? For now let's just see it in the sense of E.M.Forster's imperative: "Only connect!" Coherence again...
But Charybdis is a problem too: we are even physically incapable of seeing complex reality "as it is", since our very senses are the most inescapable component of our simplifying mechanisms. (That consideration emphasizes the key "RE" aspect of MACK -- see question 5.) Amidst the fury and chaos that predominate when Charybdis erupts, we may only glimpse (in Homer's words again) "the dark sands of the sea bottom" in "the interior of her vortex". ("Chaos" etymologically even means "gaping open", and the god Chaos was part of Hesiod's Theogony, Hesiod having followed Homer in Greek writing).
But especially when designing standards we have to take the existence of those dark sands into account somehow.
Fortunately, there is an increasing volume of circumstantial evidence in favour of what I so presume to call "the infinite complexity of reality": Future Shock, Complexity Theory, Catastrophe Theory, Chaos Theory, AI, CASE, BPR, the PDA, and even OOPSLA are all more or less formal attempts to face and grapple with real complexity (cf. that beautifully apt title of Booch's OO A&D's Chapter 1: simply "Complexity").
Complexity is thus already an ever more dominant mainstream theme.
So the philosophy is also a kind of entrance examination: if you aren't interested in any of the above, or don't see philosophy's relevance to any of the more concrete situations where sheer complexity is the increasingly recognized problem in shared problem-solving - such as in genetics, psychology, education, crime, the environment, the economy - then don't read further at this stage. Rather wait for the kinds of discussion that Metaset as a medium will host: they will air complexity in a way that you can tune to your way of seeing things. You will help it help you to see and simplify as much complexity as you are ready and willing to consider, explore and exploit for your own perceived and thereby evolving ends.
Fourthly, I am prone to moral-looking pronouncements with words like "should", "must" or "ought". Considering moral relativity and the generally controversial nature of any talk about morality, that certainly looks like a bad way to sell to the universal market! Despite the inextricability of philosophy, ethics and standards, it is surely true that morals and marketing don't seem to mix well.
But it is worth arguing the exception, and precisely for marketing reasons.
Let me resume with the observation in the paper that "The simplification imperative might even be presented as a fairly high common factor to all the most respected ethics."
The simplification imperative has many alternative formulations, such as: "We must steer between Scylla and Charybdis." Or "We must work at perfecting the market (for those more varied and better products which are then better simplifications of demand)." Or "We must pursue the quest for knowledge (by making ever more varied and accurate simplifications of complexity)."
I am alleging merely that what they have in common is also a mutually-acceptable common factor to all major moral or ethical systems.
So when I say something like "This is what you ought to do!" I really mean "If you want the greatest number of people to buy into your position, then try this!" Thus that presumed universal imperative is the product of some informal market research. It is a mere abstraction or generalization into an alleged universal need: "We all need to simplify complexity!".
So OO people should be able to relate to the methodology. That posited virtual superclass allegedly encapsulates features that are characteristic of its many concrete subclass refinements.
Why is it merely a virtual superclass? Because it is too abstract to be lived by alone. "In my father's house are many mansions. If it were not so I would have told you." Diversity rules, thank heavens.
The opening quote in Ride The Mainstream! was by H Zemanek, past president of IFIP: "An architect does not tell people how to live. He creates an environment in which people may live their own lives creatively."
I find that the simplification imperative is not a very "fragile superclass". However one tries to refine it, it persists. All those whose ethics or judgment I most respect seem to be adhering - albeit usually without knowing it - to some pre-existing refinement, which they know chiefly by virtue of inheritance from some apparently orthogonal metaphysical, theological, ethical or enlightened-self-interest superclass that they are already committed to.
I am convinced - and when launching Metaset and MACK I shall be asserting - that they will all often find this alternative formulation rather useful as they explore and elaborate the practical consequences of their own positions.
I presume to judge none of the conventional formulations - ancient, modern, or even standards-related - for what they are. It is true that I do sometimes judge them (e.g. the OMA!) for their ability - or inability - to help people simplify complexity, but in all such cases there seems to be "chapter and verse" from their own "scriptures" with which MACK and The Mainstream are at least compatible and even fully consistent and coherent.
Yes, the previous paragraphs do imply that I am treading a fine and dangerous line, one that might at any moment precipitate me into the Procrustean abyss of moral temptation: the inevitable temptation of every law-maker, leader, teacher, parent to think one's own experience gives one the right to prescribe to others despite all different viewpoints.
Further dire warnings are implicit in the interlude (also quoted and interpreted in my book introduced at the end of question 2) between Scylla's feast and the Charybdian figtree, in which Odysseus was exposed to the temptation of killing the Sun-god's cattle on "the Island of the Sun, the comfort of mankind, [... where] our deadliest peril lurks."
On the other hand, it doesn't help to be the "liberal" of those fine caricatures: "The liberal is someone who can't take his own side in an argument." Or "He is like a feather cushion: he bears the impression of the last one who sat upon him." (Can anyone supply the references?)
Then, I am positively reassured by further observations: I would say that it is objectively verifiable that if MACK were to make good headway then many old-fashioned and virtually universally-accepted virtues - such as openness, compassion, respect, humility, courage, wit, ingenuity, etc - would find themselves reinforced and promoted. (My thanks go to my parents for having drawn my attention to those virtues, and to my wife and our children for continuing that hard work!)
Compassion, especially, is often suppressed for practical reasons. The MACK-boosted market will remove many of those constraints. By its very functioning it will enable greater openness and respect, it will help refine humility, strengthen wit, expand ingenuity, and stiffen and reward courage in many practical ways. The power of compassion will be released. And there will be many other beneficial side-effects that that simplistic presentation obscures but all will approve of.
Finally, mindful yet again of Medawar's Dictum (after Sir Peter Medawar, Nobel Prize-winning biologist), I am reassured further because MACK will in the end be "judged by its fruits", and not by any criticisms from "Pharisees", or self-proclaimed adepts of doctrines or unwitting upholders of dogmas. So the experimentation will continue.
With such "market research" I believe I am
selling to people as they actually are, and not necessarily as anyone
wishes us to be or insists we should all be. So this sometimes
apparently ethereal or insubstantial or even devious philosophy also has
solidly down-to-earth and plainly controllable sides.
Fifthly, The Mainstream consists of nothing that is not squarely and undisputedly behind the most reputable and objective methods in the mainstream of present-day practice in science, technology, the humanities and society.
Many professional philosophers cannot credit such a bland and apparently uncritical statement, but it needs to be insisted upon and developed. However, academic philosophers tend to be put off by such a synthetic style of philosophy. They prefer "Analysis". As if analysis could ever be complete or "deconstruction" feasible! By themselves such activities are as futile in the long run as winnowing without a wind, a wind of purpose blowing in a clear direction: the grain and chaff just fall together to the ground. (Re that "clear direction" that so promotes effective consensus, cf. the "Simplicity Principle" throughout Beyond Apartheid.)
Without synthesis, construction and possible eventual applicability, it is only the pleasure of the activity that remains (and I must thank Johan Degenaar and colleagues at the University of Stellenbosch for many such eye-opening pleasures). But analysis needs great understanding, patience and art (such as I can make no claim to possess) for it to offer any of the substantial pleasures of being of service, while, alone, it can make no headway against the Second Law of Thermodynamics.
That was a guarded invitation to Philosophers to help elucidate and clarify the apparent Mainstream in a practical way (See further under questions 7 and 8).
Maybe living in the Third World makes one unusually aware of mere pragmatic needs such as coming to grips with chaos? As the most completely mislabeled "Moral Sciences" or philosophy department at Cambridge University put it to me, officially: "We don't do philosophy like that here." But that was the mid-Sixties, when the only spirit they had was the ghost of the "later" and wiser Wittgenstein still wandering their corridors. Maybe they have advanced since then. But I would doubt it, such are the undeniable pleasures of the academic branches of the Charybdian figtree!
Fortunately, however, that university also has other departments and at that time the even more boldly-named though eyebrow-raising "Faculty of Divinity" harboured the truly Homeric figure of Donald MacKinnon, the Norris-Hulse Professor of Philosophical Theology, at whose knees I sat for over three years (1964 to 1967); or - for those who knew him - under whose glare I wondered though never withered. Amongst much else, I am indebted to him for what I have found to be pearls of great price: "Theology is the finite trying to talk about the infinite." And: "Death is life's last great adventure." (Can someone supply the references please?) So magnificent were his raised eyebrows! What ruthless intellectual honesty, so appropriate to our human approaches to Infinite Complexity (my term), whatever that is. He was consistent too:
To my regret I only heard about it years afterwards, but you will be able to see how this story is fully in character: Towards the thereby-precipitated end of a talk by the dignitary in question, and (being seated adjacently) after an agonized and ever more pronounced forward-rocking towards the rather too confidently-perched limb, MacKinnon bit into the purple-socked ankle of the speaker, an eminent Bishop, doubtless to illustrate that there was flesh and bone inside. Afterwards he explained his heart-felt exasperation: "He may speculate on The Almighty's thought processes, but I couldn't leave him talk as if He were his own personal friend!" It is indeed impoverishing - in our inevitable human ignorance - to bring the Creator down to our own level (and even to talk uncritically about "the Creator").
Fortunately, we all have a lot more in common than mere disputatiousness might lead one to think. The Mainstream is broad, objective and buoyant common sense. Especially when more correctly interpreted under the critical eye of the market (as the classical Athenians well discovered).
Then, interpreted further, it does lead to some radical suggested revisions such as MACK (and another I shall not delve into now, though it is discussed on pp 82-89 of Beyond Apartheid (my 1986 book introduced at the end of question 2)). After all, "evolution well needs a good rout" too.
Now, if you find that confusing ("You can't agree fundamentally with everyone and then differ radically, almost in the same breath!"), please take heart: I also find it confusing that MACK is so implicit in everything and yet seems so new and different and better. The Mainstream is all old hat, but it does have some surprising consequences ... and so it should, otherwise we would be caught in a Determinist trap!
But then, I believe, based on such recurrent empirical observations from the whole "phenomenon of knowledge", there will never be an end to reality's ability to surprise us pleasantly! (That was the title I started writing to in 1963, then already heavily influenced by Teilhard de Chardin's The Phenomenon of Man ... though I wasn't such an earnest little 22-year-old as that makes me seem!) See also Beyond Apartheid (as the book eventually became) pp 90-92, as well as this from my paper:
"That there will indeed be a yet greater confluence [or Mainstream] to ride is indicated by the apparently infinite possibilities in that infinite yet evidently humanly-accessible complexity, which point compellingly at infinite humanly-meaningful opportunities in Creation."
And that, I find, is largely compatible and even consistent and coherent with every optimistic theism and seriously-held secular philosophy of life, as well as being a mere extrapolation of the normal everyday mainstream stuff that this point started with.
It goes no further than being a "fairly high common factor" of the kind we saw in the moral department in the above point. It is no complete theology or philosophy of life, and aims merely to be of service in the practice of those which already exist.
So the market research and resulting product development behind MACK has extended into surprising domains, and its findings are sufficiently universal to indicate a cast-iron entrepreneurial bet.
Most further philosophical details are of course less sure. I leave them to the market, including its interpreters of history, to take further. However, there is another potential advertising loophole in my construction:
Sixthly, any talk about the "mainstream" is often regarded with the greatest suspicion in many quarters, and quite rightly so, as the term has such exclusive connotations. I clearly need to take up the challenge and positively justify using that word (as I said in my paper could be done).
The mainstream is indeed generally associated with insiders rather than everyone. It has particularly sinister anti-democratic connotations when used as synonymous with the "Corporatist State", seen as run by and for an increasingly cosy Charybdian figtree with its coterie of branches for politicians, business leaders, the media, organized labour and - horror of horrors! - even "mainstream NGOs", with the whole fat lot of them catalyzed and granted intellectual respectability by consultants from academia.
Considering the presently dominant "neoliberal" politico-economic orthodoxy, which evidently so marginalizes the excluded and seems to increase their number so, my adoption of the term "The Mainstream" - in the assertive singular too! - looks distinctly provocative if not ominous or downright silly. Why consider taking that risk?
The problem is even perennial and deep-rooted. As Odysseus found when the only way of escaping annihilation in the Charybdian whirlpool was to cling to its "great figtree", "I could find no foothold to support me, nor any means of climbing into the tree, for its roots were far away below, and the great long branches that overshadowed Charybdis stretched high above my head." There was no apparent way to escape and progress towards his goal.
But the mast and keel episode followed (see under the relevant headings in my paper). The keel represents the mechanisms of communication and construction in the abstract (see question 5) that are the basis of the market and its products, those mere simplifications of the realities of democratic demand as represented by the mast.
So, is the market of such mixed-blessing neoliberalism good or bad? In the "Fifthly" point above, it looks bad: it's even as if I am trying to sprinkle sophisticated Holy Water on it! That's very suspect practice. Well, I plead guilty. I also happen to see the sense in Francis Fukuyama's "End of History" thesis: I don't think I am capable of imagining anything other than the presently ruling liberal-social-democratic ideology as being at all applicable in this day and age (many though its variations obviously are...). The coherence with all my other observations is too overpowering for me. After all, I believe I am talking about The Mainstream as well as the mainstream!
However, the market can surely be made good. "The System" of such notoriety is on the contrary eminently usable and correctable, particularly with the help of the more effective and efficient market mechanisms that we in IT know we can - or should be able to - produce. I am even coopting "the market" into "the democratic struggle", as indeed many other people have long done (for this is the mainstream).
Note: "the market" here is not just the commercial market for goods and services that knows only about money. It includes all cooperative effort in which people are of service to other people and where they were not in prior person-to-person contact with each other. It comprises all but the most elementary and insulated social or family life, and even there we communicate with words that we have mostly picked up in that broad market.
Both Demand and Supply separately are each more inventive when individuals can work together. They are stronger when people can unite more numerously to a clearly agreed common goal which they can abstract from their own individual and unique situations (See the "Simplicity Principle", virtually throughout Beyond Apartheid). And Demand and Supply need to be brought together in that product-creative and -productive way.
So the market also includes the "public service" that one tends to associate with Governments, as well as the "voluntary service" that is associated with organized religion and other NGOs of that amazing number and variety in a developed Civil Society (In France, where they have to register officially, and where they are perhaps more appropriately called "associations", they number about 650,000, while 50,000 new ones form every year). The total number is said to be about 55,000 for South Africa.
Finally, the totally "Free Market" is virtually a contradiction in terms: it has always created both the demand and the supply for the mechanisms of its own control. We need to face that fact and follow its consequences openly and consistently.
Certainly (as I am continually saying) in our ever more interconnected world all such market-like activity has ever greater relevance and is usefully promoted with ever greater vigour.
What about that singular -- why "The" Mainstream?
The [one] Mainstream represents what all human beings have in common. As I said in the paper (under the heading "Synthesis [...]"):
"That confident prediction [of ever-faster application development by mere composition from the open market] is consistent with the notion of real complexity and its abstract equivalent, interconnectedness, whose surely ever better sharable form supports the thesis of the unity of The Mainstream. (True, such greater sharability will - hopefully - partly result from this kind of propaganda! But then, that kind of supply-drivenness is quite natural in the market, and will come under ever better demand-side or democratic control...)"
However, see also under question 8, where I raise the issue of how one aspect or form of that unity is to be interpreted and evolved.
"The Mainstream" concept greatly reinforces the impact of the Common Knowledge foundation of MACK (see questions 4 to 11). If The Mainstream is as broad as I have alleged it to be, and the simplicity imperative as universal as my "market research" seems to indicate (see under "Fourthly" and "Fifthly" above), then the Common Knowledge concept goes that much further and its beneficial impact throughout the use of MACK-compliant applications will be that much more powerful. That is an extremely significant result, which I cannot emphasize too strongly.
Yes, I know you can't yet see the execution-time advantages of that Common Knowledge concept, but you must be able to see at least some of the coherence of those theses with the rest of the paper and this FAQ.
Seventhly, though I am continually urging caution on the basis of Medawar's Dictum et al, and I have been warning us all of moral temptations too, there is another message of The Mainstream when illuminated by the light of the above kinds of philosophical reasoning:
We can afford to be bold: bold in pushing towards MACK and bold in reorientating the IT industry.
Eighthly and finally, after all the metaphysics and the ethics there is still the epistemology, i.e. all the apparent rambling in my paper about the nature of knowledge, in terms of abstract systems, their manipulation and their application. I apologize if the extremely high degree of coherence is not at first obvious, but it is there!
It is the epistemology that underpins and explains the effective paradigm shift that I claim is implicit in MACK. In addition to the sections in my paper headed with "mast" and "keel" and "synthesis", see further under question 5.
Thus the epistemology is the immediate, concrete and practically-effective link between the philosophy and the technology.
MACK, with all my talk in the paper about reflectivity and orchestration of methods, might make it seem as if complexity-hiding along with encapsulation are thrown right out, together with all the other OO vocabulary that I scorned (with the "BOO!" rhetoric in the paper). Certainly, in the paper I drew attention to MACK's different conception of complexity-hiding (Find the two references to "RE" and the two to "encapsulat").
And as the OMG's Andrew Watson relevantly drew to my attention later, a perceived dilution of encapsulation was even the OMG's main objection to the "generalized object model" whose disappearance from the OMA I branded as such a retrograde step. "How otherwise," the argument went (in my words), "especially in a distributed situation, can a generalized operation or request (i.e. on no one specified object) be interpreted and despatched to the relevant objects if the despatcher does not know anything about those objects? And anything it can access about those objects amounts to an infringement of complexity-hiding." That is a very good point. (And Andrew, would you agree with that summary?) It is in fact consistent with the first comment on encapsulation in the paper. (Though one might also ask - in the case of a classical object model - on the basis of what information the requester knows which one object to invoke, and how is the requester to obtain that information?)
I have two kinds of answer, based on the two main thrusts of the concept of encapsulation, namely complexity-hiding within packages and componentization in terms of packages:
The advantage of complexity-hiding was first formalized with Modular Programming's precept of "high module strength plus low module coupling", the second half of which was concretized in the "simple interface" inter-module-glue that has developed into "interface inheritance" with its concomitant emphasis on behaviour rather than state. The intended benefits are two-fold: simplicity and hence safety in design, and security in execution.
But that call interface is a major problem. It has so much semantics hidden in it, every bit of it a possible source of "semantic drift" between calling and called modules. There is no way that any mere call parameter signature can in general adequately express those semantics. And yet that is what every behaviour-based object architecture assumes is possible!
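That drift can be made concrete with a toy sketch (my own illustration, nothing to do with MACK or the OMG's actual mechanisms): two routines whose call signatures are indistinguishable yet whose semantics have silently diverged, plus a hint of how explicitly exposed semantic metadata could let a composer reject the mismatch. All names here are invented.

```python
# Illustrative sketch only: identical signatures, drifted semantics.

def circle_area_v1(r):
    """Expects r = radius; returns area."""
    return 3.14159265 * r * r

def circle_area_v2(r):
    """Expects r = DIAMETER; returns area."""
    return 3.14159265 * (r / 2) ** 2

# Both take one number and return one number, so no signature check
# can detect the drift -- yet the results disagree:
assert circle_area_v1(2.0) != circle_area_v2(2.0)

# One remedy: publish semantic metadata alongside the signature, so a
# composer (or "trader") can compare meanings, not just shapes.
META = {
    circle_area_v1: {"param": "radius", "unit": "m"},
    circle_area_v2: {"param": "diameter", "unit": "m"},
}

def compatible(f, g):
    """Two routines compose safely only if their semantics agree."""
    return META[f] == META[g]

assert not compatible(circle_area_v1, circle_area_v2)
```

Note that the check inspects only metadata about the routines, never their instances or internals, which is exactly the limited degree of exposure discussed next.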
Fortunately, the OMG is aware of such limitations (see their 96-08-06 document referred to under question 1). Provision has to be made for more semantics to be recorded in respect of those capsules or methods, "e.g. for trading" (as stated in that document), and that must include "plug-and-play" application composition in the open market. Note, however, that the indicated degree of exposure is limited to metadata and does not require access to the instances themselves.
But metadata brings us to the whole issue of repository and its necessary integration with Analysis and Design, which the OMG is so far from having resolved. MACK has already integrated it (Believe me for now, but for some plausibility consider the IDIOM background related under the "Firstly" item in question 3).
MACK's provision for such semantic metadata to be available (for some non-private access at least) is the limit of its infringement of encapsulation from the complexity-hiding point of view. And even the OMA seems headed in such a direction in future.
Encapsulation from the componentization and packaging point of view is more interesting. In general this is the scene of various current concepts, such as frameworks (in the OO world) and configurations (in the CASE world). MACK has simple and clear answers here, but see rather under the remaining questions.
As I said in my paper (under the "synthesis" heading), it is based on binary entity-relationships spun into a semantic web. A good familiar start!
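For concreteness, that familiar start can be sketched as a bag of (subject, relation, object) triples with a trivial query helper. The entities and relations below are invented for illustration; this is emphatically not Metaset's actual representation.

```python
# A semantic net as plain binary entity-relationships (illustrative).
net = [
    ("Order-42", "placed-by", "Customer-7"),
    ("Order-42", "contains", "Item-3"),
    ("Customer-7", "lives-in", "Paris"),
]

def related(subject, relation):
    """All objects linked to `subject` by `relation`."""
    return [o for (s, r, o) in net if s == subject and r == relation]

assert related("Order-42", "placed-by") == ["Customer-7"]
```

The familiarity is the point: the raw material is as simple as it looks, and the interesting question is what discipline keeps it from degenerating, which is where the next paragraph picks up.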
But mere recursion in such terms typically leads to a spaghetti picture, semantic drift, Babel, chaos, increasing entropy, all such bad things that we are trying to get away from! So what are those appropriate units that can lead us to make useful constructs, things that we can deal with practically, share accurately, and apply reliably to the real world?
The problem is a classic abstract system one: what are the undefined terms and axioms, and what are the combination and inference rules that, via theorems, allow us to formulate interesting and certain abstract constructs? It is those constructs which, ultimately, make such systems applicable in domains where the axioms also seem to apply.
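As a generic illustration of such an abstract system (the atoms and the single rule here are my own toy choices, not anything MACK-specific): a handful of axioms, one inference rule (modus ponens), and the theorems that mechanically follow.

```python
# A minimal abstract system in the Hilbert spirit: undefined terms
# (the atoms "p", "q", "r"), axioms (given facts), and one inference
# rule -- from "a implies b" and "a", derive "b".

axioms = {("fact", "p"), ("implies", "p", "q"), ("implies", "q", "r")}

def close(facts):
    """Apply the inference rule until no new theorems appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for f in list(facts):
            if f[0] == "implies" and ("fact", f[1]) in facts:
                theorem = ("fact", f[2])
                if theorem not in facts:
                    facts.add(theorem)
                    changed = True
    return facts

theorems = close(axioms)
assert ("fact", "r") in theorems   # derived, never given as an axiom
```

The "interesting and certain constructs" are the derived facts: they hold in any domain where the axioms hold, which is what makes the system applicable.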
(Here I am using concepts that were perhaps first extensively explored by the mathematicians David Hilbert and Felix Klein. I thank the late Dr R Kannenberg of the Mathematics Department at Stellenbosch University for introducing that work to me. I again thank Donald MacKinnon of Cambridge University, this time for having shown me how it is also squarely part of the philosophical and epistemological mainstream, having been one of Kant's major preoccupations (in his Critique of Pure Reason) and being traceable back to Plato and earlier.)
Firstly, plain binary logic is implicit everywhere in the basic model. (Three- or four- or many-valued or fuzzy logic can all be fitted into it at a different level).
Next (taking a quick though partially misleading way out when talking to conventional OO people): there are types (no separate class concept), inheritance (single and multiple), and it is "true" or "implementation" inheritance with full polymorphism. Encapsulation might appear to be partially left aside due to a kind of visibility of metadata, but it is much stricter than usual in the methods, which in MACK are called RE-methods to distinguish them from conventional methods and to draw attention to their unique role, which is to help apply the abstract model to the realworld. So we have some unique features here already.
Maybe at this stage reread the section in my paper headed "Synthesis: [...]". MACK's fundamental architectural paradigm shift is implicit in these four paragraphs from that section:
"The plain logic in the metadata - with its various constructs unique to MACK - is available to the reflective process. Thus (on the one hand) much conventional program or method coding is eliminated, in extensive reuse of standard common logic facilities in the abstract domain.
"On the other hand, such visibility is in partial conflict with the precepts of classical encapsulation, with its usual "complexity-hiding" claims. Compared with MACK the latter are an oversimplification of application decomposition: objects are not islands, as the generalized object model reminds us.
"There are however RE-methods, which do fully encapsulate realworld knowledge which is outside the abstract model. The rules here are much more strict than in conventional methods.
"However, following those rules, those residual methods have a high degree of invariance with related semantic net topology changes. Moreover, their total number will increase merely logarithmically, while application volume will grow exponentially on account of the easy hence high reuse."
The next level up is less fundamentally different but more easily appears interesting. This is the main kind of coarser granule: the "typology", or set of coherent types plus associated metadata (including RE-methods). It maybe corresponds most closely to the conventionally-sought "framework" (whatever that is ... ask Taligent maybe?). However, see question 7.
But it is also the closest MACK equivalent to the "Business Object", for its metadata comprises all the often so-called "business rules" too. There is no need for complicated multi-tier architectures here: there is a great generality and an elegant uniformity in the way constraints of all kinds are specified and applied.
The typology is applied in an interesting way. I now jump to Metaset in action:
The task of a MACK-compliant "MVC-like" and "ORB-like" kernel or "application operating system" (such as Metaset has - see also the references in this file to "IDIOM" (yet again!)) is to help the users grow and use the builtin distributed database, profiting from coherence and redundancy yet maintaining high consistency.
This works by dynamically blending typologies into (or out of) each user's "scene" (or set of currently applicable or nearly-applicable typologies), and "dragging" the data with them, creating and managing appropriate dialogs with the users where indicated by the typologies with their various consistency criteria, and keeping the physical data in good shape all the while.
It's difficult to see what that means in practice, but in that context you can compare a typology with a conventional subschema supplemented by special triggers, though with the time-related issues mentioned under the next question being taken into account.
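To push that subschema-plus-triggers analogy one step further, here is a hypothetical sketch of a "typology" as a coherent set of types bundled with constraints that are checked whenever a record is blended into a user's scene. Every name below is invented; this is an analogy, not MACK's actual mechanism.

```python
# Hedged analogy only: a "typology" as subschema plus triggers.
order_typology = {
    "types": {"Order", "Customer"},
    "constraints": [
        # invented example rule: every Order must name a customer
        lambda rec: rec["type"] != "Order" or "customer" in rec,
    ],
}

def admit(typology, record):
    """Blend a record into a scene only if it is covered by the
    typology's types and every constraint (trigger) holds."""
    if record["type"] not in typology["types"]:
        return False
    return all(check(record) for check in typology["constraints"])

assert admit(order_typology, {"type": "Order", "customer": "C-7"})
assert not admit(order_typology, {"type": "Order"})
```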
But the more I try to help you see it in familiar terms, the more you must think of all the well-known difficulties of those conventional ways and which don't apply to MACK! And the picture I tried to sketch must really look like a smoke-and-mirrors show of words! (For why I don't make it clearer, see question 11.)
So I end with this attempt at a reassurance:
Due to the special nature of the MACK metadata and the tight architectural positioning of RE-methods (which are themselves currently merely coded in C), the entire process is really very simple.
That is the beauty of the basic model.
It is also crucial how the technical picture is kept straight by its compelling coherence with the entire end-user world of the Market.
It is crucial not only to the natural and tunably-congenial behaviour of the resulting Personal Digital Assistant but also to the maintenance of internal efficiency despite the apparent complications (See the uses of the word "binding" in my paper, which (by the way) I use to refer to typology-binding rather than the OO-usual method-binding). See further throughout the remaining questions.
(By the way, by the PDA I do not mean the present generation of hand-held devices, generally distinguished from the PC. I use that succinct name in a fully-inclusive sense, referring to the future ubiquitous personal interface with the digital universe.)
There are many needs here, of which I list some:
That is the scene of the "Persistent Application" which penetrating computerization lands us in (as introduced under question 3 - find "IDIOM" in this file again).
It is common cause in modern application design and implementation that that set of problem-areas is probably the most complicated one to address. One often sees blanket assessments to the effect that 80% of technical application complexity is encountered there.
First a negative answer:
I think that TSQL2 (the proposed temporal extension to the SQL-92 language standard), though it is not far wrong in its assessment of some of the above user problems, is barking up the wrong tree entirely (at least in its present form). Fortunately, SQL3 will need to be sorted out before people will really be interested in TSQL2, and I am happy to see that SQL3 is having a rough time in the market, as it confirms my position that linear text languages are not the way to go when the issues get complex, and I am interested in tools for broaching complexity. We need "hyper-" tools. (See also the episode related in question 11, third paragraph.)
(The same criticism applies to HTML and Java too. In addition, procedural languages like Java - unlike SQL - imply that much-deprecated algorithm time (See question 1). But maybe I don't have to belabour that point so? After all, it is a major - or "mainstream"! - trend to aim for declarative rather than procedural approaches.)
Meanwhile - though it may not appear so - I am relaxed about TSQL2, as I am certain it will not get anywhere because it only complicates and does not offer any help in simplifying things. No progress can come out of that. IT systems are already becoming too complex for us to get our minds around. (At least they are for me and many others!) And that is of course why this Workshop is being held: to cope with such complexity we need to analyse and construct in terms of appropriate units, so what are such units?
That question brings us to the more positive answer:
MACK deals with the time component of data mainly by offering facilities for metadata to apply to data/metadata at various and more relevant levels of granularity (than in TSQL2), and then providing for standard logic to apply appropriately to each level.
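As a hedged illustration of that granularity idea (assuming nothing about MACK's internals; the levels, names and dates below are invented), temporal validity might attach at the attribute level as well as the record level, with one standard check applied uniformly to whichever level carries it:

```python
# Illustrative only: temporal metadata attached at a chosen level of
# granularity (attribute, record, or whole typology) instead of a
# uniform tuple timestamp, with one standard validity rule per level.

GRANULARITY = ("attribute", "record", "typology")

class Validity:
    def __init__(self, level, start, end):
        assert level in GRANULARITY
        self.level, self.start, self.end = level, start, end

    def holds_at(self, t):
        # Standard logic applied uniformly, whatever the level.
        return self.start <= t < self.end

# A record whose "salary" attribute is versioned more finely than
# the record itself (the numbers are purely illustrative).
record_validity = Validity("record", 1990, 2000)
salary_validity = Validity("attribute", 1996, 1998)

def visible(t):
    return record_validity.holds_at(t) and salary_validity.holds_at(t)

print(visible(1997))  # True: both levels agree
print(visible(1995))  # False: attribute-level validity not yet begun
```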
MACK's time management integrates with much else, such as resource identification and transaction design and management, or workflow design and management. (It seems self-evident that it should, but how tiered architectures complicate the picture with their oversimplifying divisions of functionality!)
So granularity and the granules themselves are the key issue. And in MACK that granule is of course the typology that we saw under the previous question and brings us to the next question.
With its help Metaset can grasp what is perhaps the most fearsome nettle in the future computing environment: that notion of the "Persistent Application", which must address all of the notorious problem-areas of application design and implementation for 7x24 use, such as those listed above (see also question 10).
MACK can presume to address all those problems in a simple and powerful way purely because it has an appropriate conception of realtime, is integrated into workflow, and has well confined the baggage of algorithm time, thanks to its unique conception of the "operation", which itself is thanks to the simple information model with its epistemological appropriateness of the "RE" or Realword Equivalent concept, together with coherence and consistency criteria, to the ever more interconnected world that we can no longer escape and will profit from facing up to (Phew what a horrendous sentence! But what a beautiful chain of coherent and far-reaching abstract concepts!) Now read on please too:
First another question for the OMG (Sorry, but I can't resist the temptation to be so cruelly needling! Andrew, do please update and/or correct me where necessary?):
In the OMG's "Letter to OMG Members" and "Proposal for OMG Organization/Evolution" of November 14, 1995, Revision 2.0, a "new dimension in the OMA Reference Model" was proposed: "Application Framework", which may be defined as "a collection of objects that provide an integrated solution within an application domain and which is intended for customization by the developer or user".
But in the subsequent Press Release, "OMG REORGANIZES - Focuses on Users, ISV's and Vertical Domains" of December 5, 1995, there is no mention of the word "framework", while the only approximate change was this: "two changes have been made to the OMG Reference Model in order to match the activities within vertical domain task forces. Domain Interfaces are vertical application interfaces specific to a vertical market. Application Interfaces will continue to be non-standardized application-specific interfaces which are horizontal end-user oriented Common Facilities and general Object Services likely to be used in any object-based program."
Does that mean that the OMG is unwilling after all to define an Application Framework? Does that explain their new-found enthusiasm to sidestep that problem/opportunity and go straight to Vertical Domain technology? Maybe they're still coming to it. So they would do well to read on:
Meanwhile, here is Grady Booch on the OMG's own "World Guide to Object Technology" CD:
"Perhaps the most strikingly consistent feature I find among the successful projects I encounter - and noticeably absent from those projects which seem to fail - is the presence of a strong architectural vision. Architecture is, unfortunately, a much abused word, but in my experience a well-engineered object-oriented architecture has at least two important dimensions to it: a sea of classes that capture the vocabulary of the problem space and a set of mechanisms that specify how certain classes collaborate.
"The first dimension is fairly obvious. However, there are two important lessons that many beginning projects miss. First, for systems beyond a certain complexity, the class is a necessary but insufficient vehicle for decomposition. Second, no class is an island.
"Indeed, this is the point of the second dimension: classes collaborate, and the common way those collaborations play out is fundamental to the conceptual integrity of an architecture. Furthermore, actively applying a small set of common mechanisms helps to bring simplicity to an architecture for even the most complex domains."
I ask that question of the OMG, and quote from Booch (my italics), because MACK seems to have the key framework concept they are all looking for. At present I am calling it the "typology" (as we saw under questions 5 and 6).
A MACK typology is a coherent set of types with their associated metadata, including RE-methods.
"Typology" (to extract and translate from the french "Petit Robert" dictionary definition) is "a system of types" and also "the science of the elaboration of types, facilitating the analysis of a complex reality." So the word is most appropriate to its use in MACK!
(I prefer "typology" to "framework". The latter has undesirable connotations of rigidity. Sure, "typology" perhaps implies an undue eagerness to "typify", to say that something is "typical" - even to "stereotype"! - but that consideration usefully draws attention to that epistemologically critical step of "perception in terms of", which is that insidious simplifying and all too frequently oversimplifying step of which we must keep so aware. It seems easier to "re-typify" than to "re-frame", and in Metaset it will be easy too ... and far easier than we are accustomed to with present technologies! The "-ology" part, though rather heavy, draws attention to the merely verbal or conceptual nature of the truths thus provisionally represented.)
It is the well-defined coherence of a typology that gives it the unity of "a reusable component". A "complete product" is also specified as a typology. So there is no intrinsic difference between a component and a product, and generally the word "product" covers both.
That is not strange. No product is standalone: every typology overlaps at least partially with at least one other, namely the host MACK-realizing and also MACK-compliant kernel (of which Metaset will be the first).
The overlap between one typology and another is their "common knowledge" which is the semantic starting-point for their interoperation. Thanks to The Mainstream, there is usually a high degree of common knowledge, making interoperation a natural and comfortable process, while yet not being imposed where there are genuine differences.
The set of all products currently "in use" by a user is also a typology, so those products have the status of components when seen from that perspective. That product is then that user's own customized and individualized product.
(Hence the minimally "thin" client throws many opportunities right out the window).
The label "product" is however on the whole more appropriate than "component" because each typology is associated with one developer/supplier (even though component typologies may originate elsewhere).
In Metaset both the spatial or qualitative and the temporal or dynamic aspects of applications are fully described in terms of typologies and their dynamic interplay.
The dynamic aspect is implemented completely as activations and deactivations of typologies. Thus there is no procedurality implicit in a typology. The kernel figures out the applicable realworld procedures or workflows. It does so on the basis of its own special typologies, driven by an irreducible event-loop supplying the basic real time, which thereby becomes the one source of "application algorithm". That is the basis of the "untangling of algorithm time from real time". Familiar algorithms such as mathematical ones are encapsulated in RE-methods, which may themselves run synchronously or asynchronously using builtin Metaset facilities.
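A minimal toy of that description - a sketch of the idea, not of Metaset itself, with all names invented - might reduce the kernel to an event loop whose only "application algorithm" is the activation and deactivation of typologies:

```python
# Toy sketch: an irreducible event loop supplies the only "real time",
# and all application dynamics reduce to activating and deactivating
# typologies in response to events. Nothing here comes from Metaset.

import collections

class Kernel:
    def __init__(self):
        self.active = set()
        self.events = collections.deque()
        self.log = []

    def post(self, event, typology):
        self.events.append((event, typology))

    def run(self):
        # The one source of "application algorithm": drain real-time
        # events; each event only activates or deactivates a typology.
        while self.events:
            event, typology = self.events.popleft()
            if event == "activate":
                self.active.add(typology)
            elif event == "deactivate":
                self.active.discard(typology)
            self.log.append((event, typology))

kernel = Kernel()
kernel.post("activate", "Orders")
kernel.post("activate", "Invoicing")
kernel.post("deactivate", "Orders")
kernel.run()
print(sorted(kernel.active))  # ['Invoicing']
```

Note that no procedurality lives in the "typologies" themselves here; the sequencing comes entirely from the order of real-time events.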
The way typologies blend is completely unique to MACK, and exists entirely thanks to the unique fundamental model consisting of abstract interconnectedness plus RE-methods as already described.
The blending of typologies then closely mirrors how in real life our attention swaps quickly and smoothly between the various complementary views on whatever we are thinking of. With its static and dynamic binding options, the process is appropriate to routine as well as discursive mental activity.
That multi-context capability then also generalizes to support multi-user databases with minimum redundancy.
The process of integrating a new typology into a larger typology is called "binding", in which common knowledge is used to bind (or link) the newcomer into the present environment ("RE-method binding" is different, and is an almost trivial function of the kernel). The binding process itself is a manifestation of a typology in the kernel.
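As a toy rendering of that binding process (the set-intersection stand-in for common knowledge is my own simplification for illustration, not MACK's mechanism):

```python
# Hedged illustration only: "binding" reduced to a toy. The overlap of
# two typologies' type sets stands in for their "common knowledge",
# and binding links the newcomer into the host through that overlap.

def common_knowledge(host, newcomer):
    """The semantic starting-point for interoperation: shared types."""
    return set(host) & set(newcomer)

def bind(host, newcomer):
    """Integrate the newcomer via its common knowledge with the host;
    refuse to bind where there is no shared semantic ground."""
    ck = common_knowledge(host, newcomer)
    if not ck:
        raise ValueError("no common knowledge: cannot bind")
    return set(host) | set(newcomer), ck

host = {"Party", "Account", "Transaction"}
newcomer = {"Account", "Loan"}
bound, ck = bind(host, newcomer)
print(sorted(ck))     # ['Account']
print(sorted(bound))  # ['Account', 'Loan', 'Party', 'Transaction']
```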
Binding has very interesting antecedents (Buying), typically entails a quite extensive immediate process (Installing), and has very worthwhile consequences (Predictability and efficient processing).
The buying phase of binding is what happens in the marketplace. It is the result of the matching of demand and supply, which is carried out by a user under the guidance of typologies reflecting on needs and products.
The needs include specific problem descriptions, the user's self-developed personal interest profile and the application environment. The latter typically describes organization structure and other external constraints, including the profiles of other involved users.
Prospective products will typically have been described in both supply- and demand-orientated terms. Demand-orientation helps directly address prospective customers in terms of their own needs, while supply-orientation, though often misleading, is sometimes suggestive.
Evolution in the market will quickly lead to a range of such terminology-sets or idioms, which will marry IT-structural and domain-specific aspects. Prime contributors to such evolution will be those designers who can abstract from the ways people actually work, with or without computers, and - analogously to current pattern abstraction - can cast them in the form of such "trading" typologies.
That will be a major simplifying force in the total knowledge-creating market. It will also enormously distinguish a MACK-based information world from the existing chaos on the Web.
That feature too builds on "the mainstream" of IT thought. Consider this from Tim Berners-Lee: "The job of classifying all human output is a never-ending one, and merges with the job of creating it." (from: http://web.mit.edu/afs/athena/org/t/techreview/www/articles/july96/bernerslee.html ) There is fine epistemology in that observation by the creator of the Web.
The key to the effective and efficient resolution of how to match Supply with Demand is the common knowledge concept in MACK (with philosophical justification in the whole concept of "The Mainstream"). The higher the common knowledge component, the easier the whole trading and binding process.
On the other hand, certainly, the whole Procrustean syndrome is to be guarded against: Supply must not be allowed - in its own interests - to distort Demand. "Watchdog" patterns will be abstracted from current practice, while new opportunities too will similarly gel into such typologies. The "consumer interest" will gain new force, yet we can expect collaboration rather than confrontation to be the dominant emergent mode. That will not be merely because it seems desirable. It will happen because such evolution will be most effective in meeting the needs of both Demand and Supply. (Though - if my style in the paper and this FAQ has any success - we may expect shock to be productive sometimes too!)
The Common Knowledge (or "CK") aspect will make the job of the roving market agent that much easier. The agent will select more easily, based on a greater knowledge of its sender's needs and a greater disclosure of relevant candidate product details (Openness will pay!). And the buyer will be able to control the agent far more easily on account of the integrated design and operational environment.
Installing is likewise typology-driven. "Design patterns", or typologies reflecting on implementation matters, will often automatically integrate a chosen typology in an efficiently-executable manner.
Where that is sufficiently dependent on unknown future usage patterns, Metaset will help the user/implementer take informed decisions. Once such implementation matters have been finalized, data-integration is carried out (It may even have started before finalization, as simulation is an important part of the process).
A high degree of asynchronicity of detailed processing is implicit.
The relative invariance of RE-methods despite semantic net topology changes (see question 5) contributes spectacularly to the ease with which a MACK-compliant application executes and may be evolved.
The buyer will usually want to refine bought-in typologies. In many cases a MACK-realization will automatically do some "data-mining" on the user's own data and suggest promising refinements. Metaset as the first MACK-realization will have some perhaps surprising data-mining quite close to its kernel functionality (It's all to do with that emphasis on coherence, building on that growing "interconnectedness that is the abstract equivalent of complexity"...).
But that will be the mere scratchings on the surface of the data-mines. Pattern people will find many rich lodes to exploit.
Efficient processing is possible despite the native interpreted mode of Metaset (though MACK does not mandate that mode).
Firstly there is this effect (quoted from the paper, under the "keel" heading): "MACK's simple and general-purpose nature gives Metaset a reflectivity far beyond that offered by any present architecture. Such reflectivity enables extensive self-manageability. Internally that means continuous self-monitoring and self-tuning of resource use, including garbage-collection, content-addressability support, physical data representation and location-contiguity optimization, redundancy management and other memory and database grooming on the fly."
Then there is the main efficiency opportunity (quoted from the paper, under the "synthesis" heading): "Binding is intimately associated with the whole market scene, where suppliers make commitments to their customers. That in its turn creates the opportunity for highly-bound high-granularity application components to be code-generated for efficiency where it really counts. Such relative inflexibility will tend to have high user-legitimacy hence high specification-reliability."
All that is made possible by MACK's unique transaction and associated resource-reservation concepts (the latter also having been a central feature in IDIOM - see question 3), themselves also defined in terms of typologies.
Many typologies are presently being "manually-bound-and-code-generated" into the initial Metaset program, but in due course that will take place more formally, in automated routines.
So we can begin to see how most if not all processing is driven by common patterns of the kind that our present pattern experts are already familiar with. Their intuition and creativity will find an ever-expanding scope in the MACK world, for they will be able to cast their patterns immediately into executable form as typologies, and test them in short order. Pattern evolution will speed up too.
Fortunately, the MACK-based public communication medium and marketplace will host and even stimulate powerful forces for change. That's what markets are meant to do.
But unless the architectural framework can flex easily, it will break under such forces, and a completely new architecture with all its legacy-application nightmares would eventually impose itself.
So the question boils down to this one: What are the limits of smooth version migration?
There are none that cannot be overcome using the inherent semantic flexibility and translatability in MACK-compliant metadata and data.
I certainly see that process comfortably addressing changes throughout the design of Metaset and almost throughout the design of MACK.
But - for present trade-secret reasons (see question 11) - I won't go into the details either of the vestiges of fixed features in MACK or of its provisions for smooth version migration. I'll merely leave you with the tantalizing - or frustrating! - blanket assertion that both OMA/CORBA's and COM/OLE's so-called version provisions are totally laughable in comparison.
For example, what Microsoft is pleased to call its "solution" merely addresses multiple version coexistence, and not the cohabitation and migration needs that are both more difficult and more indispensable in modern computing. The latter require inter-version interoperability which is usually automatic and always extensively assisted and controlled. MACK provides for all such needs.
I must, however, mitigate such blithe claims and even arrogance with this little feature of Metaset: Metaset is being built using MUCK, "Metaset Universal Common Knowledge". Such alleged universality is a heavy claim, so that's where the "muck" theme comes in, with the message that it's something continuously renewable, especially with the help of creative seeds that can flourish in it. It's not to be treated with any reverence. On the contrary, it's to be regarded as homely, to be trodden on and trodden in, replete with associations of fuming fertility.
MUCK is merely the MACK-based specifications for Metaset.
I foresee that my own initial rendering of them will virtually all be modified by the open market. "Metaset" has long had the internal abbreviation "mt", taking its first and last letters, which - I enjoy explaining - symbolizes what will happen in the market: all its insides will in due course be replaced, and the original Metaset will be "mt", that is, "empty": just the shell of its former self. Rather like the smile of Lewis Carroll's Cheshire Cat.
The "Universal" bit relates to its "Common Knowledge" aspect. CK is always relative to the two parties in any exchange, so no universal repository or product catalogue is mandated for most CK. But MUCK is that CK which is common to everyone. That's all. Well, not quite, as - thanks to The Mainstream - it will soon be found to be extremely extensive.
It will surely soon be discovered, too, that its precise form is worth extending as well as evolving. Shared conventions will grow, so MUCK too can and will grow and evolve. I expect, too, that it will evolve quite quickly beyond my easy recognition.
It seems, however, that there will always remain a "MUCK-cultivation" function: MUCK's consistent growth and evolution will need to be hosted and managed, while every MACK user will need to be able to verify the universal character of any communicated MUCK that pretends to be such.
For the revenue-earning potential here, see further under question 11.
Academics and other theoreticians will also have a key role to play. MACK is beautifully suited to abstract modelling and analysis with a view to further correct automation of many design and operational activities. Those will vary from nitty-gritty requirements such as code generation and resource-reservation optimization, to high-level needs such as the composition of help messages and self-teaching materials. The latter will extend right up to the "Philosophy and Education" field already mentioned under question 3 and alluded to in this FAQ's introduction and elsewhere, for it is all part of the simplification of complexity.
Thanks for the compliment! While Metaset has been extremely demand-driven - see question 3 - it has been a very supply-driven product too.
Often, during a design process, one inevitably projects existing technology into new areas. Existing abstractions enable one to "dare to wander" (see also the mention of creativity under the "keel" heading in my paper). And on the basis of Metaset's long history (see question 3, item "Firstly") I not only wandered, I hurtled headlong into the OO & Business Object standards goldfields. Sure, "when all you have is a hammer, everything looks like a nail." But that's what appropriate generality enables, while coherence and experimentation help one control it.
At the end of any well-aligned and well-stabilized marketing process one should indeed have a product for which the supply-description resembles a needs-description.
No, Metaset does not exist yet, but yes, it is emerging, as foreseen, even though still more slowly than it will once it is bootstrapping itself, which isn't too far off now. A major breakthrough has been to find a way to start building canonically before the canon can be represented and interpreted, and the finer details invented and used. For a qualitatively new architecture and representation, that was quite difficult for me to get my mind around.
(That also helps explain why I am relatively relaxed in describing as many details as I am, while yet not fearing that anyone will re-engineer my "trade secrets" (see question 11).)
A few implementation details might help give an impression of greater concreteness:
Such apparently mad ambition is feasible because of the canonical nature of the entire development, even at its present early stage (after much experience with earlier approaches and refinement of techniques!).
And yet that initial Boot Product will be nothing, in view of the rapidity of its own market-based evolution that it will enable its users to precipitate immediately.
Existing DBMS (R- and OO-) is one of the areas of artificial complication that the notion of the Charybdian figtree has helped me keep an eye open for. It was a great relief when I realized that much of that complication did indeed not imply real complexity.
Once again, it was the correct approach to realtime that enabled that cutting of corners. The major simplification was in the virtually complete revamping of the notion of transaction definition and management, as indirectly implied under question 6.
But that was still all done in an integrated way. I quote from my paper again:
"MACK has a kind of equivalent to CORBA's IORs (Interoperable Object References), though it is an integral part of the whole message structure scene, which is radically different from CORBA's. As may be expected, message structure also rests solidly on the far-reaching and very practical notion of common knowledge. It benefits greatly from the MACK coherence and consistency strategies. Applying cleanly both to local messages and to IPC, it is integrated with all memory management, including virtual memory, persistence and recovery mechanisms. It forms a natural foundation for meeting distributed-system, mobile-agent and broker/trader needs."
So both database and metadata repository are seamlessly integrated into the overall architecture, as indeed the "Persistent Application" concept of question 6 as well as the plug-and-play application composition process of question 7 seem to demand.
Federated application- and database-distribution is a mere variation on the universal MACK theme.
But what about legacy applications? It is remarkably easy to map existing data statically into a MACK-compliant world (that being another of the fortunate consequences of MACK's metadata-driven finite state machine aspect), and to apply appropriate validations and transformations. It is rather the dynamic aspects which are difficult to map, what with transaction definition and management being so qualitatively different (though so much easier on the MACK side).
As a result, given the exceptional ease of data-digestion into a MACK-compliant realworld model or database, as well as plug-and-play application re-engineering and version migration in a MACK-compliant world, one may expect migration, rather than coexistence or cohabitation, to be the more popular route.
Either way, Third-Party developers (the Supply side of the MACK-compliant market) will have a field day, whether they supply executable or plain information.
What about SQL?
As already implied in question 6, I see no pure-MACK role for it. Metaset's extensible MVC-like capabilities are more appropriate to the easy representation of simple views of complex models. They will be much more intuitive than conventional command-line styles. The latter are effectively David-vs-Goliath attitudes, belonging to more primitive do-it-yourself days. They are no longer applicable as that Goliath grows in size and usually-concomitant exponential interconnectedness.
Yes, everyone gets frustrated trying to get an idea of what exactly MACK is while I duck-and-dive most suspiciously, and apparently try to pull all kinds of sophisticated wool over their eyes! So indeed, what could I expect the Workshop to work on?
My problem is twofold:
The first is that MACK-compliant "source" is in a very hyper- form. I long ago gave up trying to reduce complexity to linear text form (that having been one of the lessons of one of the IDIOM episodes - see question 3 - in which my own hyper- approach was shoe-horned by some brilliant young programmers into a baroque syntax complete with BNF-based compiler).
Not only am I very bad at composing fluent narrative (as you can see from my spaghetti writings which I allow the conventional linear-text medium to squeeze out of what sometimes comes across as disconnected mental hyper-junk), but linear text also complicates "understanding" by computers. That is especially the case in the ever more interconnected world of modern information systems and their design. So to see it most easily and meaningfully requires interpretation and presentation by a realization such as Metaset, and Metaset is not yet sufficiently programmed to do that well enough for my liking.
The second problem is that "MACK source" is really very simple, and if I expose too much now, somebody might copy me. That would be bad for me, and it might also be bad for MACK: at this stage - I think - either a competitive battle or an unfocused development process - or both - would most likely hinder efficient and constructive evolution. Using present communication and discussion media, it would unnecessarily dissipate everyone's energies, especially - as far as I am concerned - my own.
An open development and further evolution should wait, I think, until it can be hosted by Metaset as a first realization of MACK. One of the reasons for developing Metaset in the first place was to host the airing of complex issues and the cultivation of simple consensual outcomes. So I prefer to look forward to its own faster bootstrapping in that market-based way. (See the definition of the "Boot Product" under question 9.)
You might well comment that those two possible problems (i.e. lack of understandability and excessive stealability) are mutually exclusive, but as I progress and the first problem recedes, the second looms larger.
So what am I trying to achieve now?
Two kinds of discussion might take place:
Following the second theme, then, I see two alternative further project strategies at this stage (Any other suggestions are welcome!):
So with my approach to this Workshop I am now testing whether the first strategy is feasible. If not, the second continues. The first is impossible? Maybe so, maybe not.
But all I need to find is one person who knows the IT needs and can talk to the right kind of investors and managers.
Of course at first sight it will seem like a bit of a gamble for an outsider, but the project would be fascinating and the potential reward enormous. With the right initial partner(s) it would not be very difficult to put together an appropriate kind of project.
See also all the business-risk-reducing considerations set out in question 3.
What about the kinds of revenue that Metaset/MACK might produce?
My inclination - to promote the quickest enduser and information-product developer take-up - is to give Metaset away as soon as it can be downloaded and enable the user to join the Metaset- and subsequently more generalized MACK-compliant market.
Core project revenues could in due course come solely from hosting MUCK-cultivation (see question 8). Some single agency has to perform the residual globally-central function, even after the maximum distribution and delegation that would be best for the open market.
If MACK (and in due course ACK) were to achieve the potential I obviously foresee, such revenues here could be as high as the project owners wish them to be (where I am assuming they would be few enough because not many are required, and that they would all be duly modest in their estimation of their own abilities to spend it responsibly).
That blithe projection makes some initial assumptions which need to be tested, and the Workshop could be a suitable occasion to do so, up to a point at least. (E.g. my assumption that such an irreducible central function would remain and could be protected using existing intellectual-property mechanisms if there is no other implicit or builtin way of doing so. Other comparable IT-related globally-central functions could throw some relevant light on this situation.)
Initial project revenues, as well as eventual Third-Party product development revenues (as distinct from those from design, education and other implementation services), would come chiefly from RE-method development and sales. RE-methods are more protectable - both logically and physically - than abstract-model metadata, which will tend increasingly to become CK. So they promise to be the most appropriate embodiments of intellectual value.
There will be enough scope in RE-methods for an enormous Third-Party market. Integrated authentication needs and data-signing opportunities will do the same for the "non-executable" information market.
So there will also be vast potential for information-product market-related services of all kinds, including advertising both products and needs, matching them, concluding transactions, billing, delivery, training, support, upgrades, user-feedback, thus cycling back to needs-refinement and product-design. The general market will lack only the delivery portions of that cycle.
As a result of the power of the CK concept, and leveraging the Internet as it will, the adaptability and sheer velocity of such a market would be dumbfounding - Future Shock squared! - were it not for the Scylla and Charybdis perspective and related MACK-based appropriately-simplifying market mechanisms that will keep it under control by helping our minds to cope with it.
And if you can grasp that new-world-in-one-sentence on one reading, then we're well on our way!
Please don't think I haven't asked myself the same question! (I do have my eyes open, I think... And I have long had the image of Scylla to help keep them open!)
Remember too that I am inviting help (see question 11): that way we will "hopefully get a better-quality Metaset out more quickly."
This is how I try to explain to myself how it is possible to be as self-confident as I am, despite the evident craziness of the project: The basic MACK model is really so simple and so general-purpose that there isn't all that much programming to the Boot Product's kernel. Then, as I said in my paper (see under the "keel" heading), the simpler the underlying architecture the easier the resulting reflectivity and power. So the entire superstructure is merely MACK typologies applied. True, those typologies do include quite a number of those C-coded RE-methods, but remember, RE-methods do also have some special qualities that facilitate programming in a dynamic environment (see under the "synthesis" heading).
I know it's all quite far from done at this stage, but - having only relatively recently immersed myself in its programming (and virtually redesigned the internal structure) - I can see it so clearly now that I have no doubt that it will work. The remaining grey areas are already so well circumscribed that no surprises can emerge from them to upset the whole apple-cart. (And as you can deduce from my answers to question 3, the "Firstly" item, I've had some experience in many relevant problem-areas.) And then my degree of "supply-drivenness" (see the start of question 9) has been very convincing evidence for the soundness of the basic framework.
How can I get you to believe me? (Other than "telling all", that is... And even then, at this stage you still wouldn't necessarily see it all immediately. Hence the guarantee implicit in the upfront payment I would expect from any partner at this stage.) Until Metaset is demonstrable I can only offer circumstantial evidence, as indeed I have been doing throughout this FAQ.
Another way is to point out that I have needed the simplicity of MACK in order to be able to conceive it. I am not a top-class programmer by any stretch of the imagination. But there have been all those needs pushing me, so I've had to look for a simple way. Then it has certainly taken me long enough to work on that! Hence all that coherence that I keep insisting on. I've had to find and expand it in order to buttress the whole construction, especially as I progressively discovered the extent of its potential (see also the opening of my answer to question 9, re the simultaneous demand- and supply-drivenness of Metaset).
Yet another point also derives from that long history. It has been so unusual that there is some plausibility to the thought that such a wide detour might well lead to some new ways of doing things. (Even if I then turn around and assert that this way is in fact in The Mainstream of the evolution of human thought!)
As for the architecture itself, I'll try again. As I've said, there is a binary ER model "spun into a semantic net" underlying MACK. Everybody knows that that concept is both simple and general-purpose. Many practitioners somehow feel that much more can be made of it. But nobody really sees quite how.
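To make that "simple and general-purpose" claim a little more concrete, here is a minimal sketch of a binary ER model treated as a semantic net. Everything in it is my own illustrative assumption - the class and function names (`Node`, `relate`, `neighbours`) and the inverse-link convention are not taken from MACK, whose internal structures are not described here; the sketch only shows the familiar idea that every fact can be held as one binary link between two nodes.

```python
# Illustrative sketch only: a binary ER model "spun into a semantic net".
# All names here are hypothetical; MACK's actual structures are not public.

class Node:
    """A node in the net: an entity with its outgoing binary links."""
    def __init__(self, label):
        self.label = label
        self.links = []  # list of (relation, other_node) pairs

def relate(a, relation, b):
    """Record one binary fact, plus its inverse for traversal both ways."""
    a.links.append((relation, b))
    b.links.append(("inverse of " + relation, a))

def neighbours(node, relation):
    """All nodes reachable from `node` over links labelled `relation`."""
    return [other.label for rel, other in node.links if rel == relation]

# A tiny net: a customer places an order.
customer = Node("Customer")
order = Node("Order")
relate(customer, "places", order)

print(neighbours(customer, "places"))           # ['Order']
print(neighbours(order, "inverse of places"))   # ['Customer']
```

The point of the sketch is only how little machinery is needed: two node objects and one link constructor already give a navigable net, which is why the binary-relation picture is so often felt to be capable of "much more".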
A semantic net is a static picture. As you will know from most of my answers above, I am claiming that it is the dynamic aspects that are crucial. Even here (though I obviously can't yet share my hindsight with you much!) there is a very simple answer in the form of the MACK "operation". So we are back to "behaviour" very soon! (Do you remember how in question 4 I disdained the "behaviour"-based approach of "interface inheritance"?) And the time aspect of behaviour brings us back to the time-management aspect of MACK, which I have insisted is so appropriate to the Penetrating Computerization that complexity is pushing our noses into in such an ad hoc way (when we should be pulling on it together in a mutually-agreed-standards-based way ... and drinking from it and regaling ourselves on it!).
See also the last paragraph under question 6. (And please excuse that further attempt at coherence-knitting. (How I long for Metaset to help bring out these rich complexities simply and appropriately to your own situations!))