August 18: many minor changes & additions, more on A&D, and an Appendix on Xerox PARC's AOP
Here we go a short way beyond my OOPSLA'96 Business Object Workshop paper, which was already "where angels fear to tread".
More specifically, in this paper: where are we headed in BO? And how do we get there better?
(If it may seem that I sometimes go rather too far beyond BO, try zooming in on some more immediate relevance by using your browser to find this paper's use of words such as "workshop", "workflow", "pattern", "component", "framework", "business" and others from this Workshop's excellently practical Goals and Focus.)
As set out more recently in the background to my "OMG finds true love" allegory, and cutting a long story short, the initial objective of this evidently "foolish" and hazardous project is:
to supplant most of the Internet software infrastructure above the basic transport level, by providing a new, simpler and naturally market-extensible foundation for the component-based development and operation of all interoperable Internet-leveraging applications.
The new dispensation - based on a software product now being programmed - will completely supersede the World Wide Web and even e-mail, much though for most users those utilities are the Internet. It will also surprisingly soon supersede all present database systems. Neither effect will come about by chance, as the product has been designed - since 1987 - to involve all such functionality, including an extended newsgroup concept too. By 1988 the project was already explicitly aiming - on political and philosophical as well as economic and commercial grounds - at cultivating a complete and universally-accessible market infrastructure or "market vehicle", though it was only in 1993 that DBMS was targeted for first product release, as the basic foundation further proved its potential.
The ultra-thin 1988 Entity-Relationship-based prototype (which Microsoft's poor 1997 Repository calls strongly to mind) ran only on South Africa's Prestel-protocol "Beltel" national dial-in service, with the national X.25-based dial-in service targeted too. Since then, of course, the Internet has become the obvious platform, thereby also permitting more immediate international ambitions. (The Internet, by the way, illustrates excellently that standards can be effectively supplanted, even in the face of a most well-ensconced standards body, in that case CCITT/ITU. By the way in another way, South Africa, for various interesting and little-appreciated reasons, has long been at the world forefront of certain national-network-based applications).
"Ride The Mainstream!" has been the project's slogan since 1990, as a lead-up to a serious upscaling of effort. The product's ER-based "semantic net" foundation soon spread into Object-Orientation, but in a way very different from conventional OO implementations. That has put it in conflict with particularly the OMG's whole body of work. It was only in 1996 that I put aside my reservations in speaking out - without its yet being demonstrably programmed - about replacing OO as it is now known, along with the OMG's OMA and most of their other standards, as well as Microsoft's DCOM or distributed-object architecture (which so resembles the OMG's in fundamental ways), together with Java's and the Web's ... if indeed any of them can be said to have such an architecture worthy of the name. All the above would-be standards are thereby declared "out on a limb" of the evolutionary tree, venerable, massive and tentacular though that branch evidently is. The Mainstream will mischievously go "BOO!" (Beyond OO), to announce and emphasize to conventional so-called OO that it realizes the essence of OO objectives in some plain logic. Such confidence - already! - is due to the many other fundamental ways in which the entire project has long been working towards riding The Mainstream -- and has had its basis at least partially confirmed, as we shall briefly glimpse below.
The project is at present producing Metaset® (registered trademark of Metaset CC ("Close Corporation"), of which at this stage my family is the sole owner and I the sole employee. "Metaset is a better bet" rhymes, and appropriately so).
It will be the first implementation or realization of "MACK", the "Metaset Architecture for Common Knowledge", which will soon become the open standard that all information product developers will want to follow. Considering all circumstances, it will also be less "proprietary" than all other candidates (And I have made no attempt to trademark it).
During 1998, d.v., Metaset will start quickly bootstrapping the MACK-compliant market and will begin hosting massive compliant application-development by ISVs (Independent Software Vendors), such will be the material attraction of its open-system and open-market features, on top of its basic simplicity.
Its conceptual and behavioural consistency, giving plain ease of learning and use, whether one wants application-development speed or architectural and commercial strengths, will quickly win allegiance.
Its immediate functionality, which we shall look at below, will easily break most initial barriers to both attention and involvement.
Simultaneously, it will start hosting a "MACK-to-ACK" process, whereby a universally-acceptable formal standard Architecture for Common Knowledge will evolve in easy steps.
That fully open and truly collaborative process will be well supported by the Metaset market vehicle. That could be under the aegis of the OMG (that is, if they wake up to the opportunity in time).
I expect the ACK stage will be reached during 1999, such will be the velocity of this self-booting hence exponentially-growing market, especially considering that the burgeoning MACK-compliant market vehicle will remove the application-development bottleneck to the exponential penetration of the Internet (as distinct from its mere spread) -- the last major bottleneck whose very means of resolution has remained so unclear.
(Here might also be the best means for the radical BPR and simultaneous rewriting and reimplementing of many old non-Y2K-compliant applications.)
Around that time, most of Metaset itself will probably have been superseded by other realizations of the evolving architecture, though few will really notice, and none will care about it.
But is Java not already doing all that? One might contrast the Java frenzy with the approaching MACK phenomenon by comparing sex with love. Java, like the OMA and DCOM in having a predominantly service-based approach, is still anchored in computer processing, whereas MACK is aligned with coherence and truth. And Java programmers tend to do it alone with gimmicks, whereas MACK will appeal rather to those who really want to see Supply and Demand truly match and shape each other, in an evolving and ever more profound way. We shall look in more detail below at some aspects of that maturing process (And, once again, see "how the OMG finds true love").
Though the first release of Metaset - the "Metaset Boot" - will run only on Microsoft's Windows® platforms (excluding the 16-bit versions) as a host, that will greatly facilitate migration of the majority of existing users to the new environment. The Boot will also contain some novel portability features that will spark intense competition and hence rapid evolution of its host platforms.
Thus Microsoft will be given a head start, but will soon thereafter be faced with some serious threats to their ever-growing hegemony. (Once they are interested, I would be happy to collaborate in some way with them on the former, and thereby speed the new mutual growth, but only if they can convince "us" that they will play fair with the market on the latter. What do you think? I think that they would be well advised at this stage to hedge their bets, at the very least, and that you and I would need to do some very careful thinking... One vital safety factor will be that the MACK-boosted market will ever more powerfully guard and expand its various equilibrium factors. But at what stage will that be the case with sufficient certainty and force? During the build-up to that happy state, some other form of reassurance would appear appropriate).
Sounds like a tall story? Is all that software (and pretension) not even more than the OMG is aiming at? Yes it is, most surely (and in a moment we'll take a cursory look at some of the many philosophical issues around the phenomenon of "Simplicity" that have also most practically helped shield simple foolish me from the enormity of it).
The emerging environment will simply supplant the OMG's entire OMA and CORBA, most of their Common Facilities, about 90% of their Common Object Services, and transform the little that remains. That in turn will transform the OMG's relationship with Microsoft (which is at present - and at best - a very uneasy coexistence). The objective is more even than the aims of many other de jure and de facto standards bodies together with those of the OMG. In due course the joint process will supplant or transform all present operating-system user-interfaces, all present DBMSs, all present System or Application Repositories, all present middleware, all present EDI, for example, as well as create a new environment for all procedural programming languages and invite the development of a new one. Judge the feasibility of such a project as this paper proceeds, but more immediately, is it feasible to expect the OMG's (or anyone else's) cooperation?
Until Metaset is launched and market-tested, when its new paradigm will sink in and its practical simplicity will quickly convince sceptics, it would appear quite infeasible. But I think that - until Metaset is launched - the reorientation process will be helped by small beginnings such as this paper and its predecessors (See also the background again). And - to call a spade a spade - there is no doubt some attraction, to either the OMG or to Microsoft, or to both, in an architecture that aims to replace not only one's own architecture but that of one's archenemy too.
Some more detailed negative argument is not completely out of place. Clearly, though it is admirably further along than other cross-industry standards bodies, the OMG is not getting there. In case the Workshop is interested, I shall have at hand some detailed evidence, from their published documents, in further support of that dismissive assessment. Some more factual data and public debate on that issue would be relevant, but I would prefer not to be very involved in it (Considering Medawar's Dictum - "Theories are not displaced by facts, they are replaced by better theories." - I should rather concentrate on that replacement). But the field could use a thorough ploughing so that the seed might find a more receptive ground.
The matter is being presented at a Business Object forum (rather than, say, to the OMG's Architecture Board or Object Model Sub-Committee) for many reasons, including these:
Back on track with MACK, we will pick up speed dramatically, all of us together, in a highly-collaborative way. The whole field is fertile indeed.
This Working Paper describes some of the first crop that may be expected, then hazards some general reasons why the simple seed is the right one, and (trying to blend my metaphors...) we briefly look ahead to that cascade, that torrent, that Mainstream of varied and golden grain that we may expect to produce together. At various stages I solicit further input, and explain why this Workshop could be an appropriate source.
During the Workshop, and seeing that this is The Mainstream, and on the basis of Jeff's BO Workshops I and II (and the first two already submitted to this number III), I confidently expect to be able to show how each of the other contributions hides some perhaps unexpected fertilizer and seeds for the shared harvest.
At the Workshop I shall give some further and more up-to-date pointers as to when others will be able to sow their own various kinds of seeds in the newly-charted and fast-readying field.
The core of our common aim is an IDE: an Integrated Development Environment. And since the growing trend - as an important current of The Mainstream - is for Users to become their own Developers, it has to be an Operating Environment too.
Another burgeoning Mainstream current has Web browsers coalescing with operating systems and into applications: Metaset will be a leap ahead in network/local integration and synergy. It will show the way towards the joint creation of "Profound Congeniality" in the individual's interactions with the universe of Common Knowledge. It will further prepare that fertile field and plant the initial seeds.
It will not claim to do it all. Far from it. The market will take the job further - of course! - though we may not only count on it but build on it and have great expectations of it ... if it is duly cultivated and stimulated.
So the Metaset Boot will perform its relatively simple bootstrapping role in an adequate way, which is at least to:
Thanks to semantics-based version management and the collaborative market mode of change, user sites will be smoothly upgradable throughout the process, yet choice and user-constraints will be respected.
All that is also thanks to some unique features in the Boot operating environment, most of them also being handy little illustrations of the emphasis on relevance rather than on mere data:
All the above, though initially in a very elementary and sub-optimal form, will be in the Boot product, where it will be both workable and sufficiently illustrative for ISVs to run with the idea.
That is, after all, the function of a "market bootstrap": to be refined and elaborated by the market itself, into the indefinite future, becoming an ever fuller "market vehicle".
There will also be many pointers to strategies in respect of legacy applications (See also in my faq q 10 reply).
Such apparently foolish - even mad! - ambition is thanks to the entire internal design having been done in MACK-canonical terms (See also my faq q 12 reply). That provides a most relevant proof-of-the-pudding, and indicates the level of functionality that MACK makes so much easier.
There is a central set of fundamental architectural reasons why all the above is possible: building on the "magic ingredient" below, MACK has unique concepts of operation and process in an intuitive and even congenial semantic net. They lead to very different notions of resource management and security in a combined development and operational environment. They in their turn result in a revamping of multitasking and multiprocessing, itself a key enabler of the architecture of the operational environment as described above.
OO people may note - to the satisfaction of one school there - that dynamic MI (Multiple Inheritance) is a pivotal aspect. Further OO features include strict inheritance, no "fragile baseclass problem", far fewer, far more stable and generally simpler procedurally-coded methods, while even with those few there is a quantum leap in reusability. MACK's basis is truth and logic, and that is very different from the conventional "service" basis. So MACK does not have to consider the conventional (and insufficiently-asked) OO question of how a client is to know which service to avail itself of, and what to expect of it (Name, signature, and devices such as pre- and post-conditions miss the point. As do Java's so-called introspection and reflection. No other present or mooted repository comes anywhere near MACK's natural approach). In short, MACK offers a combination of stable reusability yet with flexibility and refinability that is presently not dreamt of. And yet, easy simplicity results, as a couple of small examples illustrate: multiple effective subtyping levels as seen from various viewpoints need not complicate the picture as seen from any one viewpoint; and all events and notifications are handled in a uniform way.
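The "dynamic MI" idea above can at least be gestured at in ordinary code. The sketch below uses Python's ability to build classes at run time to combine independent behaviours without a fixed inheritance hierarchy; the names (`Persistent`, `Auditable`, `build_type`) are my own illustrations, not anything defined by MACK.

```python
# Hypothetical sketch of dynamic multiple inheritance: behaviours are
# combined into a new type at run time rather than declared up front.

class Persistent:
    def save(self):
        return f"saved {self.__class__.__name__}"

class Auditable:
    def audit(self):
        return f"audited {self.__class__.__name__}"

def build_type(name, *mixins):
    """Create a new type at run time from any combination of mixins."""
    return type(name, mixins, {})

# Combine behaviours dynamically, without a fixed class hierarchy:
Account = build_type("Account", Persistent, Auditable)
acc = Account()
print(acc.save())   # saved Account
print(acc.audit())  # audited Account
```

The point of the illustration is only that the combination is decided when needed, not when the parts were written; MACK's own mechanism, per the text, goes well beyond such class-level mixing.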
All indications are also that - for a given high complexity of functionality - even the Metaset Boot will have an average performance comparing well with present OO's, even without the partial code-generation-and-compile option that later versions or add-ons will offer (and which derives beautifully from Metaset's special MACK-invited combination of resource-, transaction- and market-management).
We must start against the appropriate background:
In a crucial and epistemologically-based break with the past,
Simplicity helps us manipulate our symbols easily and reliably. The ER model is intuitive and simple, and MACK's core concept, the "typology", or set of related types and their associated metadata, with a well-defined coherence, is a simple combination of ER-based semantics. (Though - sorry! - its internal details are not for now...).
The typology has strict formal properties. MACK being intrinsically reflective, the typology's own consistency is fully defined and reliably enforced in terms of self-descriptive typologies.
Thus MACK has no formally-different metamodel. (And hence it has no need of any complication such as a "metametamodel" as often mooted in this kind of context!) I should add, however, that I have not pursued those formal aspects in a mathematical way. I have, rather, provisionally contented myself with their practical programmability. Nonetheless, I will hopefully soon be in a position to help better mathematicians than I in applying their own rigour and creativity to the model's refinement and elaboration.
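Since the typology's internal details are withheld, the following is only a guess at its outermost shape: a named set of related types plus relationship metadata, with "coherence" read here as the property that every relationship endpoint is a type the typology itself contains. All names and the coherence rule are my assumptions, not MACK's definition.

```python
# Speculative sketch of a "typology": a set of related types with metadata
# and a self-checkable coherence property.  The coherence rule used here
# (all relation endpoints must be known types) is an assumption.

from dataclasses import dataclass, field

@dataclass
class Typology:
    name: str
    types: set[str] = field(default_factory=set)
    # relationships as (source type, relation name, target type) triples
    relations: set[tuple[str, str, str]] = field(default_factory=set)

    def is_coherent(self) -> bool:
        """Every relation must connect types known to this typology."""
        return all(src in self.types and dst in self.types
                   for src, _, dst in self.relations)

orders = Typology(
    name="Orders",
    types={"Customer", "Order"},
    relations={("Customer", "places", "Order")},
)
assert orders.is_coherent()
```

Even this toy version shows why no separate metamodel is forced on us: the check is expressed over the typology's own contents.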
Together, the above two features enable the magic:
Where desired, in a market-like, easy and robust plug-and-play manner, any two components may be bound together.
The process is systematically mediated on the basis of "Common Knowledge" (or "CK"), which is that which both components have in common. That commonality must also be a correct typology.
The result is an integrating and multiplying effect (in figurative contrast to the merely additive effect of conventional OO's mere aggregation of components and services), with many useful consequences.
To start with, such "typology-binding" implies an associated (though often null) data transformation (i.e. of associated instances), which an implementation must effect automatically and generally transparently (though user involvement may also be called for).
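Reading "Common Knowledge" as the literal intersection of two component typologies, the binding step can be sketched as below. The representation (types and relations as plain sets) and the rule that binding requires a non-empty intersection are my assumptions for illustration, not the paper's actual mechanism; the instance transformation is elided as the "often null" case.

```python
# Hedged sketch of typology-binding mediated by Common Knowledge (CK),
# taken here to be the intersection of the two components' types and
# relations.  Representation and binding rule are illustrative guesses.

def common_knowledge(a_types, a_rels, b_types, b_rels):
    """CK = that which both component typologies have in common."""
    return a_types & b_types, a_rels & b_rels

def bind(a_types, a_rels, b_types, b_rels):
    """Amalgamate two component typologies over their common knowledge."""
    ck_types, _ = common_knowledge(a_types, a_rels, b_types, b_rels)
    if not ck_types:
        raise ValueError("no common knowledge: nothing to bind on")
    return a_types | b_types, a_rels | b_rels

billing = ({"Customer", "Invoice"}, {("Customer", "owes", "Invoice")})
shipping = ({"Customer", "Parcel"}, {("Customer", "receives", "Parcel")})
merged_types, merged_rels = bind(*billing, *shipping)
# The amalgam now relates Invoice and Parcel through the shared Customer type.
```

The shared `Customer` type is what makes the merge meaningful rather than a mere aggregation: each component's relations now apply to the other's instances as well.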
That will often appear to the user to amount to some rewarding local and even remote data-mining.
Since the instances involved are usually merely being subtyped, the processing is not generally as burdensome as it may appear. It may also be deferred. Or a test subset may be created for prototyping purposes (Coherent subsets may - naturally! - be specified easily, as they reflect the coherence of the typology).
Thus bigger and often surprising meanings may be created easily, with behaviour - often radically different - that neither component had alone. That is synergy. Simple, meaningful, relevant power.
But is it safe? Quite apart from formal correctness and accompanying data-consistency guarantees, we may remind ourselves of the market background: component typologies will typically be offered by ISVs and fully supported in the more functional market infrastructure. The issue here is the relevance and correctness of RE-methods, for which the supplier is the first point of assurance. (The supplier relationship has many further technical and commercial implications, but we don't need to go into them at this point.)
Thus typologies defining application aspects serve as components for amalgamated typologies that look more product-like in addressing user-recognizable needs.
The same applies to any supplier/user interacting-pair of applications, for implicit in that too is one single typology. Thus properly semantic "next-generation EDI" follows quite easily from the automatically-discoverable yet user-extensible commonality between the parties. That applies to distributed searching as well as to the more conventional transactions involved in market trading. (We may note that the whole trading scene is transformed too.)
As ever, relevance - which includes that more specialized sub-issue, access security - is not only taken fully into account but is tantamount to the key perspective. "Relevance" is also the user's view of the already-much-mentioned "relativity" and "relativistic views" of the architectural picture.
However, the correctness of amalgamated typologies need not imply an exact match with user needs. So resulting application products may also be selectively unbound.
Both binding and unbinding have many traditionally-tricky aspects which the above-described operating and data-management environment can usually undertake largely transparently. For example, a dumb undo or rollback often misses the point: we may not want to unwind a stack, but pick from it and sort out any problematic consequences. There is a host of further issues less obvious to the user: concurrencies, dependencies, versioning, distributed replication, market-sensitivity ... to mention a few.
There we may observe a whole bunch of reasons why conventional DBMS has to go: a whole new approach to concurrency and transaction management is required, as already indicated in the "true love" document's dismissal of conventional ACID.
But is all that practical - even possible - in a simple architecture? Here we do well to bear in mind that there is no real complexity in these manipulations with already-accepted - hence deemed-"clear" - symbolic abstractions, despite the accompanying synergy or dis-integration (as the case may be). So MACK-enabled reflectivity-based automated following of logical consequences, together with dynamic in-context help to the user where necessary, can realistically simplify such complexity (or what is often called complexity but which is perhaps best called complication!).
Preferably, on the other hand, newly-bound applications, starting with their given commonality, may be refined to meet particular needs better. That generally-necessary process too may be stimulated and supported by further typologies (in the usual reflectivity-enabled way).
Thus while the typology is comparable to the conventional OO "framework" (This is, after all, The Mainstream), it is more useful to point out that the typology is effectively the Business Object that the BODTF and these three Workshops have correctly been seeking. It is the single unit for user manipulation of application specification, or molecule for combining into nourishment (Note: it is not the atom...).
Thus, true to the OMG's original OMA aim - of 1990 but regrettably lost in 1991 - MACK meets the complexity-demanded "generalized object model" ideal in that the typology is a multi-type entity and user-meaningful operations are defined, analysed and effected solely in the context of containing typologies (and are not directed to a one-class target, as in the OMA's "classical object model"). That accords with frequent comments on the need for an appropriate way of handling inter-type collaborations (see e.g. my earlier Booch quote).
Where the BODTF has a "Common Business Object" concept, MACK has an "Essential Application Typology" (EAT), which is at the highest level of abstraction at which a particular application concept makes sense.
The soon-to-flourish MACK-compliant market will offer a tempting array of EATs from which more substantially-featured refinements may be concocted and cultivated. (Though maybe we should avoid the TLAs as the essential variety graduates through the refined, cultivated and fully-featured stages ... despite the delicious ironies and hence tempting and appropriate invitations to those constructive knowledge-revolutions of which civilization is made?) See further under A&D below.
This being The Mainstream in much detail, the typology is very close to many familiar IT concepts. It nonetheless amounts to a new IT paradigm in the representation and manipulation of abstractions, both in shape and in effect. The conventional scene is so near yet so far!
And it is still a new paradigm, even though it is a natural pursuit of the even more venerable and insignificantly-disputed Mainstream in epistemology (The ambiguities of "Object Identity", for example, were pioneered - at least in Western philosophy - by Heraclitus, two centuries before even Socrates and Plato).
Its compelling historical, logical, organic, elegant and yet practical nature will convincingly brush aside the OMG's CORBA and its underlying OMA, as well as Microsoft's ActiveX and its underlying COM. (And Sun's Java and its underlying... and its underlying... Wasn't Sun an early pioneer of that interface-based hence so cpu-centric RPC which led the others astray too, including the OSF? And didn't that arise out of the similarly cpu- or process-centric Modular Programming, from which in the mid-Seventies sprouted the interface-circumscribed three-schema architecture of ANSI/X3/SPARC?)
If the Workshop is interested, I could try to portray graphically another image that has guided me in an epistemological way, further to the "two images" of my paper of last year. Like the "Scylla and Charybdis" image of that paper (and find "Scylla" throughout its faq), it too dates back to September 1966. It is also helpful in contrasting MACK with the OMA (and the others), and was to a considerable degree instrumental in the above-mentioned "crucial break with the past".
If the typology is based on abstractions and plain logic, what is the connection with reality? That is the role of the "RE-methods" which import a "Realworld Equivalent" notion, for which I refer you again to the relevant passages in the "true love" document.
With a view to matters raised below, there are further RE-method aspects that are usefully mentioned at this stage, particularly time and space. (As the philosopher Immanuel Kant insisted over two centuries ago, in his Critique of Pure Reason, those two additional concepts are so central to our reasoning that - along with mathematics - they could be elevated to the rare status of "synthetic a priori" concepts.)
Time is much raised in my previous documents (Starting in my OOPSLA'96 BO Workshop paper), mainly in the context of the difference between "naïve realtime" and "relativistic realtime", but also mentioning transformations between one "time" and another.
(That is not as complicated as I may appear to be making out: for example, an OLTP-based accounting application's close-triggered rollup into an OLAP MDDB is partly such a transformation (i.e. "On-Line Transaction Processing" feeding into an "On-Line Analytical Processing" "Multi-Dimensional DataBase", as used for so-called BI or EIS, i.e. "Business Intelligence" or "Executive Information Systems"). And, yes, the MACK DB architecture does cater nicely for MDDB!)
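The OLTP-to-OLAP rollup mentioned in that aside can be reduced to a toy example: individual postings are aggregated into a small multi-dimensional cube keyed by (month, account), which is exactly a coarsening of the time granularity. The field names and the cube's shape are invented for illustration.

```python
# A toy version of the OLTP-to-OLAP "time transformation": rolling
# individual transactions up into a (month, account) cube.

from collections import defaultdict

transactions = [  # OLTP side: individual postings
    {"date": "1997-01-15", "account": "sales", "amount": 100.0},
    {"date": "1997-01-20", "account": "sales", "amount": 50.0},
    {"date": "1997-02-03", "account": "rent",  "amount": 75.0},
]

def rollup(rows):
    """Aggregate postings into an OLAP-style (month, account) cube."""
    cube = defaultdict(float)
    for row in rows:
        month = row["date"][:7]  # coarsen the time granularity to months
        cube[(month, row["account"])] += row["amount"]
    return dict(cube)

print(rollup(transactions))
# {('1997-01', 'sales'): 150.0, ('1997-02', 'rent'): 75.0}
```

The transformation between the two "times" is the `[:7]` truncation: the per-posting timestamp becomes a per-period dimension of the analytical view.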
Thus there is a time dimension that is always part of the metadata of a typology, and there are simply and largely-transparently reusable RE-methods to look after it. That has of course got quite a lot in common with the "temporal database" concept (cf. TSQL, the temporal extensions to SQL). MACK's approach is however far more integrated, and - critically - the time component is attached to an entity - the typology - which has the appropriate application-level granularity.
(In that respect we may note that the Relational DB architecture - with its stress on the table with its finer granularity like that of the classical type or class - commits essentially the same "classical object model" error as the OMG's OMA.)
Space on the other hand does not find formal and non-trivial recognition in familiar business information systems: it is usually "abstracted-out" (though there is an exception very relevant to MACK implementations: windowing systems' graphic coordinate systems, also with their own transformations).
It is however rising strongly with GIS ("Geographic Information Systems"), and as our applications' integration grows, and considering to what high degree geographical information is Common Knowledge and hence a basis for further defining and facilitating semantic interoperation, we may expect to see a lot more of it, and benefit from the opportunity.
There is a handy example of its more obvious practical importance: as we shall touch on again in the "getting down to it" section, there is a strong trend towards defining and "internalizing" business "externalities" with a view to simplifying complex environmental management with the help of some plain accounting. The integration of GIS into some accounting systems is part of it.
We may observe further that both the time dimension and temporal and spatial granularities, together with associated mappings, are key aspects of GIS (though still very inadequately explored and exploited). Hence we may expect MACK's formal and central provision for transformations to help smooth the way ahead in spatial applications too.
Space also provides an added dimension for medical and sociological research, so its proper incorporation will open new opportunities there, as well as in the many disciplines where GIS is already rising.
Thus, whether we look at EATs ("Essential Application Typologies" or basic application components, as we saw above) with their amalgamations and refinements, or at our shared occupancy of a richly-textured time and space, Common Knowledge comes to the fore and will generally expand exponentially as the MACK-compliant market gets underway and CK's exchange with further parties is thereby seeded and nurtured.
We will all be in a natural market mode. "Sharing" tends to be idle, while "supplying" carries and develops both responsibility and a truer interaction. Mere availability in an abundantly-varied world is less valued than relevance. (Having said that, I must add that I expect that certain kinds of freeware and shareware - of many varieties of MACK-compliant information - will blossom beyond all present imaginings.) The application-development market will change its very nature: "All changed, changed utterly." (William Butler Yeats, Easter 1916, on that other revolution-ridden country...)
The market will forever bootstrap itself. Market velocity will be dramatically higher than we are presently obliged to accept, thanks, as ever, to its human participants' greater effectiveness when helped by the fluent recombinability of the typology and its evolutionary testing within the managed market infrastructure. That is much as the speed of evolution of viruses contrasts with that of mammals. And that is not an ominous thought, as, thanks to the evident and growing vigilance of netizens, the market will always be people-driven, according to our own evolving definitions of congeniality.
Physical and human reality rather than information bottlenecks
will become the ultimate constraint
Such realities' ever-mysterious boundaries with the unknown will forever position and shape the eventual plateau of each new market-segment's sigmoid-curve after its newly-"informed" exponential growth phase.
Commercial and political landscapes will be transformed too, and will demand a whole new joint review of "fixed" and "movable". Establishments of all the professional and more informal kinds - as the self-appointed and exploitative guardians of our humanly-meagre mastery of complexity - will be shaken. But that immobile Charybdian figtree will forever remain there, even growing as the boundary between "easy" and "difficult" shifts outwards with the help of the new tools for helping us simplify complexity together. However, that process will ever better help us follow the competent rather than the mere chancers ... and how that will be to the educational benefit of all concerned!
The new dispensation will demand an ever more vigilant democratic management of the market. Naturally, supply will oblige, and much more effectively than it presently does in that domain. (See the "Sixthly" item of my faq q 3 reply.)
Any single user will gradually accumulate and fine-tune a personalized "total product" or set of typologies to mediate his or her own Profoundly Congenial interface with our enormous and ever-growing world of sharable and usable information. At issue is not mere information-filtering, but easily-defined and specific relevance to the user's complex activities, needs and values.
For the medium term I would foresee some such "Personal Assistant" residing on person-unique smart-cards or smart-watches (How about "smatch" for a useful word there, considering also its supply/demand-matching role?), for ubiquitous and virtually instant personalized access to the networked universe. Thanks to extensive and semantically-managed Common Knowledge, it can be small enough, effectively "thick client" though it will be. And it will generally and largely-automatically keep track of relevant progress elsewhere. (Forget such unintegrated and desperate notions as the "mobile agent" as the market better performs its expected function!)
For the longer term we will indeed see the ultimate "thin client" option: a triggering voice-print or other physiometric combination, or - for the shy or fearful or confidently goal-directed - a quick series of keywords or keypunches (or even gestures or thoughts) revealing a freely-sharable though specialized passage into a market niche that is already known to be congenial.
In that longer run, and as already part of the burgeoning SOHO phenomenon, I would agree with the Mainstream expectation that sees the office de-emphasized and occasional meeting-places - often blending work and pleasure - rising in their stead. They will tend to merge with the familiar conference-centre or newly-wired community-centre and even with the home. They will be multi-purpose and "mediated", but configurable as group-congenial mini-marketplaces.
They will not only be for business but also for the ever-sprouting and ever-changing groupings of Civil Society that will proliferate and flourish as we work at filling the gaps and resolving the distortions that our poor data-based abstractions leave in our selves.
However, please note that Metaset/MACK does not involve natural-language understanding (though I am sure that this product and that need will in future be more closely associated in several still-to-be-exploited ways). The present assumption is that, with a view to its sharing, information will be represented according to agreed-upon standards - an evolving ACK.
(I might add nonetheless that the Metaset Boot displays a form of generated natural language output. Very stilted though it will still be at launch, its elaboration and honing will be an important development direction from the outset of the MACK-compliant market.)
As the inventor of the Web, Tim Berners-Lee says it both well and persuasively: "The job of classifying all human output is a never-ending one, and merges with the job of creating it." We may expect the participants in the market to live with the conventions in their own individual interests, as we are always doing anyway. And we readily adapt to new ways if they too are simple enough, as Berners-Lee must be observing, at least to his own so well-earned satisfaction!
BO assumes the market. I am alleging that both have a scope rather greater than the BODTF or at least their published documents dare to spell out. But the all-pervasive market and concomitant globalization threaten us with uniformity in all things except peace and prosperity. "La pensée unique" is becoming a swearword in not insignificant quarters (i.e. "the single ideology" or reductionist "unithinking". It is a much-echoed and at least thought-provoking invention of Ignacio Ramonet, editor of Le Monde Diplomatique, and a welcome variant from the usual "dogmatic liberalism" or "neoliberalism" of other critics).
Since simplicity is MACK's suggested modus operandi, let us first review its benefits and its dangers in general, to be more sure that we are on the right kind of track. (This section is also a recasting of the first half of my 1990 "Ride The Mainstream!" founding document of the Metaset project, which argued the effectively unrestricting universality of the market. See also the Sixthly item in my faq q 3 reply, yet again. Its Fourthly and Fifthly items touched on another positive angle.)
It is the simple manipulability and intermixing of the basic terms of the typology that will really make it all feasible. But does that implicit reductionism not invite an irreparable injustice to our realities? (That is not idle purism: it includes, for example, the issue of "cognitive dissonance" that much preoccupies A&D.)
To start with, we may find courage in some excellent precedents: alphabets and number systems have already shown - as virtually all of us confidently believe - that even greater symbolic simplicity need not preclude an all-encompassing yet satisfactorily-workable elaboration of meanings.
Simplicity need not oversimplify, especially if we constantly grow and renew it. On the contrary, it is essential to our being. We cannot do without it. We can only progress by doing it better. That introduces many further issues, as this many-dimensioned yet still merely illustrative selection of comparably-complementary pairs of concepts shows (and where we may note that neither side has a monopoly of either good or bad, and ambivalences abound):
|Design and Synthesis|Analysis and Decomposition|
|Scylla the many-headed monster|Charybdis the whirlpool, and the figtree|
||Bewilderment, and profiteering|
|"... the worst are full of passionate intensity."|"The best lack all conviction, while ... " (Yeats)|
|Bureaucracies and Establishments|Religion / holiness|
|Quackery and charlatanism|Effectiveness and saintliness|
|Imposition and Uniformity|Discrimination and Diversity|
|Compromise and Reconciliation|Precision and Perseverance|
|Supply|Demand|
On the whole, and speaking crudely, humanity's objectively sharable adventure is first to correct the inevitable but ultimately minor aberrations and invent and develop the meagre tools on the left-hand side. With their help we then seek to resolve the more interesting tensions on the right-hand side, and discover its real depth and exploit the wealth that our poor human artifices so hide.
(For we know how the immobile Charybdian figtree of the Establishments so veils the dynamic whirlpool of our given complexity -- that complexity which produces in us that fearsome yet fascinating and inextricable couple of very worst and very best. And all the while we bear in mind that everything on the right-hand side is poorly mediated by all these and other such mere words...)
One might carry on at length on the many ways in which our approaches to most of our more mundane problems will be made more effective and efficient by this so abstract-seeming exercise. But let's not pursue that angle here, as it is better that we should use the medium of Metaset itself to plant our seeds and help us all express and grow and develop it, in the highly practical and relevant ways that MACK-compliance will nurture. We children need bread, not stones (Huh? Okay, to start with, consider how the Web has shown us the potential of hypertext rather than linear text like this. Then, all Web users know how deadwood or "hyperjunk" or both is often the result. So now carry on:)
That is where relevance comes into the picture. All our activities are more effective and rewarding when more coherent with our own best understanding and expression of all our needs and values. Rather obviously, there is a limit to the degree to which the computer will ever be able to help us there. But equally obviously, with ubiquitous and penetrating computerization further helping us pursue the historical process of discovery, formulation, elaboration and evolution, there should and will be an ever-growing convergence between products and needs, between activities and values.
As an immediate and practical example, the "Beyond the desktop" operational environment described above is not about to supersede the "desktop metaphor" with just another intervening metaphor. It is reality. It will actually be our own part of the communal market vehicle for our own "inner" realization together, ultimately in the most humanly profound way.
That is the big scene of which "Profound Congeniality" is part, addressing not only Business but Everything Else.
And yet, despite that individual-centred view (and "phenomenon of knowledge" axiomatic ontology, in which, in a Cartesian way, the separated subject and object of knowledge are the basic or undefined terms), the market aspect restores the communal balance (and will help us work on that posited holistic or monistic metaphysical ideal that the mystics have always tried to express -- and which I along with many others cannot really grasp, much though the notion could be feasible and is attractive ... much as the misconceptions of it are seductively beguiling of the innocent, as demagogues are quick to exploit ... and much as we must heed Wittgenstein - "Whereof we cannot speak, thereon we must keep silent." - in trying to live out the lesson of history's real mystics).
The emerging number-one problem, so brought to the fore by the Internet, is now no longer how to store or convey or retrieve masses of so-called information from anywhere to anywhere else. The architectural challenge is for us to discover and invent together how we may make communication more relevant, more congenial, to every individual participant, in the broadest sense: economical, accessible and assimilable, but rich and never-endingly rewarding. ("An architect does not tell people how to live, he creates an environment in which people may live their own lives creatively." -- H. Zemanek, past president of IFIP)
Meanwhile, though I am not as yet willing to divulge just how it works (as I have discussed elsewhere), I can reassure you that the essence of relevance is nicely distilled and captured in the very definition of the simple typology concept together with how it is applied to the most complex market. For the rest of this paper, and for the Workshop itself, let us take that as given.
So Metaset as a market vehicle will build on that final pair in the list above, Supply and Demand, helping suppliers make sense of the ultimate democratic reality of demand, that is, "helping people simplify complexity" and thereby addressing the other issues there too. Thereby it will help us collectively to find the appropriate balance whenever the opposites conflict.
That is the way human knowledge evolves - for this is The Mainstream - and Metaset will bootstrap the MACK-compliant market which will help stimulate and test our creativity in evolving it better ... and riding that river in an exhilarating, joyous and fulfilling adventure. (Wow that's great but ...)
Since the market in general, where the ideal is for all Demand to be addressed by total Supply, is the medium for all data-based collaborative activity, in every domain, how do we advance the familiar commercial market "Beyond Business Objects", into the almost-universal medium it could become, at least partially addressing virtually all social activity?
And how, on the way, do we reconcile satisfaction with tension, comfort with challenge, specialization with flexibility and comprehensiveness?
That is, at least, what our architectures and our tools should aim to provide for.
The simple yet databased operating environment: If we are to help people (ourselves and others) face complexity, it is surely essential not to confront users with a complicated medium. (Don't worry, with all my words and polysyllables and never-ending sentences, I do not take you for the typical reader!)
A greatly rationalized operating environment such as that described above is indispensable for a medium accessible to all, if we are to make the most of the IT opportunity in helping us broach the world's real problems effectively.
This Workshop could explore that practical issue. What are those artificial impediments, and what - since this has to be such a Mainstream scene - seem the better ways to alleviate them? Or at least, what do present efforts foresee in a desirable and practically-producible environment, given presently-proposed architectures? (Even though the OMG does not address these issues directly.)
(It would be nice to press the relevant parties for some real answers to that question, seeing that present web browsers with all their applets and their simplistic pull and push publishing styles, for example, do not provide anything much of interest as web page and applet caches start impinging on that DBMS reserve that the tiered architecture notion keeps out-of-bounds, in one vast oversimplification. Or as versioning and friendly and participative change-management seem to be ignored by both pull and push, excellent examples of "naïve realtime" that they are (and no amount of "naïve asynchronicity" due to logon-induced deferrals will make them a systematic "relativistic realtime"). And what a cop-out the Network Computer is, connectivity and band-width and logons being what they are! What opportunities cast aside!)
Workflow is a key theme in that intended harmony. But to what extent is the very concept of "workflow management" merely a dictatorial aberration that can never allow the flexibility and cultivate the responsibility that the more complex work environment increasingly demands of us?
What are the newly-"informed" limits of the specialization implicit in traditional concepts of work? In our "post-modern" world, how should IT advance "Beyond Workflow"?
How do we move towards that balance between routine and discursive modes of both activity and thought?
Clearly the routine mode will always be there somewhere, but it can always - at least in routinizing theory - be fully determined on a rational basis. Increasingly, as computerization pervades all, we should have ever more of the determining factors "on the system" somewhere, at some level of abstraction, and since such issues are thereby regarded as being "clear", it is not a complex task to deduce at least first-pass work-item definition as well as managed workflow from such a basis. (The optimum - on the other hand - is often an intractable problem...)
There are enough "design patterns" actually in use that can be "typologized" and simulated in order to do better. That allows rapid - virtually "on-the-job" - analysis-cum-design of workflow. Its run-time management follows easily, given the MACK approach to transaction definition. The Metaset Boot will show the way in its simple groupware functionality, and will offer a necessary yet tempting foundation for anyone wanting to build further. (See also my earlier statement on Metaset workflow, in my comments on Wolfgang Schulze et al's paper on workflow at last year's Workshop.)
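That "first-pass" deduction can be given a minimal sketch, assuming only that the determining factors are already "on the system" as dependencies between work items (the item names and the dependency structure here are invented for illustration; MACK's own transaction-based definition is not public):

```python
# Hypothetical sketch: a first-pass workflow deduced from recorded
# determining factors, here modelled simply as item -> predecessors.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

factors = {
    "ship order":    {"pick stock", "print invoice"},
    "pick stock":    {"check credit"},
    "print invoice": {"check credit"},
    "check credit":  set(),
}

# A rationally deduced first-pass workflow is just any dependency-respecting
# ordering -- cheap to compute, even though an *optimal* schedule under
# resource constraints is in general an intractable (NP-hard) problem,
# exactly as the text above observes.
first_pass = list(TopologicalSorter(factors).static_order())
print(first_pass)  # "check credit" first, "ship order" last
```

The easy/intractable contrast is the design point: feasibility of a first pass does not require solving the optimization problem, only respecting the recorded factors.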
Models for application Analysis and Design, leading hopefully to compliant methodologies and tools, are another key OMG focus at present. But: Does Design remote from Implementation make the best sense? Should Analysis best be in implementation-neutral terms? Should we aim for CASE without full and smoothly integrated I-CASE? The OMG implicitly accepts a "Yes" to those three questions (and in the light of the historical course and current state of IT, who can blame them?). MACK insists that the answer is "No!" The market is properly the entire process. I-CASE must merge with the market. And with Metaset it does, in a small and sufficiently illustrative way for a market bootstrap.
We may recall that the typology - as a multi-type entity - is effectively a molecule of reusable design. And that such molecules, thanks to Common Knowledge in a market atmosphere, most naturally knit into larger and more relevant wholes.
The EATs introduced above must not be judged as if they were mere sets of conventional class definitions (much though the definition above might superficially imply that). Moreover, they have global properties, both of the intrinsic or systemic hence reflectively determinable kind, and as further added by their suppliers. That means that the market-integrated A&D process is always far more constructive and even spontaneous than we are accustomed to, to the extent of often appearing surprisingly intelligent. That sometimes diffident, sometimes more confident behaviour will become increasingly pervasive as the market gets to work on creating and offering that burgeoning variety of EATs and their fuller-bodied refinements.
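As a purely illustrative data-structure sketch (the real EAT definition is MACK-internal and not published), one can picture an EAT as a small "molecule" of typed roles plus supplier-added global properties, with shared Common Knowledge terms letting two molecules knit into a larger whole. Every name below is invented:

```python
# Hypothetical illustration only: an "EAT" as a molecule of roles drawn
# from Common Knowledge terms, plus supplier-added global properties.

class EAT:
    def __init__(self, name, roles, properties):
        self.name = name                    # supplier's name for the typology
        self.roles = set(roles)             # shared Common Knowledge terms
        self.properties = dict(properties)  # supplier-added global properties

    def knits_with(self, other):
        """Two molecules can knit when they share at least one term."""
        return bool(self.roles & other.roles)

    def knit(self, other):
        """Merge into a larger, more relevant whole, pooling roles and properties."""
        return EAT(f"{self.name}+{other.name}",
                   self.roles | other.roles,
                   {**self.properties, **other.properties})

ordering = EAT("ordering", {"customer", "order", "invoice"},
               {"domain": "commerce"})
logistics = EAT("logistics", {"order", "shipment", "carrier"},
                {"audited": True})

assert ordering.knits_with(logistics)  # they share the term "order"
whole = ordering.knit(logistics)
print(sorted(whole.roles))  # -> ['carrier', 'customer', 'invoice', 'order', 'shipment']
```

The sketch only shows why a shared vocabulary matters: it is the common terms, not any class hierarchy, that let independently-supplied molecules compose.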
More apparently practically, after a little market expansion we may expect to see the mere expression of a use case - or even of minor frustration with an application - attracting relevant typologies, almost like bees to a jam-pot, which will analyse and further prompt and quickly help resolve a need into the terms of a product.
As a result, Supply and Demand will closely mesh. The more ubiquitous and penetrating computing becomes, the less will it be feasible to separate the two aspects. The MACK typology enables the requisite integration by more precisely yet flexibly relating the enduser or enterprise views to the technical or internal views, however semantically near or far apart the situation requires them to be, and of course by maintaining it all within market perspectives. That will practically bring us much closer to that powerful and unifying yet democratic and ultimately more effective and efficient ideal in which supply more closely tracks demand, to the extent that they tend to merge, with demand often even expressing itself in the terms of supply.
That will happen taking ever closer account of the more profound and valuable issues of the kind on the right-hand side in the above table -- those that over-hasty design often slurs (cf. that "cognitive dissonance" so well emphasized in the TRC Inc's BOF Submission). Such expression by demand in terms of supply will be free, less distorting and in any case will take place in a far more market-correctable way. Congeniality will progressively displace alienation.
But let us exercise our collective imagination in some brainstorming way and try to consider now what we would realistically like to see in such an advanced A&D facility. UML - surely the front-running response to the OMG's RFP - is so far from I-CASE as to be of very limited relevance, trying, as it does, to be all things to all analysts past (we charitably overlook, of course, such irrational freaks as round-trip engineering with Visual Basic). As intimated in that "table of terrible twins", analysis and design are the two inseparable sides of one and the same coin. Recognizing as much suggests such a cutting of corners that mere complication will far more easily resolve into effective and reliable simplicity. And what a happy state that remarriage of the long-divorced A&D partners will be!
Kitty Hung and Dilip Patel's paper for this Workshop already very well illustrates some of the ideal themes: the market process combining top-down and bottom-up approaches and building on the very different A&D roles of Business Processes and Business Objects, while - under their "Conclusions" - this extract (like many descriptions of well-conceived and well-led projects) resonates with the effect we may expect to see much-enhanced by the use of MACK-compliant approaches:
As a result, project was delivered on time. User community seemed satisfied with the new application. There had been a complete team spirit between developers and end-users. The change of relationship from supplier/consumer to partnership had made software project more like a business project with developers sharing ownership of the project. The holistic approach of DBOA also enabled developers to understand business better thus bridging the semantic gap between business and I.T. The iterative life cycles allowed the developers to rearchitecture the conceptual model during the entire project phases.
The island enterprise, or unit participant in the organized market, is already caught up with EDI, JIT-enabled and other sub-contracting, core-competencies, value-chains, BPR, rightsizing, globalization and many other information-revolution-wrought changes. The island's defining barriers are being breached all around, and toe-holds on new lands established. Not only because "change is the only constant", we may be sure of an on-going and even accelerated reshaping under the influence of the IT-based market medium.
Enterprise definition and commercial transaction patterns, with the impetus to integration that MACK's semantics-boosted RAD will bring, will need to be applied and adapted accordingly (Though the Metaset Boot will still regard market-transactions comparatively crudely). Robert Haugen's paper is an excellent example of a promising growth-point (That paper - along with Hung/Patel just cited - are the only other ones which, as I write, have already been published on this Workshop web site).
More significant, perhaps, than the well-known business trends just noted, will be the increasing internalization of factors now known as externalities, as is well known in environmental management circles (Though benefits need to be brought in as well as costs...), and as we saw under GIS above.
Already increasingly IS-based, a newly IS-catalysed and IS-empowered Civil Society will ensure and greatly reinforce that trend, while electorates will respond warmly to it too.
Does Business appreciate that this is already looming? Not sufficiently, seeing how "global competitiveness" so dominates public fora. I certainly can't see how it will pan out, but it is certain that the effect will be most appropriate to our shrinking planet with burgeoning consumption and dwindling resources of many kinds, many of whose costs are so conveniently externalized at present.
One reason why that trend is not taking off as it should is of course the added complications that it will bring to business management. Simpler and more comfortably integrative computing will greatly help overcome that obstacle which is at present so well exploited by both the lazy and the manipulative on the "supply-side".
(And that is also a handy example of following Homer's general advice on how to overcome that notorious "resistance to change": we must simplify, but not too much (Accept the risk and "Hug Scylla's rock!" -- as Odysseus was warned on approaching Charybdis), in order to avoid that fearful Charybdian whirlpool that threatens the lazy. Later we must use that Mainstream "mast and keel" method as encapsulated in MACK to escape the deadening though merely artificial privilege-trap with which history ensnares the manipulative Bureaucracies and Establishments of the immobile Charybdian figtree.)
But: The medium is the message, as television inspired Marshall McLuhan to tell us, and so instructively.
By now it is also old-hat and Mainstream stuff that while television is badly supply-driven, the Internet is swinging the market balance in favour of demand-drivenness.
However, with penetrating computerization the market will still remain very IT-supply- hence medium-conditioned. Many of our detailed expectations have yet to catch up with the pace of present and future change. All my experience as an agent for change in business underlines that "We don't know what we want until we know what we can get." So in this "Let's get down to it" section - as in A&D in general for at least a long while yet - we are seriously limited in how usefully we can discuss our IT message in a medium-independent way (cf. the three questions in the first A&D paragraph above, and my answer).
More immediately, therefore, considering the enormous degree to which I evidently believe MACK will influence the market in that McLuhanesque way, and though I have high hopes of much guidance from this Workshop, I have to expect both my "trade-secrecy" and the present lack of a nicely-demonstrable Metaset to impose some unfortunate and significant limitations on it.
So I shall try to play it by ear as regards the compromises I shall need to make between discussion-enhancing and project-threatening disclosure of the still-secret details of MACK. And though my usual tendency is to err on the side of openness, I shall nonetheless be trying hard to control it at this stage still...
A perusal of the OOPSLA'97 program has just pointed me to the "Aspect-Oriented Programming" project at Xerox PARC, led by Gregor Kiczales. AOP has a strong prima facie relevance to MACK: Kiczales was one of the designers of CLOS (Common Lisp Object System), and CLOS was cited in the OMG's original OMA document of November 1990 as a sample realization of the "Generalized Object Model". Now I have made a show of blaming the latter's overthrow in favour of a "Classical Object Model" for the dismal lack of success of the present OMA. So if I am correct in favouring the generalized variety, then AOP should have some features of which I would most likely approve. And how true that turns out to be!
AOP has rationale and characteristics which accord significantly with some of MACK's (as MACK has been sketched above and here in the "true love" document). For example:
But, alas, there are even greater differences:
To conclude, therefore: while it compares strikingly with MACK, AOP is very far behind. That judgment is consistent with the AOP project's own admirably-modest statement: "We think of the current state of AOP research as analogous to that of OOP 20 years ago."
So while AOP is still merely going beyond objects, MACK is there already and Metaset will soon be there too.