A computer person's strange story

Christopher Spottiswoode
cms@metaset.co.za
 
15 October 1998 (minor updates 27 October)

As explained in the historical section of Part 1 of the MACK paper, some of the relevant background to Metaset/MACK might help lend some concreteness and credibility to what is otherwise a rather abstract and incredible story. It might also help you contribute where I have gaps, and judge where I might best fit into any MACK-related projects in which you might become involved.

This document expands on the opportunities I have had for more obviously practical, rather than philosophical, lessons on software, software architecting and the software business. The story adds to the version in the answer to question 3 of my 1996 paper's faq, with little repetition.

1969: After over four years of physics and pure mathematics at Stellenbosch University, South Africa, and the four years of philosophy etc. at Cambridge, England, as indicated in that faq link and in the present Part 1, I lectured in pure mathematics at Stellenbosch while trying to finish my intended book, The Phenomenon of Knowledge. (It later, in 1986, became Beyond Apartheid.) But it was not yet ripe, so I proceeded with my plan of going into data-processing as a handy and relevant way of learning about real and practical complexity. As I explained it to my future employer's interviewer, it was "to learn about administration", and to a mystified friend (likewise mathematically-trained) on his way to becoming a sociology professor, it was "to study change".

I was appointed as a mathematician, but coincidentally my new employer (conveniently also in Stellenbosch, near Cape Town) just then took over another company which already had an Operations Researcher, so I was luckily able to concentrate on my broader objective.

1970 to 1972: Worked in the IBM mainframe environment, as a trainee programmer, then programmer, later programmer/analyst, on a project that implemented the full batch-mode integrated cycle of sales order processing and accounting, sales-analysis-based demand forecasting and production planning, raw material control, and bill-of-material-based purchasing with production accounting. Languages were RPG, Assembler and PL/1, the latter becoming my own particular favourite and speciality.

1972: My boss, Tom Bezuidenhout, was keen on "data capture at source". Contrasted with its then alternative, centralized data capture, it was a form of demand-empowerment, as Tom saw clearly, and it did seem practical. He heard of Datapoint "intelligent terminals". Datapoint (San Antonio, Texas) had commissioned a CPU chip from a small company called Intel, but didn't like what they got, so they designed their own instead. (As I recall at least Datapoint's version of the story, Intel went on to market the chip as the 8008, the first in what became the series of 8080, 8086, 80186, etc.) Luckily, the intelligent terminal, thanks to Datapoint's excellent architecture, went on to evolve into the world's first commercial desktop minicomputer, and became multi-user in 1973, thanks to its "Datashare" time-shared high-level language interpreter. Meantime, it was only programmable in assembler.

Perhaps not entirely coincidentally, IBM tried to sell us VSPX, their vehicle scheduling package, but for practical turn-around it required capture of orders at source. What better opportunity for IBM to get the company on-line onto a bigger mainframe? But it turned out to be more practical to use off-line Datapoints, even though we had to program them in assembler from the bottom up (like writing decimal arithmetic routines when the CPU could only do 8-bit arithmetic).
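For the curious, here is a minimal C sketch (my reconstruction for this document; the originals were Datapoint assembler) of the kind of routine meant: adding two numbers held as decimal-digit strings, one byte per digit, since the hardware could only add 8-bit binary quantities.

    #include <stdio.h>
    #include <string.h>

    /* Add two non-negative decimal numbers held as ASCII digit strings,
     * one byte per digit -- the sort of routine one writes by hand when
     * the cpu can only do 8-bit binary arithmetic.  The result buffer
     * must hold max(len(a), len(b)) + 2 bytes. */
    static void dec_add(const char *a, const char *b, char *result)
    {
        int i = (int)strlen(a) - 1, j = (int)strlen(b) - 1;
        int carry = 0, k = 0;
        char tmp[64];                       /* digits, least-significant first */

        while (i >= 0 || j >= 0 || carry) {
            int d = carry;
            if (i >= 0) d += a[i--] - '0';
            if (j >= 0) d += b[j--] - '0';
            carry = d / 10;
            tmp[k++] = (char)('0' + d % 10);
        }
        while (k > 0)                       /* reverse into the caller's buffer */
            *result++ = tmp[--k];
        *result = '\0';
    }

    int main(void)
    {
        char sum[64];
        dec_add("94850072", "7149936", sum);
        printf("94850072 + 7149936 = %s\n", sum);   /* prints 102000008 */
        return 0;
    }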

1973 to 1975: In the same organization, newly appointed Manager of the development department, I led the introduction of Datapoint into three very different applications throughout the country-wide group, though not before experiencing the legendary IBM "FUD" policy, whereby they went over your head to top management, spreading Fear, Uncertainty and Doubt about the ability of employees who dared to restrict the use of IBM kit. Fortunately, Tom and his team had good track records by then (while unfortunately for IBM, and despite our best pilot project efforts, VSPX was not practical in our situation).

As hardware/software cost ratios were much higher in South Africa than in the US, we set the hardware such a high application-complexity target that the top Texans, visiting in early 1973, told us we were crazy. All the applications worked well, however (thanks to a fine team, though luckily Moore's Law also helped), and the main competition battled to keep up with the functionality we gave our users and especially our customers.

We had free access to the full assembler source code of virtually all the Datapoint system software: operating systems, assembler, compilers, time-shared interpreters, terminal emulators. It seemed normal to write special comms software and operating system patches, e.g. for making i/o services thread-safe (in the Datapoint sense), or intervening in the dispatcher mechanisms. All most exciting! And "divine programming", often right down to the hardware interfaces. (What an awesome feat, too, by Harry S. Pyle, whose name as the programmer appeared at the top of every single source file! Very memorable...)

Most applications were distributed country-wide, while one had daily-synchronized master files at the centre that also functioned as backups for the branches, as the distributed operations were designed to run without trained operators. That also demanded maximal resilient automation of workflow, which in its turn works best with the help of generic approaches.

I became the main assembler programmer as well as chief distributed application designer and programming leader, and came to see clearly that I was better at those functions than at conventionally managing the department. On the other hand, due to the very nature of the distributed applications, we all worked a lot with the end-users so had to understand them well, with their typical problems but often surprising insights and abilities. It was very gratifying work.

I also undertook quite extensive investigations into DBMS, mainly IBM's DL/1 (the basis of IMS), Univac's DMS, and Cincom's Total. One of our teams implemented a CICS/DL/1 pilot application.

1975: While still in the above job, gave a 10-lecture evening course, entitled Philosophy and Education, at the University of Cape Town's Extra-Mural Studies Department. It was an extension of my ubiquitous Scylla and Charybdis theme, now with extra flavours from computing.

One of many lessons for me: the most surprising people passionately keep looking to the most abstract philosophy for relevance to their daily lives. MACK, as a tool to help us simplify complexity, is directed to that task in all-encompassing scope and detail.

1976: Left the large company to start concentrating on my still vaguely-conceived mission. Worked as a freelance programmer cleaning up a Datapoint site that hadn't been doing so well. Contrasting it with our earlier and successful applications led to the "IDIOM" related in my 1996 faq, whose lengthy saga has played a seminal role in the evolution of MACK, even though the full IDIOM product itself ("No subsets!") never got off the ground.

The basis of IDIOM ("Interpretable Design for Integrated Operation and Management") lay in the definition of the system context of a program. One influence was mainframe JCL ("Job Control Language"), but (so as not to put off ex-mainframers...) I must add that it had a far more fine-grained approach, addressing OLTP as well as Batch. Its core was Boolean-logic-based usage specifications for virtual as well as real resources; crucially, end-user-oriented logical design specifications, via logical chains, could be used to manage run-time resource usage and possible conflict (hence also the acronym).
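To make that a little less abstract, here is a purely hypothetical C sketch (all names invented; it illustrates the principle, not IDIOM's actual specification format): each program declares the resources it will use and how, and two declarations conflict at run time when they touch the same resource and at least one would update it.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical IDIOM-style usage specifications: each program declares
     * (resource, mode) pairs, and a scheduler treats two specifications as
     * conflicting when they name the same resource and at least one of the
     * two would update it. */
    enum mode { READS, WRITES };

    struct usage { const char *resource; enum mode mode; };

    static int conflict(const struct usage *a, int na,
                        const struct usage *b, int nb)
    {
        int i, j;
        for (i = 0; i < na; i++)
            for (j = 0; j < nb; j++)
                if (strcmp(a[i].resource, b[j].resource) == 0 &&
                    (a[i].mode == WRITES || b[j].mode == WRITES))
                    return 1;
        return 0;
    }

    int main(void)
    {
        struct usage order_entry[]  = { { "ORDERS", WRITES }, { "STOCK", READS } };
        struct usage stock_update[] = { { "STOCK",  WRITES } };
        struct usage sales_report[] = { { "ORDERS", READS  } };

        printf("order_entry  vs stock_update: %s\n",
               conflict(order_entry, 2, stock_update, 1) ? "conflict" : "compatible");
        printf("stock_update vs sales_report: %s\n",
               conflict(stock_update, 1, sales_report, 1) ? "conflict" : "compatible");
        return 0;
    }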

1977: Thanks to loud talk about IDIOM (at the 1976 Sicob expo in Paris), and on account of our then-free access to the Datapoint source code as well as our 1975 tests on a new and "fairly compatible" competitor, was called as a witness, for over two days of sworn testimony, in a U.S. lawsuit alleging "theft of trade secrets" and a countersuit alleging "restrictive trade practices". Many useful lessons for me there! (I also told both parties that neither side could win. Over two years later they settled by dropping both suits.)

In Irvine, California, designed and programmed PROBE, a debugger that became one of the standard utilities for Computer Automation's Sybol high-level language.

In Irvine, and later Paris and London, researched and conferenced in DBMS and other IT-architectural issues, in further pursuit of IDIOM's potential, though with no immediately concrete result.

1978: Co-founder (with Robert Gibson) of Synergy Computing, a software house in Cape Town explicitly concentrating on the more "complex" minicomputer-based multi-user applications (as distinct from the "microcomputers" then starting towards their impending ubiquity as "PC"s, whose relative simplicity and higher level of package use left fewer comparative advantages for local development and support enterprises).

We first specialized in Datapoint, where we consolidated our standard IDIOM-rooted techniques for building resilient though fully end-user-driven applications. Later concentrated on the HP3000, starting off in COBOL, and extensively using their excellent "Image" DBMS (derived from Cincom's Total, which in those pre-Relational days was the world's most heavily-used DBMS).

Full 1980s: Synergy Computing was the exclusive distributor in South Africa for several HP3000-based application-development and system tool vendors, including Robelle, VESoft and Cognos. Cognos' flagship product at the time was their PowerHouse 4GL, soon ported to DEC VAX, DG AOS, IBM AS/400 and PC. Cognos later also sold the InterBase RDBMS on their prime platforms. Our main tasks were selling, including frequent benchmarking against competitors, and language training and technical support.

We also sought and had much in-depth involvement with customers in consulting, handholding and actual doing of application-development, DB-design, data-migration, and implementation, in a great variety of organizations and applications. Along the way, we had various salutary experiences with "Big Eight"-style (now "Five"?) high-profile consultants and methodologies.

Many good lessons in real user needs and more technical aspects such as portability, DBMS, application- and data-migration, and the system software and software tool industries. Ample close contact with high-level tool extensibility, or lack thereof, as well as the often limited need for it in the real world, and how tool architectures limit extensibility, both by design and unintentionally.

1983: On an idea of Robert's, was instrumental in the local equivalent of the ACM organizing a one-day "shoot-out" between locally-available 4GLs. The occasion triggered a quantum leap in our ability to sell our product: the local market for 4GLs suddenly became more demand-driven. (Oh, the fickleness of markets, but sometimes, how happily!)

1982 to 1985: Revived IDIOM for the HP3000 platform, but it ground to a halt (again ..., the earlier halts having been at least partly due to hardware platform switches, with some good lessons on portability issues) after it went off on a language-and-compiler tangent rather than keeping to its original "hyper" or linked database design. That was another good lesson, now also firmly applied to Metaset. (Linked-database roots are far more flexible and extensible than HTML/XML's Unix text-stream roots!)

A subset, Synq (for "synchronizer and queue-manager"), was worked on. (Thank you, Bob Green of Robelle Consulting in Vancouver, for the name!) A Synq subset was eventually implemented and sold as "PowerManager" in the PowerHouse environment. That was a precursor of the messaging services now emerging for distributed environments.

1984: As part of the Synq work, wrote a paper (presented in Anaheim, California) which, like this one's Part 1, took Charlie Bachman's "Copernican Revolution" metaphor as a starting-point. It introduced the concept of "relativistic realtime" which demands a "time component of data" and associated "transformations between viewpoints". It was presented in the context of application system design and management, especially as it applies in the multi-department enterprise. In 1986 the paper became Appendix B of my book, as it applied equally to multi-organizational society. Those three "relativistic" notions quoted above are of course still fundamental to MACK, though no longer so novel by themselves.
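A toy C illustration of the "time component of data" idea (my construction for this document, not from the 1984 paper): once every fact carries its time, any viewpoint is simply the state derived "as of" a chosen moment, and transformations between viewpoints become re-derivations at different times.

    #include <stdio.h>
    #include <string.h>

    /* Each fact carries its time component; a viewpoint is the state
     * derived "as of" some moment, so two departments can hold different
     * but mutually transformable views of the same history. */
    struct fact { const char *item; int qty_delta; long t; };

    static int stock_as_of(const struct fact *log, int n,
                           const char *item, long t)
    {
        int i, qty = 0;
        for (i = 0; i < n; i++)
            if (log[i].t <= t && strcmp(log[i].item, item) == 0)
                qty += log[i].qty_delta;
        return qty;
    }

    int main(void)
    {
        struct fact log[] = {
            { "WIDGET", +100, 10 },     /* goods received        */
            { "WIDGET",  -30, 20 },     /* despatched to branch  */
            { "WIDGET",  -25, 30 },     /* sold                  */
        };
        /* Two viewpoints on the same history: */
        printf("stores view    at t=25: %d\n", stock_as_of(log, 3, "WIDGET", 25));
        printf("month-end view at t=35: %d\n", stock_as_of(log, 3, "WIDGET", 35));
        return 0;   /* prints 70, then 45 */
    }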

1985: Decided to cut IDIOM/Synq when my long-standing objective suddenly became more urgent, as anti-apartheid riots and securocrat reactions spread throughout the country. Started my book, Beyond Apartheid. Its Appendix A consisted of the Scylla and Charybdis story in Homer's Odyssey, for easy reference from throughout the book, so the book's title and two appendices nicely indicate its wide-ranging inspirations and motivations, though it was basically a philosophical work. (This seems a good point to mention that in Cambridge I had also found myself on the committee of the "Society for the Application of Research", which at the time was a rather heretical notion in those ivory towers.)

1986: Completed and published the book myself (conventional publishers having been somewhat confused by its unusual political stance...), at the same time rather innocently attempting to initiate action based on the theory that political change more effectively simplifies yet better respects complexity if it is brought about by Civil Society rather than by either political parties or revolutions, and that Civil Society can be greatly fostered by IT and public-access networks.

In its action aspects the book was more than somewhat premature, especially as South Africa's Prestel-protocol-based "Beltel" national dial-up service was still in trial mode. I also still had to appreciate better that my thinking, while it was -- then as now -- clearly largely right, had diverged badly from what people were used to. So the book fell flat as the immediate political influence it had aimed to be. Its core was consequently overlooked too.

Its main aim and very basis, however, is still 100% valid, and is still worth studying in detail if you want to explore the now ever-faster-tightening partnership between philosophy, governance, markets and IT. A lot of its thesis can rightly be called cliché, but its focus on that partnership is, I think, largely unique, then as now. So in that respect the book was very accurately far-seeing, though in a general way.

For a smaller but quite specific confirmation, I can't resist repeating this one:

Whereas the South African political situation was so extremely urgent, large and complex that the wise did not dare predict, I did make one (and only one) major time-quantified prediction in that book (p. 47):

We might expect to see the fastest growth [towards a fuller democracy] over a 3 to 8-year timeframe.

Naïve revolutionaries thought that far too long, for the hard-bitten realists it was far too short. In the event, the book having been completed and published in May 1986, the 3-year mark, May 1989, fell in the middle of FW de Klerk's January-to-September rise to the Presidency, while the country's first fully-democratic elections were held in April 1994 and Nelson Mandela was inaugurated as president in May 1994, at the 8-year mark. So it is difficult to imagine a more precise confirmation of a medium-term political prediction! And it can't all have been luck.

The timeframe prediction was immediately followed by a summary of the mechanisms then foreseen:

The democratic pressure and the information storage and retrieval technology would probably develop in parallel (with supply and demand alternately leading and following).

Fortunately the supply was growing independently of me and Beltel, and it is now well known how the subsequent collapse of apartheid, like the collapse of the Soviet empire which vastly facilitated our own little process, was strongly enabled by various newer electronic media (fax, video, satellite, etc) whose use was burgeoning at the time. So I can squeeze a partial confirmation out of that one!

The above statement from the book was then immediately followed by the qualification:

The reorganization of government would probably lag somewhat behind.

And how very truly -- and extremely urgently -- that state still needs addressing, in South Africa and globally!

1987: Synergy Computing also goes into the Beltel business, buying a VAX with the already-operational "Jutalex" Beltel-based legal information service on it, offered by Juta, one of South Africa's two major legal publishers. We transfer the service to Synergy's own HP3000-based Beltel gateway system, which had been brilliantly designed and implemented by our Stephen Davies (in just one of his amazing design and programming exploits with us since 1980).

We start developing an Entity-Relationship-based Semantic Net design for an intended public groupware system (though the "groupware" term was not yet current, as far as I know), with "soft" or user-extensible semantics.

The design is an amalgam of the original IDIOM plus descendants, our in-depth DBMS experiences, our heavy use of e-mail since 1985 (thanks again to Stephen), some Expert System smatterings I pick up in a three-day hands-on introduction, and my longstanding quest for context.

April 1988: At South Africa's first national Electronic Information conference and exhibition (which we also helped organize), Stephen exhibits the Beltel-accessible prototype application. Supported by a specially-commissioned multi-projector slide-show graphically and powerfully portraying why and how Beltel might promote the multi-group and multi-niched nature of Civil Society and contribute to constructive political progress, I give a paper on the simplifying business of governance as it is already boosted by IT and might most desirably be further boosted by public-access networks.

The paper and especially the show are well acclaimed, but their impact is later very much deflated by the failure of the national telecomm to make good on their unofficial talk of following the French Minitel example by subsidizing the penetration of Beltel into everyday use. The video of the show (of which I have some copies, in case you're interested) is however incorporated into the telecomm's standard Beltel Information Provider course.

1988: Stephen and I further elaborate the prototype for the ER-based groupware plan. (Stephen comments "This looks like hypertext!" That was the first time I'd heard the word...) But Semantic Net hypertext-based groupware is more ambitious even than the present html-based Web (cf. the August 1998 CACM article, Web-Based Development of Complex Information Products, in which the authors demand "links as first-class objects". The Web, with XML, is only now generalizing the link concept. The "meaningless link" is also the basis of my dismissal of html when I first encounter it in 1991.) Based though it was on the clear IDIOM/Total/Image data-structuring lessons, I talk of still vaguely-perceived designs that even Stephen can't see in programmatic terms. Normal commercial work occupies us. After a while, unfortunately though very understandably, Stephen moves on to a company with a clearer immediate and full-time outlet for his talents.

1989: Synergy Computing (though mainly through our Johannesburg operation) becomes the local distributor for Systematica's VSF (Virtual Software Factory) meta-CASE product, with my enthusiastic support in view of the CASE-like aspects of my 1976 IDIOM and its descendants, and even my pursuit in the earlier Seventies of generic template or pattern-based approaches. We make one big sale. But despite IBM's having bought a 30% interest, Systematica later goes bust. That is a good lesson on the typical fate of very high-level software designs insufficiently rooted in clear and efficiently-implementable structures.

1989 to 1990: I move towards the full-time pursuit, funded mainly by me (and family), of the grand scheme already emerging out of the above mists. "Ride The Mainstream!" is its founding document in March 1990. The document has a mainly political face to it (considering the exciting circumstances at the time and my own on-going Civil Society activities), but is basically philosophical (arguing at length that the market potentially encompasses all social activity, and without meaningful restriction, especially when appropriately boosted by IT), though -- that having been the document's purpose -- it does set what is for me the already clear, complexity-oriented, IT direction.

I sell my share in Synergy Computing to my long-time but long-suffering partner, Robert Gibson, and start setting up a new operation completely dedicated to what becomes the Metaset/MACK project.

The explicit application target at the time has the unwieldy description "database-enriched continuous conferencing". Its own repository and self-directed CASE-like aspect are implicit, given its IDIOM and PowerHouse background (the latter having a data-dictionary foundation and largely declarative approach), and its semantic-net-based user-extensibility.

With hindsight, and seeing how existing Internet utilities are still so unintegrated (as commented rhetorically in The Divine Programmer Syndrome section), the target seems extraordinarily ambitious, but it has the basic simplicity of the then already-emerging MACK, and in its user-extensibility in the widest market one can also see how it is implicitly intended to be merely a "seed" or "boot" product, like the present immediate target.

So the project is not nearly as bad an example of notorious "specification creep" as may appear from later developments!

1991 to 1992: Through a lucky coincidence, we kick off with a contract for a product which is also a useful milestone on the way to Metaset: a metadata-based multi-indexing data-retrieval system including full-text field and document inversion with optional phonetic access, fitting between the PowerHouse 4GL and the InterBase RDBMS (as Metaset was also originally envisaged). We call it "Metakey" (like "Metaset", the name is suggested by Guy Bullen-Smith, who has newly joined). The experience brings further lessons: the multi-national customer also has us port Metakey from its Metaset roots on our VAX to their local HP3000 with MPE/iX and their UK-sited HP9000 with HP-UX. The portable version is subsequently developed and maintained on our own HP9000.
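The story does not record which phonetic algorithm Metakey used; the classic Soundex code, sketched below in slightly simplified C (the standard rule that H and W do not separate equal codes is ignored here), shows the general idea behind "phonetic access": names that sound alike reduce to the same key, so "Smith" and "Smyth" land in the same index bucket.

    #include <ctype.h>
    #include <stdio.h>

    /* Map a letter to its Soundex digit; '0' for vowels and H, W, Y. */
    static char sx_code(char c)
    {
        switch (toupper((unsigned char)c)) {
        case 'B': case 'F': case 'P': case 'V':             return '1';
        case 'C': case 'G': case 'J': case 'K': case 'Q':
        case 'S': case 'X': case 'Z':                       return '2';
        case 'D': case 'T':                                 return '3';
        case 'L':                                           return '4';
        case 'M': case 'N':                                 return '5';
        case 'R':                                           return '6';
        default:                                            return '0';
        }
    }

    /* Compute a 4-character Soundex key (assumes a non-empty alphabetic
     * name).  Simplified: any non-coded letter resets the duplicate
     * suppression, whereas standard Soundex excepts H and W. */
    static void soundex(const char *name, char out[5])
    {
        int i, n = 1;
        char prev = sx_code(name[0]);
        out[0] = (char)toupper((unsigned char)name[0]);
        for (i = 1; name[i] != '\0' && n < 4; i++) {
            char c = sx_code(name[i]);
            if (c != '0' && c != prev)
                out[n++] = c;
            prev = c;
        }
        while (n < 4) out[n++] = '0';
        out[4] = '\0';
    }

    int main(void)
    {
        char a[5], b[5];
        soundex("Smith", a);
        soundex("Smyth", b);
        printf("Smith -> %s, Smyth -> %s\n", a, b);   /* both S530 */
        return 0;
    }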

Through the Cognos grapevine (thank you, Nigel Campbell!) we have enquiries for the product from other countries, but we have learnt some good lessons as to the demands such a "sandwich" product makes on training and support for both slices as well as the filling of the sandwich, so we decide not to take it further, and rather to concentrate on the full Metaset.

The Metakey experience, especially in its Version 2 which most advantageously eliminates much internal use of InterBase, greatly encourages us in the view that we would do better to integrate our own DBMS into the Metaset product (PowerHouse had already been squeezed out by the Metaset design's tightening integration into a more coherent full-function tool).

The integrated-DBMS decision is firmly taken in mid-1993, the objective being a DB fully supporting all real IT requirements such as viewability, updatability, navigability, sharability, efficiency, tunability, resilience, extensibility, transformability, etc, though not artificial demands such as SQL support.

(There's another example of what seems a South African tradition of doing things in our own eclectic but coherent and practical self-contained way. (InterBase was even applying political sanctions, or trying to!) There are however some more substantive reasons for dismissing SQL: its "subschema-equivalent" (if I may call it that, seeing its roots), using Views as the context for a query, is far too badly granularized and unintegrated, hence effectively static, while its transactional approach is also too simple-minded, hence bad for workflow. It is a "naïve realtime" concept rather than "relativistic realtime", and even its later move towards TSQL (Temporal SQL) does not systematically make the break. Since IDIOM days in 1976 I had been aiming for predicate locking (though it was only in 1977, in the UCI library, that I learnt that term from Michael Stonebraker's work). And while HP's Image offered it later, it did so in an unintegrated and hence too limited way. In Metakey we were already using InterBase's lower-level "blr" ("Binary Language Representation") rather than Dynamic SQL, precisely for more dynamic locking. In general, the split between the lower two tiers of the 3-tier architecture does not work effectively or efficiently.)
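For readers who have not met predicate locking, here is a deliberately simplified C sketch (predicates reduced to key ranges; real predicate locks are arbitrary Boolean conditions over the data): a transaction locks a description of all the data it cares about, existing or still to be inserted, so even a "phantom" insert is caught, which row-level locking cannot do.

    #include <stdio.h>

    /* A predicate lock, reduced here to a closed key range [lo, hi].
     * Two locks conflict when some record (possibly not yet existing)
     * could satisfy both predicates and at least one holder writes. */
    struct plock { long lo, hi; int writes; };

    static int overlap(const struct plock *a, const struct plock *b)
    {
        return a->lo <= b->hi && b->lo <= a->hi;
    }

    static int conflicts(const struct plock *a, const struct plock *b)
    {
        return overlap(a, b) && (a->writes || b->writes);
    }

    int main(void)
    {
        /* T1 reads a month-end total over accounts 1000-1999.          */
        struct plock t1 = { 1000, 1999, 0 };
        /* T2 inserts account 1500: a row lock could not even see this
         * phantom record, but the predicate lock catches it.           */
        struct plock t2 = { 1500, 1500, 1 };
        /* T3 updates account 2500, outside T1's predicate.             */
        struct plock t3 = { 2500, 2500, 1 };

        printf("T1 vs T2: %s\n", conflicts(&t1, &t2) ? "conflict" : "compatible");
        printf("T1 vs T3: %s\n", conflicts(&t1, &t3) ? "conflict" : "compatible");
        return 0;
    }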

Our semantic net design, as developed since 1987 and evolved through Metakey, since it can model plain logic too, implicitly has an OO aspect to it. So in Feb 1992 I attend a University of Cape Town "Technology Week" course on conventional OO. I purposefully miss the final day, on C++ (Metakey and Metaset are in plain C), as I vaguely sense that its model of OO might sidetrack me from the already-emerging "Metaset-canonical OO". And how true that has turned out to be: the MACK "Beyond OO" stance has since firmed up beautifully!

We do look seriously at the OSF's DCE, and later the OMG's CORBA, but their RPC basis is clearly irrelevant to the transport-level-based (presently just sockets) "semantic packet" distributed object architecture we were developing. (As indeed it is also to Oliver Sims' SSA BOF proposal, and what a sad spectacle it was to see the latter's fine essence being squeezed into a CORBA style and IDL form that were irrelevant to it!) Interestingly, the web, with XML and XMI, is now very similarly aimed, while UML's OCL will surely tend to push UML in that direction too.

August 1991: I give a full day's workshop to two successive groups of schoolchildren, of average age 12 and 11 years (though of above-average ability), entitled Problem-solving in perspective, also as an exercise for myself in the accessibility of very generic problem-solving strategies of the kind that Metaset aims to promote. One of several lessons for me (though mainly from the younger group after I had learnt from the older one...): virtually all welcome the idea and some swim in it, even though I have to present it without the help and patience and individualizable approach of a supportive medium such as Metaset!

1992 to 1994: Parallel to our ever more explicit exploitation of the inherent DBMS and OO capabilities of our semantic net, we also embark on the successor of Jutalex (see 1987), which is intended to support the "Legislative Process" in a fully-participative way. Juta's own chief on-line database designer joins us.

But Metaset's delay (only partly due to Metakey version 2) leaves our legal team in the lurch, so they can only complete a necessary subset, CLM ("Current Legislation Monitor"), which is non-participative and otherwise down-specced. Despite its own qualities (well done, David Clarke, Geraldine Gonçalves, Ingrid Smith and Kroum Kroumov!), the further most notable contributions of Guy Bullen-Smith and Anthony Walker, its user base and the evident market demand, CLM has become too resource-sapping, so, despite the further efforts of Mark Wilson and Chris Newton, we put it on ice and cut the team. (So you can imagine it is quite an effort and a many-sided story.)

Meanwhile, with Guy as chief programmer, Metaset takes some great strides. However, partly because I have to spend a lot of time looking after the money side too (and am otherwise a patchy manager), and also because of my evident urge to pursue the design's high-level alignment, I am not close enough to the evolving technology on the ground, so Guy, like Stephen before him, quits what likewise looms before him as an unprogrammable task.

1995: The money side now provisionally sorted out, CLM distractions behind me, and my nose forcibly returned to the programming grindstone (a challenge which I eagerly take up, as I still have my own firm though high-level vision), the ever-growing market need and the emerging supply start really coming together at last, despite the odds against a programmer who for 10 years (i.e. until Metakey) had done very little 3GL programming, had then learnt C at age 50, and had first looked seriously at any Windows programming only three years later.

Several factors contribute to that. Getting closer to Windows-style event-drivenness is a major one (while also more seriously investigating C++ and MFC, and rejecting them both as fraught with unnecessary complications). Moore's Law again helping, I convert from Guy's beautiful client-server-with-thin-client model, with its evident history, to a fully peer-to-peer basis. I cut off my Internet access so as not to be distracted from my own conceptions.

Chiefly, however, I find it impossible not to aim straight towards "canonical MACK". That is very difficult in practice when the relevant infrastructure and tools are still not only unprogrammed but unspecified in much detail. But it really exercises the mind as well as the design! I eventually manage to start piecing the database together, painstakingly but in a very bootstrapping way, and building up the kernel MACK metadata. The whole design comes together ever faster.

1996: Such concrete steps confirming the foundation and giving confidence in the viability and stability of the vision, I undertake a more serious survey of trends elsewhere, and can hardly miss all the activity in the relevant standards worlds. I draw nothing but encouragement for these extraordinary ambitions. What had started out as merely a self-contained though distributed development and operational environment, sitting on top of basic operating system, transport-level and C facilities but otherwise ignoring what was going on elsewhere, suddenly looks like a suitable basis for an answer to many needs that have already crystallized out elsewhere as standards quests!

I encounter JSOT, Jeff Sutherland's Object Technology web page, with its many key links and already with the Business Object Workshop I (1995) proceedings. I join his obj-tech mailing-list. What I see so excites me that I forget my earlier resolve not to speak out until I launch Metaset. Maybe I can after all get others to help get Metaset/MACK going "sooner and better"?

The rest, so far, is Web history on JSOT, with the up-to-date status of Metaset here.