
Who am I

In 1978 I graduated from the local community college with an AA degree in Data Processing. I have met one other person with this degree. This degree qualified me to wire Unit Record Equipment, understand OS/360 JCL, and write RPG II programs on an IBM System/3 mini-computer. Along with those skills I learned Dartmouth BASIC, COBOL 2 and FORTRAN IV on a Xerox Sigma 9. All programs were submitted on "mark sense" cards, and I learned the dubious skill of bench debugging my code using a printout of the submitted card deck.

This degree was obsolete 6 months after graduation, and I never went back to formal schooling. At least I was debt free, and 6 months later I had my first job in "Data Processing". Since then, I have watched all the trends and fads come and go. The current "Cloud Computing" is very reminiscent of utility computing from the '60s and Time Sharing (renting time on mini-computers) from the '70s. All of them were ways to use excess capacity to generate revenue, and they all failed over the issue of who controlled the data. Even virtual machines are not new; IBM had them in the '60s, and DevOps sounds like "testing in production" with rollback.

During that time I learned a lot about developing code. After a stint as a paper changer and tape hanger, I moved on to application developer, which was interesting until I got bored writing the same report with slightly different headings. Then came the fun-filled world of systems administration. A good sysadmin is a lazy person: they automate everything. Soon you have everything automated, and then the environment changes. Then you get to do it all over again. After a while you start to think there's got to be a better way.

Then along comes the monolithic monitoring platform. Which is COOL, as long as you buy into their way of doing things and can afford the cost. You also need a dedicated person to polish the knobs and push the buttons. Let's see, what was the licensing model: by the CPU, by the box, by the number of bytes processed? Hmmm, the software is the same, so why the different cost models? Oh yeah, the data is stored in a proprietary database that you don't have access to, and which incompatible protocol stack are they using to collect and monitor stuff?

So you are back to writing those little scripts to automate that boring stuff. You're in the budget; the monolithic monitoring solution is not.

After a while those little scripts have a need to talk to each other. Nobody has done this yet, so you start to read research papers and try stuff. After a while you have a framework. This framework works. It is distributed: anything can run anywhere and talk with anything else. You go to OSCON and talk to others, and they have done similar things. Soon there is a buzz on the interweb about others who have done similar things. A trend starts to emerge.

So you polish your code and make it available.

Kevin