
When people attempt to build complex software solutions using a mix of vendors, I call it vendor soup.

A recent example is healthcare.gov. That particular vendor soup was rich with delays, chunky with cost overruns, seasoned with vague requirements, and thickened with a lack of leadership and accountability. The GAO report on the problems with healthcare.gov is a great case study. The soup got made, but was it tasty?

The problem with vendor soup isn’t necessarily the vendors themselves. There are plenty of technology solutions and services that typically don’t make sense to implement from scratch: large-scale cloud computing infrastructures, billing systems, content delivery networks, source code management systems, software development tools, databases, and content management systems are just some examples. Vendors also provide expertise needed on an infrequent basis that may be inefficient to build in-house. I’ve used vendors in the past to help implement hardware prototype designs.

Even companies like Apple, Google, Amazon, and Microsoft use technology vendors because it makes economic and strategic sense; they don’t re-invent the wheel, don’t spend more than they have to, and focus on the things that are core to their businesses. But these companies still have armies of highly skilled software engineers and technical leadership to architect and direct the use of external technology vendors.

Quality software is hard to make. It requires a robust and well-defined architecture, a clear product vision and leadership, well documented code and APIs, consistently high engineering quality, common coding conventions and design principles, friction free communication between functional groups, an efficient and repeatable process for continuous product improvement, and relentless focus on the consumer experience.

Was even a single one of these attributes present with healthcare.gov?

No matter how competent the vendors may be, you can’t create an accountable, coherent, and high-functioning software development organization with vendor soup. Vendor soup is a tempting shortcut that looks great on a PowerPoint presentation to management. But who’s in charge when the countless things that can go wrong so easily in the software world…unstable apps, web sites that crash, user accounts that disappear, confusing user interfaces, links to nowhere, transactions that never go through…do go wrong?

If you’ve managed to cobble together a consumer-facing product relying primarily on vendors, you may be asking yourself: Did I make vendor soup? Obviously, the first hint will be that you have a lot of vendors doing things for you. But that alone does not make vendor soup. You need to ask yourself a few more questions. Does my product scale for future growth? Can I deploy new features and functionality quickly and easily? Are my costs under control? Do I have an efficient, well-defined architecture? Am I generating and protecting relevant intellectual property? Is my product constantly improving? Is the code elegant, robust, maintainable, well documented, well tested, and consistent? Do I have clear accountability? Is my product uniquely differentiated and superior to competitive solutions?

Now, be honest. If you find yourself saying “no” to more than a few of those questions, you have vendor soup. Don’t panic, it’s not the end of the world – you’ve gotten this far and nothing terrible has happened. They’ve fixed healthcare.gov (with help from my former Microsoft colleague Kurt DelBene). And there’s the possibility that for what you need, vendor soup may actually be OK. But at the very least, if you can recognize what you have, you won’t be surprised when vendor soup fails to meet your needs in the future. And if your long-term goals exceed vendor soup’s limitations, you can start planning a different approach.

This is the text of my 5/17/2014 commencement address for the University of Vermont’s Graduate College:

Thank you very much for the introduction.  It’s an honor and a privilege to be speaking to you today.

This is a very special day for me as well because I received my master’s degree from UVM at a similar May commencement 25 years ago.  When I graduated, I did not imagine that I would be returning one day to address a future class of graduates.

I confess that getting ready to speak with you today has posed a real challenge for me.  I’m a perfectionist.  I wanted to find something to say that each one of you would find useful or at least thought provoking.  I wasn’t really sure that I could give the same advice to someone studying historic preservation as someone studying biochemistry or public health, so that goal seemed like a tough engineering problem to me.  Also, my preferred presentation format includes lots of Q&A and interactive dialogue rather than simply talking to an audience.  I did ponder the possibility of trying something a bit different, but I ultimately decided that it may be too soon to innovate with the commencement address format just yet.  And finally, as a UVM graduate, I felt that I had an extra measure of responsibility to this audience given my shared connection to this school and this community.

My path to UVM and to computer science was not a direct one.

My family and I came to the United States as political refugees.  It was the late sixties, and my native Hungary was still behind the Iron Curtain.  In addition to lacking many other basic freedoms, education was highly controlled and censored by the political system in place, and my parents didn’t want to raise their children in such an artificially limiting environment.  We ended up gaining political asylum in the United States, and I had to get busy learning English.  Back then, there were no classes in elementary schools for English language learners.  But I remember learning lots of English watching The Three Stooges and Bugs Bunny cartoons on our small black-and-white TV, and trying to figure out what the characters were saying.

A few years later in the mid-1970’s, the first generation of video game consoles was coming to market, and the first real blockbuster video game was Pong.  For those of you who have never heard of Pong, the game consisted of two electronically simulated paddles that could be moved up or down on the screen with a pair of controllers to try to keep a ball – really, just a crude square – bouncing between them.  If you missed the ball, your opponent got a point.  That was it.  But it was a simple, fun, and intuitive game, and the market was eventually flooded with Pong game consoles hooked up to TVs.  My brother and I received one as a gift at some point, and after the novelty of playing the game finally wore off, I took it apart to find out what made it work.  How did the paddles and the ball get painted on the TV screen?  How did the ball know whether or not it missed the paddle?  I was determined to find out.  Inside the device, there was a printed circuit board with a bunch of components soldered to it, including funny-looking rectangular parts with lots of legs.  As I discovered, these were chips, and the mysterious process that made the Pong machine work involved those chips and digital electronics.

In the process of taking the Pong machine apart, I broke it, and when I put it back together it no longer worked.  But I still wanted to get to the bottom of the mystery of what made the device work.

The late 70’s and early 80’s were a golden age for digital electronics hobbyists.  Technology was still simple enough that you could build your own projects from the ground up.  I learned to solder, and how to design and make my own printed circuit boards for projects that could count up or down, measure temperature, or create sound effects.  With enough trips to the local Radio Shack, it seemed you could build anything.

Eventually, my projects got complicated enough that they required being controlled by a computer to be useful, so I taught myself how to program.  The majority of my early software efforts were simply a way to bring my hardware projects to life.

I went to college and ended up being a physics major.  Middlebury didn’t have a computer science major yet, and besides, computers were still just a side interest of mine.

That changed when I got my first real job after graduating from college.  I answered an ad for a programming job in the “help wanted” section of the newspaper.  It might be hard to imagine now, but answering help wanted ads in the paper was how people actually found jobs back then.  This was the mid-80’s, the PC revolution was just starting to take off, and people who knew how to write software were in high demand.  In my case, the people looking for software help looked past the self-taught nature of much of my knowledge and hired me.  I dove right in, and re-wrote the tiny company’s bread-and-butter product over the course of a number of months.  By now, I was completely hooked.  Not only did I love what I was doing, but I was getting paid for it!  It was also fantastic to be able to thumb through a magazine, point to an ad for the product I was responsible for, and say “I wrote that”.

But I also knew that much of what I was able to do was self-taught, and as valuable as teaching yourself is, it has its limits.  I felt that there would eventually be a gap between what I wanted to do with technology, and the deeper knowledge that more advanced work would require.  I loved the science and the craft of building software, and I wanted to be as good as I could possibly be.  That’s how I finally ended up studying computer science at UVM.

When I graduated with my master’s, I could not imagine everything that would unfold in the computer technology area over the next 25 years.  In 1989, the PC was still emerging as a mainstream product, the Internet was essentially a research project, and so many things that we take for granted today – everything from mobile phones and connected devices to seamless access to information and connectivity – were still in the future, yet to be invented, created, and developed.

My next three jobs after graduation were software programming jobs; I wrote many thousands of lines of code, and I loved programming.  But there came a point when I was asked to become a development lead at Microsoft.  This role entailed management responsibility in addition to continuing to write software.  After some consideration, I agreed, figuring that I could go back to pure software development if the management part of the job became too distracting.

You may know where this is going.  About a year later, I was asked to take on even more responsibility as a development manager.  This meant an end to me writing code.  But it did not mean an end to me being an engineer, and everything I had learned in grad school continued to be incredibly useful, just applied in different ways.

In fact, this was the period of time when I co-founded Xbox.  The Xbox effort started as an unofficial side project that was not approved by senior management.  I was able to formulate an engineering and technology plan, but now as a manager, I was also able to assemble a small team of volunteers within my group to build the prototype software for Xbox.  This working prototype convinced Bill Gates that the idea of creating a console platform using Windows technology was actually feasible.

Later, I led an effort in Microsoft Research developing and patenting new technologies in anticipation of a future boom in mobile computing and touch-based interaction, for product categories that did not yet exist, such as today’s smartphones and tablets.

I also served in general management and architecture roles developing products, product concepts, and designs that were predecessors to modern tablets and e-reader devices.

When I graduated from UVM, I never imagined that I would have a product design portfolio, or patents, or management experience leading teams of hundreds of people.  Much of my work since graduate school may not seem directly related to a computer science degree, but from my perspective, all of it was built on the foundation of engineering that I established here.

The basic principles of my field are still true today.  Sound engineering practices don’t go out of style, and creative problem solving and innovation still look very much as they did when I graduated.

I have a whiteboard in my office, and I use it to map out designs, processes, architectures, and potential solutions in the same basic way as I would have used it 25 years ago even though today I may be solving organizational or business challenges rather than engineering ones.

Trust the foundation you have established here, and your ability to build your future upon it.  Remain open to new possibilities to develop and grow as the future reveals itself to you.  And stay curious about how things work, even if it means that you occasionally take something apart and can’t put it back together, as I did with Pong.

I want to wish you the best of luck, and congratulations on your achievement.

Thank you.

I’ve tried to keep the number of obsolete reference manuals and technical books I have to a minimum over the years.  That stuff has been getting outdated at the same rapid rate as the evolution of the technology industry.  And with on-line references available for all things technology-related, there is almost no need to keep paper copies of anything.

Despite best intentions, however, possessions tend to accumulate, and when we moved from Seattle to New York a few years ago after being in the same house for close to two decades, it was necessary to do some significant culling.  If I had a book or manual that didn’t pass the “will you ever use this again” question, it went into the donation pile.  The Friends of the Seattle Public Library organizes book sales every year to support the library, and this made saying goodbye to about thirty boxes of books our family assembled much easier.  In this process, I did make allowances for sentimental reasons.

One of the exceptions I made was to hang on to my original copies of Borland Turbo Pascal.  Each came on a single 5.25” floppy disk along with a paperback reference manual.  This is a picture of the original 1.0 and 2.0 versions that I’ve kept:

[Photo: the original Turbo Pascal 1.0 and 2.0 packages]

I credit this product as much as any other for taking me down a path that would lead me to become a professional software developer.

I was an undergraduate at Middlebury College when I bought it.  Much of the software development I was doing was self-taught, done on one of the earliest IBM PC clones available – a Sanyo MBC-555.  The Sanyo was not a very good machine and had lots of compatibility problems, but it was the cheapest PC I could convince my parents to buy.

I had reached the limits of what I could do with Basic, and let’s face it – a real program was a compiled, self-contained executable package (a proper “app” for all the young readers out there), not some Basic file that you had to run through a slow interpreter.  Also, I had been involved with assembly-level programming since the beginning of my interest in computers, and wanted a tool that allowed access to BIOS- and hardware-level functionality, even if it meant hand-assembling the opcodes using the 8086 CPU reference manual.
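To give a sense of what that looked like in practice, here is a minimal sketch (mine, written for this post – not from the original manual) of the kind of thing Turbo Pascal made possible: its inline() statement let you embed raw 8086 machine-code bytes, looked up by hand in the CPU reference manual, directly in a Pascal program.

    program HelloBang;
    { A minimal sketch of Turbo Pascal's inline() statement:
      each hex value is a hand-assembled 8086 opcode or operand. }
    begin
      inline(
        $B4/$02/   { MOV AH,02h : DOS function 02h, write character }
        $B2/$21/   { MOV DL,21h : the character to write, '!' }
        $CD/$21    { INT 21h    : call DOS }
      );
    end.

Printing a single “!” this way is absurdly low-level, but that degree of control was exactly what I was after.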

Turbo Pascal would let me do all of this, and at a price that a college student could justify to his parents: $49.95.  This was a bargain compared to the high cost of any of the Microsoft tools available then; Microsoft’s Pascal compiler was $400.  That was a lot of money back in the early 80’s, and a $400 compiler for a student was out of the question.  At the time, I couldn’t have imagined that I would eventually go to work for the very Microsoft that wanted so much money for a software development tool.

I bought Turbo Pascal mail order, sight unseen.  There was no Internet as we know it today, no Amazon, no on-line reviews, and my connectivity consisted of a 300 baud modem (roughly 0.0003 megabits per second).  Everything I knew about the product was contained in a glossy advertisement in Byte Magazine.  I realize how quaint that all sounds, but when I got the package with the small paperback reference manual and the floppy, I was in programming heaven.  The compiler was incredibly fast even by today’s standards, and produced real executable programs, even if they were limited to the smaller .com variant – a single 64 KB segment – rather than .exe files.  And the fact that Middlebury’s math department taught a few Pascal classes (the college did not have a computer science department back then) was a big plus.

I would remain a big Turbo Pascal fan for a number of years, until I fell in love with the C programming language – but that’s another story, one that also involves a thin paperback I’ve kept to this day.