Language Selection
Selecting the right computer language
Published in ESP, July 2000

The Story of Language
The night, illuminated only by a universe of stars, swallows the sounds of our movements as we ease through the brush. Coming over a rocky rise we spy a flickering light, a beckoning distant campfire whose promise of friendly warmth draws us closer. Though it’s been a long day and a tiring trek, wariness still prevails; we approach these unknown humans downwind and quietly, lest the need to protect their turf turn into a violent confrontation. Closer now, we listen to a language that’s been unheard for millennia, but through the miracle of fiction we somehow comprehend. An old man, perhaps 30, gray, bearded, and sickly, relates a story, in a mixture of song and epic poetry, of the history of their people. He heard it from his father, who received it from other, now dead, generations. The younger folks listen, enraptured by the story. In the manner of youngsters everywhere they little realize that this is more than an interesting tale. Years will pass before they, too, pass this oral history on to their genetic legacies.
It’s 4000 BC. The printing press lies five thousand years in the future. Writing won’t appear for another two millennia. Yet a million years have passed since man became more or less self-aware, capable of learning, a tool-maker who, perhaps in small ways, strives to improve his lot. Though the great explosion of new ideas and revolutionary ways of living lies far off in the future, much has been learned over the generations. Until new technologies come along, humans pass this information along just by talk.
Somehow, though, humankind has learned the most effective way to package information passed this way: the story. We remember little of a dry recitation of facts, but are enraptured by stories, as their elements weave permanent patterns in our minds.
We time travelers press the fast-forward button on the Way Back machine and zoom ahead. Writing appears; some of the earliest written words are the stories formerly passed around the campfire. Jumping ahead once again we hit the information age. Lo and behold, even now the most reproduced book in the world starts with a recitation of many of those campfire stories. Chapter 1: Genesis. We listen to little children reciting tales of a plague that tortured the world 500 years before: “Ring Around The Rosy.”
It appears, despite astonishing new communications methods and channels, new “learning strategies” and developmental fads, that humans learn best from something inalienably simple: stories and storytelling. It’s a fact we neglect at our peril.
This longwinded introduction is my tale of how I was struck by Michael Barr’s article in the March issue of this magazine about selecting appropriate languages for embedded systems development. I started reading the piece with the expectation of a dry but precise engineering approach to language selection. Instead, he told a story, one that showed both triumph and failure, a high-tech version of heroic struggles between the forces of good and evil (if you define “good” as getting a bug-free product out on time!). Beowulf and Barr.
As a vehicle for learning, especially when there probably is no scientific formula to produce an optimal result, the story is king. It’s interesting that after 50 years of computer science we still seem to have no definitive way to select a language, even though that’s the most fundamental choice we’ll make for a project.
Is Ada or Forth the right choice? C or C++? What role should assembly play? I have my own ground rules, though these are honed only through the vicious lessons of bitter experience. Yours may differ. Perhaps, though, I can take a cue from Michael’s approach and share some of my own stories with you. Epic they’re not. To me, though, they’ve proven instructional.

A C++ Crash
A couple of years ago an instrumentation company asked me to visit their site and probe into a project. On the phone the manager sounded in control and confident in the team. The company had a broad range of successful products and, secure in their technology, were working on a new-generation device planned to be the basis of future years of products.
Somewhat reluctantly I agreed to the visit, hoping to learn something from how this manager was able to coordinate a decent-sized team working on quite complex products. The successes claimed on the phone sounded so good, compared to the normal chaos of development! But why did they want to see me?
At the site I found an army of developers working in spacious quarters. Unusually, private offices were the norm, capital equipment budgets were fat, and the manager was a truly enlightened individual dedicated to bringing the best software processes into the firm. It seemed there was little I could contribute other than admiring comments.
After a couple of hours of listening and looking, bits of pain started to leak out. Though past generations of products were indeed successful, something seemed awry with the current project. Forty firmware developers were clearly working hard on this new gadget, but time leaked from the schedule like cheap wine from a street person’s flask. A year into the project they were already six months late, getting later at an ever-increasing rate.
Worse, the system had grown beyond anyone’s expectation in size despite no scope changes. Bugs crawled from every function. An entire group was dedicated to bug fixes, but like a hydra, fixing one bug created three more.
As the day wore on and the manager slowly revealed ever more problems, his façade of success finally broke, and he almost tearfully asked me what I thought had gone wrong.
The answer, if not clear to him, was obvious to me. Not at all because of superior intellect or experience; rather, I drew on the consultant’s secret weapon: theft.
Throughout the day I had had the chance to talk one-on-one with many of the team members. They knew what the problem was. Though some had trouble separating details from meaning, the engineers knew all too well what had happened. I merely stole their thoughts and reformatted them into one idea the manager could understand. Communications problems are at the root of so many troubles, from home life to the office to global political issues.
“The code is junk because the developers are just hacking away,” I told the manager. “No one really understands the language, and so they build things almost randomly in the hope that things will work.”
I went to the doctor once after reading a newspaper article about a disease that seemed to match my minor symptoms. After telling her about the article the doc rolled her eyes and admonished me against playing amateur medical person. In fact, nothing was wrong; my utter lack of knowledge in this field meant I was ripe to believe the overly simplified description of complex issues jammed into three paragraphs of USA Today.
Similarly, the president of this company had read a Wall Street Journal description of the benefits of Object Oriented Programming. The promise seemed too good to be true: OOP meant, or so this techno-novice read, that all code would become reusable. I credit the president with identifying that reusable code could solve many problems. Unhappily, just as with my experience at the doctor’s office, the Journal’s piece couldn’t do justice to the complexities of software development. This layperson learned just enough to be dangerous.
He issued an edict: from now on, we’re doing all projects in C++! He blithely went on with his normal management, sales, and accounting functions, little realizing the terrible implications of such a far-reaching rule.
Is this a rant against C++? Not at all. A year and 100,000 lines of junk code into the project, it became apparent that the rule was bad. The edict meant 40 developers were suddenly plunged into an environment they knew nothing about.
For, it turns out, when this project started only one developer, the one fresh out of college, had even the slightest exposure to C++ and OOP. The other 39 were learning on the job, cranking code and making mistakes on a mission-critical project as they tried to grapple with the very different philosophies of OOP. Working with objects means thinking out the design of a product in a very different way than most C designs demand. C++ is an approach as well as a language. Trying to have the developers build an OOPy design as they learned the concepts was a bit like designing the space shuttle’s orbital dynamics as the engineers learned calculus.
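To make the difference in mindset concrete, here’s a minimal sketch; the sensor example and every name in it are invented for illustration, not taken from that project. The procedural version scatters state across a module for any function to poke at, while the object version starts from the data and hides it:

    // Procedural style: state lives in file-scope variables, and any
    // function in the module may modify it.
    static int g_raw;
    static int g_smoothed;

    void read_sensor(int sample) { g_raw = sample; }
    void filter_sensor()         { g_smoothed = (g_smoothed + g_raw) / 2; }

    // Object style: the design starts from the data and its invariants.
    // State is private; the class publishes only the operations that
    // make sense for a sensor.
    class Sensor {
    public:
        void read(int sample) {
            raw_ = sample;
            smoothed_ = (smoothed_ + raw_) / 2;  // filtering is an internal detail
        }
        int value() const { return smoothed_; }
    private:
        int raw_      = 0;  // hidden: callers can't corrupt it
        int smoothed_ = 0;
    };

Neither fragment is hard by itself. The hard part, the part this team never had a chance to learn, is decomposing a whole instrument into classes like this before writing a line of code.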
The result was a mad mishmash of conventional procedural code mixed with shoehorned objects, a design that exploited the worst of both kinds of programming. And so I recommended they trash the code and start over. In C. All 40 developers knew C; all 40 were experts at building systems using a C framework.
The problem wasn’t the language but an arbitrary edict that neglected a critical part of the success of any embedded system: having expert people. Though the president’s wish for reusability was right on, his rule was as disastrous as telling a newspaper reporter to write in Latin. Without people skilled in the development environment you’re doomed.

And never expect skills to appear magically overnight.
So to this day I hate having told them to toss out the code, though it was the right choice. I made the same recommendation Michael did in his story, which is unhappily the universal refrain of consultants around the world: “Toss out the code. It’s crap. We’ll start over.” Firmware is so terribly expensive! But it’s impossible to retrofit quality into lousy code. Maybe, with enough effort, we can get rid of most of the bugs in a poorly written system, but maintenance becomes impossible. And crummy code always has nasty, lurking bugs that just seem unconquerable.
So the moral of this unhappy tale is that people’s skills are far more important to the success of a project than any technology. No tool, no technology, no “new new thing” will save us from the realities of development woes. Skills first, technology second.

They Never Die
When Intel invented the microprocessor in 1971 I was working as an electronics technician at a small instrumentation company here in Maryland, putting in too many hours to support myself while in college. Within a year of the 4004, the wizards at Intel came out with the 8008, a truly useful microprocessor that, in my opinion, started the true microprocessor age.
The engineering manager at my company recognized the value of this part to our products and started designing a new product around it. The problem: the engineers there didn’t know how to program computers. Why should they? In the early 70s computers were still mostly inaccessible beasts used more by the IT folks than by poor EEs. Somehow they found that I knew assembly language (for a Univac mainframe) and promoted me to the awesome status of engineer. My job was to write the 4K firmware package for the product. Never has so little been known by so many; never has so much on-the-job training happened in such a short period of time.
We eventually did manage to ship a number of these units, which were successful enough that customers demanded ever more features, more than could fit in the 8008’s 16K address space. It’s 1975 now, and the 8080 has hit the market. Through some quirk I’m running the greatly expanded micro group, and we redesign the instrument around this new processor. It too ships.
We designed the product using another Intel invention, the EPROM. They guaranteed data retention of ten years, which of course seemed vanishingly close to infinity to our very young development team.
A decade passes and, as is always the case, individuals disperse even though the company continues. The product is obsolete, but customers still have units working, measuring oil content on polyester fibers in a factory environment. Ten years to the month after the first of these units shipped, they start dropping bits in the field.
The repair crew at the company discover that all of the original object files exist only on the medium of the mid-70s: paper tape. No one has a clue even how to boot up the Intellec 8 “development system,” let alone load the tapes and reprogram the EPROMs. Self-employed now, I’m surprised by a panicked phone call from them. Plumbing the dim recesses of memory, I’m somehow able to remember how to use the tools, reload the now-brittle paper tapes, and reburn the EPROMs.
By the mid-90s I expected another call as another decade sailed by, figuring the memories would once again need refreshing. None came; presumably the instrument had finally been retired. But I learned a critical lesson from this experience: embedded systems never seem to die. They run quietly in the background for years. If we don’t plan ahead for maintenance that may literally span generations, we’re fools. And surely the Y2K fiasco should have taught us the same lesson about software in general.

Selection Criteria
I won’t even try to tell you how to select your next embedded lingo. However, from the lessons above, it’s clear to me that we simply must select one that has been standardized. Prior to C’s ANSI standard, dozens of variants competed for attention. Portability was impossible. In my opinion, C++ has only recently become a viable embedded choice, since its November 1998 ANSI standard. We know that both choices are safe, that a C or C++ programmer ten years from now will be able to work on our ANSI-compliant code.
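A trivial example shows what standardization buys. Pre-standard C++ compilers each shipped their own headers and conventions; code written to the 1998 standard builds on any conforming tool. (The fragment below is mine, for illustration only, not from any product.)

    // Pre-standard C++, circa 1995 -- vendor-specific, and rejected by
    // a conforming compiler today:
    //
    //     #include <iostream.h>         // non-standard header, no namespaces
    //     cout << "starting up" << endl;

    // Standard C++ (ISO/IEC 14882:1998) -- portable across conforming tools:
    #include <iostream>

    int main()
    {
        std::cout << "starting up\n";    // the library now lives in namespace std
        return 0;
    }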
At the risk of offending lots of people, I don’t see Java as a serious embedded contender just yet, unless your system will go away in a year or two. Java is a language in flux, torn asunder by the marketing forces of giants. It’s coupled, for better or worse, to the two biggest forces of change in the universe: the PC and the Internet. I suspect that eventually it will indeed be standardized, and in fact may even supplant C++ due to its simpler, less overwhelming syntax. But for now, be assured that the Java we write today will look very different in a few years.
The exception to standards (because there’s always an exception in embedded work) is assembly language. Fact is, assembly is different for every processor, inherently non-portable, and will always be a tough way to write code. Though I do love it so! Assembly makes sense only for time- or space-constrained apps, or for small parts of a bigger project where speed is truly an issue.
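When assembly is unavoidable, one pattern keeps the damage contained: hide it behind an ordinary function interface, so only a single routine is non-portable. Here’s a sketch of the idea; the saturating-add routine, the build macro, and the file name are all hypothetical, not drawn from any particular project:

    #include <cstdint>

    // Portable reference version: a 16-bit add that clamps instead of wrapping.
    inline std::uint16_t sat_add_portable(std::uint16_t a, std::uint16_t b)
    {
        std::uint32_t sum = std::uint32_t(a) + b;
        return sum > 0xFFFFu ? std::uint16_t(0xFFFFu) : std::uint16_t(sum);
    }

    #ifdef USE_ASM_SAT_ADD
    // The hand-tuned version lives in sat_add.s, written for the target CPU.
    // extern "C" keeps the symbol name predictable for the assembler.
    extern "C" std::uint16_t sat_add_asm(std::uint16_t a, std::uint16_t b);
    inline std::uint16_t sat_add(std::uint16_t a, std::uint16_t b)
    {
        return sat_add_asm(a, b);
    }
    #else
    // The default build uses the portable version; the rest of the program
    // never knows or cares which one it got.
    inline std::uint16_t sat_add(std::uint16_t a, std::uint16_t b)
    {
        return sat_add_portable(a, b);
    }
    #endif

Swap implementations with a single build flag and the other 99 percent of the code stays standard and portable.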
When my son was very young I’d end the evening with a made-up story. Most were silly things meant to elicit a laugh. When traveling I’d often write a short bit of nonsense that his mom would read in my absence. Now, just barely a teenager, he seems to have forgotten all of the rules I’ve so carefully tried to instill… but he does remember these stories, even long after my failing middle-aged brain hasn’t the slimmest recollection of them.
Stories seem to fit the way the human brain works, far better than dreary facts do. Urban legends spread like wildfire as arithmetic skills collapse. And so, another moral for us all is to learn to frame important ideas in the context of stories. For only those will be remembered.