My personal favorite definition of “Software Engineering” is:
“The systematic application of scientific and technological knowledge, methods, and experience to the design, implementation, testing, and documentation of software” (Systems and software engineering – Vocabulary, ISO/IEC/IEEE Std 24765:2010(E), 2010).
I like this definition because it best illustrates what is often missing from the contemporary practice of software engineering as exercised by lead engineers, architects, and systems designers. That is a blunt and broad assertion, but anyone who has attended software engineering symposia and conferences over the last decade will have noticed an increasing drift away from Computer Science foundations; these events now focus on presenting the latest, ever more complex and abstract architectures and patterns. At such events we are surrounded by talk of “anti-patterns”, “code smells”, “bloaters”, “cruftiness”, “technical debt”, and many other non-scientific concepts and intentionally derogatory terms for any approach not currently in fashion.
We must be careful not to let hype and marketing interfere with a methodical, scientific review of the patterns and practices being promoted. We must not assume that our own lack of understanding is to blame when the assertions and approaches being expounded do not make sense from an objective scientific perspective. Adoption rate is not equivalent to correctness, and it does not remove our professional responsibility to examine the real-world costs and benefits.
A shift toward popularity-driven practices and muddled thinking erodes our focus on applying computer science and engineering to meet business objectives. We must focus on those actual objectives rather than imagining that we must build the perfect system from a purely technical perspective.
If we step back and view the general trends in large software projects from a business perspective, we see a tremendous increase in spending and effort to deliver features to production at a decreasing rate, with only incremental (if any) gains in richness. Most of the features argued as justification for next-generation systems are abstract and/or future promises of increased scalability and flexibility that never materialize.
I believe we see these results because we are locked in a cycle of fixating on the most recent frustrations with previous-generation architectures and code while devaluing their benefits, all while underestimating the cost and added complexity of the new. We build ever more complex and abstract systems in the hope that better application-domain systems will somehow emerge.
But we know it is flawed logic to assume that decomposing a solution domain below the level of the problem domain will yield a system that represents the solution better than one composed within the domain itself. In fact, there is solid science behind the understanding that expressing an algorithm in a language specialized to its problem domain produces more succinct and accurate results [1]. The more abstracted we become from a problem domain, the less efficiently we can express the algorithms required to do the real work, yet we keep generalizing and adding more layers of abstraction. We have become so enamored with our abstracted, layered architectures that we seem to have forgotten that there is ultimately real business-serving functionality to be delivered.
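To make the conceptual-conformity point concrete, here is a small hypothetical sketch (my own illustration, not drawn from the cited study): the same pricing rule written once through a generic rule-engine abstraction and once directly in the problem domain's own vocabulary. The names and the pricing scenario are invented for illustration.

```python
# Generic, layered formulation: the domain logic is buried in the abstraction.
class RuleEngine:
    def __init__(self):
        self.rules = []

    def add_rule(self, predicate, action):
        self.rules.append((predicate, action))

    def apply(self, value, context):
        for predicate, action in self.rules:
            if predicate(context):
                value = action(value)
        return value

engine = RuleEngine()
engine.add_rule(lambda c: c["weekend"], lambda rate: rate * 1.25)
engine.add_rule(lambda c: c["loyalty_member"], lambda rate: rate * 0.90)
generic_rate = engine.apply(100.0, {"weekend": True, "loyalty_member": True})

# The same logic expressed directly in the problem domain's terms:
def room_rate(base, weekend, loyalty_member):
    rate = base
    if weekend:
        rate *= 1.25   # weekend surcharge
    if loyalty_member:
        rate *= 0.90   # loyalty discount
    return rate

domain_rate = room_rate(100.0, weekend=True, loyalty_member=True)
assert generic_rate == domain_rate  # identical behavior, very different clarity
```

Both versions compute the same rate, but the domain-level version states the business rule in a form a domain expert can read and verify, while the generic version asks the reader to mentally reassemble the rule from predicates, actions, and an engine loop.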
There are many visible cases of projects stalling as more layers of abstraction are added to compensate for the deficits of previous abstractions, and of teams that have lost sight of the fact that our job is to meet business objectives with the best possible return on investment. The cumulative cost of these abstraction layers is grossly underestimated. Despite the solid science tying efficiency to conceptual conformity between the medium in which algorithms are expressed and the problem domain, the current software engineering culture trends strongly toward abstraction for the sake of theoretical benefits.
Across the coming posts I will explore some of the computer science underpinnings and practical engineering principles and applications that I believe have fallen from focus in the pursuit of fashionable technical approaches and a desire to work on “cool stuff”.
About Ted Warring
With over thirty years of experience developing software and architecting innovative solutions, Ted brings a wealth of knowledge to Jonas Chorum. As the Vice President of Technology and Chief Scientist, Ted oversees Jonas Chorum’s technology roadmap, the quality of systems and software, and presides over a team of development managers, quality assurance specialists, IT specialists, and software engineers. Prior to joining Jonas Chorum, Ted was a principal in two software companies focusing on software automation for various vertical market industries, and has spent the last four years consulting for several leading hospitality technology companies.
[1] T. Kosar, P. E. Martínez López, P. A. Barrientos, M. Mernik, “A preliminary study on various implementation approaches of domain-specific language,” Information and Software Technology, 2008, Elsevier.