Dan Morena, CTO at Upright.com, made the point that every startup is unique and therefore has to find out what is best for it, ignoring whatever is considered "best practice." I wrote down what he told me:
No army has ever conquered a country. An army conquers this muddy ditch over here, that open wheat field over there, and then the adjoining farm buildings. It conquers that copse of lush oak trees next to the large outcropping of granite rocks. An army seizes that grassy hilltop, it digs in on the west side of this particular fast-flowing river, it gains control over the 12-story gray-and-red-brick downtown office building, fighting room to room. If you are watching from a great distance, you might think that an army has conquered a country, but if you listen to the people who are involved in the struggle, then you are aware how much "a country" is an abstraction. The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses. When a person talks in abstractions, it only shows how little they know. The people who have meaningful information talk about specifics.
Likewise, no one builds a startup. Instead, you build your startup, and your startup is completely unique, and possesses features that no other startup will ever have. Your success will depend on adapting to those attributes that make it unique.
Napoleon and his army would like to have a word with you…
I get the analogy, but I think it can be made a lot better, which would reduce the number of people who dismiss it because they got lost where the wording doesn't make sense. I'm pretty confident most would agree that country A conquered country B if country B was nothing but fire and rubble. It's pretty common usage, actually. Also, there's plenty of examples of countries ruled by militaries. Even the US president is the head of the military. As for "army," it's fairly synonymous with "military," only really diverging in recent usage.
Besides that, the Army Corps of Engineers is well known to build bridges, roads, housing, and all sorts of things. But on the topic of corps, that's part of the hierarchy. For your analogy, a battalion, regiment, company, or platoon may work much better. A platoon or squad might take control of a building. A company might control a hill or river. But it takes a whole army to conquer a country, because it is all these groups working together: even if often disconnected and not in unison, even with infighting and internal conflicts, they rally around the same end goals.
But I'm also not sure this fully aligns with what you say. It's true that the naive only talk at abstract levels, but it's common for experts too. Experts almost always let specifics leak in, though, because their abstractions are derived from a nuanced understanding. We need to talk in both abstractions and details. The necessity for abstraction only grows, but so does the whole pie.
It's a cute analogy, but like all analogies it breaks after inspection. One might try to salvage it by observing that military "best practice" in the field and Best Practice at HQ need not be, and commonly are not, the same, either for reasons of scope or expediency. Moreover, lowercase "practice" tends to win more, and more quickly. E.g., guerrillas tend to win battles quickly against hidebound formal armies.
For a startup, winning "battles, not wars," is what you need, because you have finite resources and have an exit in mind before you burn through them. For a large enterprise, "winning wars not battles" is important because you have big targets on your back (regulators, stock market, litigation).
One might paraphrase the whole shooting match with the ever-pithy statement that premature optimization is the root of all evil.
>> Also, there’s plenty of examples of countries ruled by militaries. Even the US president is the head of the military
Maybe I should have reversed the order of these two. I didn't intend to use the US as an example of a country ruled by a military but rather that military is integral and connected directly to the top.
> If you are watching from a great distance, you might think that an army has conquered a country, but if you listen to the people who are involved in the struggle, then you are aware how much "a country" is an abstraction.
Most things of any value are abstractions. You take a country by persuading everyone you've taken a country. The implementation details of that argument might involve some grassy hilltops, some fields and farm buildings, but it's absolutely not the case that an army needs to control every field and every grassy hilltop that makes up "a country" in order to take it. The abstraction is different from the sum of its specific parts.
If you try to invade a country by invading every concrete bit of it, you'll either fail to take it or have nothing of value at the end (i.e., fail in your objective). The only reason it has ever been useful or even possible to invade countries is because countries are abstractions, and it's the abstraction that is important.
> The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses.
Specifics are important - failing to execute on specifics dooms any steps you might make to help achieve your objective, but if all you see is specifics you won't be able to come up with a coherent objective or choose a path that would stand a chance of getting you there.
The army that is conquering is carrying best-practice weapons, wearing best-practice boots and fatigues, driving best-practice tanks, trucks, etc.
They're doing best-practice aiming, shooting, walking, communicating, hiring (mercs), hiding, etc...
The people that are in the weeds are just doing the most simple things for their personal situation as they're taking over that granite rock or "copse of lush oak trees".
It's easy to use a lot of words to pretend your point has meaning, but often, like KH - it doesn't.
This is frequently not true. There are examples all through history of weaker and poorer armies defeating larger ones: from the Zulus, to the American Revolution, to the great Emu Wars. Surely the birds were not more advanced than men armed with machine guns. But it's only when the smaller forces can take advantage of what they have and leverage it better than others. It's best practices, but what's best is not universal: it's best for whom, best for when, best for under what circumstances.
That doesn't defeat my point: is the smaller/poorer army using best practices?
When all things are the same, the army with more will win.
When all things are not the same, there are little bonuses that can cause the smaller/poorer, malnourished army to win against those with machine guns. Often it's just knowing the territory. Again though, these people are individually making decisions. There isn't some massively smart borg ball sending individual orders to shoot 3 inches to the left to each drone.
> That doesn't defeat my point: is the smaller/poorer army using best practices?
I don't agree, but neither do I disagree. I do think the example is ambiguous enough that it doesn't illustrate the point you intend very well.
> malnourished army to win against those with machine guns
Occasionally something novel and innovative beats the best practice. In that case it usually gradually gets adopted as best practice. More often it doesn't, and falls by the wayside.
I think the word you're looking for is "nation", not "country". A country is the land area and would be conquered in that example, while a nation is the more abstract entity made of the people. It's why it makes sense to talk about countries after the government falls, or nations without a country.
Likewise, people do business with people, not with companies. Assert that “society” is merely an abstraction invoked for political gain to become an individualist.
> people do business with people, not with companies
Many of my interactions are with electronic systems deployed by companies or the state. It's rare that I deal with an actual person a lot of the time (which is sad, but that's another story).
It took me way too long to realize this, but my experience is that "zealots, idiots, and assholes" (as the author says) are going to abuse something and wield it as a bludgeon against other people. This appears to be a fixed, immutable property of the universe.
If you take it as a given that some number of people are going to get an idea lodged in their head, treat it like gospel, and beat as many other people in the head with it as they can... the best strategy you can adopt is to have the ideas in their head be at least somewhat useful.
Yes, reasonable people understand that "best practices" come with all sorts of context and caveats that need to be taken into account. But you won't always be dealing with reasonable people, and if you're dealing with an asshole, zealot, or idiot, I'd sure as hell prefer one who blindly believes in, say, test-first development versus believing that "test code isn't real code, you should spend all of your time writing code that ships to users" or some other even worse nonsense.
In any given culture (software development) you will have a set of traditions that may be used by the members of that culture to signify status, experience, pedigree and authority, and often some combination of all these and other descriptors. And, by virtue of creating a culture, you will also necessarily create a counter-culture, which rejects those things (performatively or otherwise) in an effort to do the exact same thing, but in the other direction. If you decide that proper software is developed with command-line in mind first, and UI second, you will in that effort necessarily create software devs who believe the exact opposite. This is core to our being as humans.
In my mind, this author is merely signaling software counter-culture, some of which I agree with, others I don't. And the people whom you describe above are signaling software culture, in a hostile and domineering way.
And of course, these two sides are not impermeable, forever persistent: culture and counter-culture shift constantly, often outright reversing from one another on a roughly 30 year timeline. And, they both have important things to offer. Best practices are best practices for a reason, and also telling stuffy people to chill out when they're focused so hard on best practices that they lose the plot of what the org is actually attempting to accomplish is also important.
The one thing that gives me pause, is that I have seen stages of mastery where the base stage is repetition and adherence to rules to internalize them before understanding them and knowing when to break them.
If much of our industry is new, evangelizing these rules as harder and faster than they are makes a lot of sense to get people ready for the next stage. Then they learn the context and caveats over time.
That made me want to look up some link about Shu Ha Ri. Turns out that's actually been made popular in some corners of sw dev already. E.g. https://martinfowler.com/bliki/ShuHaRi.html
I think the rejection is too strong in this article. The idea of "best practices" comes from an established Body of Knowledge. There is one published for software development called the SoftWare Engineering Body of Knowledge, or SWEBOK, published by the IEEE.
The author seems to be arguing for nuance: that these "laws" require context and shouldn't be applied blindly. I agree.
However, they shouldn't be rejected out of hand either, and people recommending them aren't idiots.
Update: one problem with "best practices" that I think the article might have unwittingly implied is that most software developers aren't aware of SWEBOK and are repeating maxims and aphorisms they heard from others. Software development is often powered by folklore and hand-waving.
I think it is best to strongly reject the idea "best practices will always benefit you".
Most best practices that I have been told about were low local maxima at best, and very harmful at worst.
If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
It might still be a good idea, but you shouldn't seriously consider it until you hear an actually convincing reason (not a "just so" explanation that skips several steps).
> It might still be a good idea, but you shouldn't seriously consider it until you hear an actually convincing reason (not a "just so" explanation that skips several steps).
If everyone follows that then every decision will be bikeshedded to death. I think part of the point of the concept of "best practices" is that some ideas should be at least somewhat entrenched, followed by default, and not overturned without good reason.
Ideally your records of best practices would include a rationale and scope for when they should be reexamined. But trying to reason everything out from first principles doesn't work great either.
Well calling something a bikeshed is implicitly claiming that it's not so important. Often the specific choice is not very important, but making a choice rather than not making one is important. And while an effective organisation would not allow important decisionmaking to get derailed, many organisations are ineffective.
> Most best practices that I have been told about were low local maxima at best, and very harmful at worst.
This matches my experience, though sometimes they indeed will be helpful, at least after some consideration.
> If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
In certain environments this will get you labeled someone who doesn't want to create quality software, because obviously best practices will lead to good code and not wanting to follow those practices or questioning them means that you don't have enough experience or something. Ergo, you should just apply SOLID and DRY everywhere, even if it becomes more or less a cargo cult. Not that I agree with the idea, but that way of thinking is prevalent.
I definitely sympathize with the thrust of the article. I think the reality is somewhere in the middle: best practices are useful shortcuts, and people aren't always idiots for suggesting them. I've worked with folks who insist on Postel's law despite security research in recent years that suggests parsers should be strict to prevent langsec attacks, for example. In those cases I would argue against leniency...
Although I also do work in fintech and well... card payment systems are messy. The legal framework covers liability for when actors send bad data but your system still has to parse/process/accept those messages. So you need some leniency.
It does drive me up the wall sometimes when people will hand-wave away details and cite maxims or best-practices... but those are usually situations where the details matter a great deal: security, safety, liveness, etc. People generally have the best intentions in these scenarios and I don't fault them for having different experience/knowledge/wisdom that lead them to different conclusions than I do. They're not idiots for suggesting best practices... it's just a nuisance.
That's what I mean about the rejection being too strong. It should be considered that best practices are often useful and helpful. We don't have to re-develop our intuitions from first principles on every project. It would be tedious to do so. But a healthy dose of skepticism should be used... especially when it comes to Postel's Law which has some decent research to suggest avoiding it.
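To make that tension concrete, here is a minimal sketch in C. The "KEY: VALUE" line format and both parser functions are hypothetical, not from any real protocol stack; the point is that each tolerance in the lenient version widens the set of inputs the rest of the system has to reason about, which is exactly the langsec complaint.

```c
/*
 * Hypothetical "KEY: VALUE" parsers (fixed-size buffers, demo only):
 * one Postel-style lenient, one langsec-style strict.
 */
#include <ctype.h>
#include <stdio.h>
#include <string.h>

/* Postel-style: be liberal in what you accept. */
static int parse_lenient(const char *line, char *key, char *val) {
    while (isspace((unsigned char)*line)) line++;   /* tolerate leading junk */
    const char *colon = strchr(line, ':');
    if (!colon) return -1;
    size_t klen = (size_t)(colon - line);
    memcpy(key, line, klen);
    key[klen] = '\0';
    colon++;
    while (isspace((unsigned char)*colon)) colon++; /* tolerate any spacing */
    strcpy(val, colon);
    return 0;
}

/* Langsec-style: accept exactly "KEY: value" and nothing else. */
static int parse_strict(const char *line, char *key, char *val) {
    const char *colon = strchr(line, ':');
    if (!colon || colon == line || colon[1] != ' ') return -1;
    for (const char *p = line; p < colon; p++)
        if (!isupper((unsigned char)*p)) return -1; /* upper-case keys only */
    size_t klen = (size_t)(colon - line);
    memcpy(key, line, klen);
    key[klen] = '\0';
    strcpy(val, colon + 2);
    return 0;
}

int main(void) {
    char key[64], val[64];
    const char *sloppy = "  HOST:example.com";                /* sloppy sender */
    printf("lenient: %d\n", parse_lenient(sloppy, key, val)); /* 0: accepted */
    printf("strict:  %d\n", parse_strict(sloppy, key, val));  /* -1: rejected */
    return 0;
}
```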
I don't think anyone has ever thought that best practices will always benefit you. Nothing always works every single time in every single case.
This whole thing is really silly and obvious.
Of course you shouldn't blindly follow advice without thinking. But not following advice just because it might not always be right is also a bad idea.
My advice: In general, you should follow good advice from experienced people. If enough experts say this is the best way to do something, you should probably do that, most of the time.
But that advice will never trend on HN because it isn't clickbait or extreme, and requires using your noggin.
> I don't think anyone has ever thought that best practices will always benefit you.
Whenever a "best practice" or "convention" has been presented to me, that is how it has been framed. (...it is best practice, therefore, it will definitely benefit you to follow it)
I do not know what context this happened to you in, but in the context of building something quickly and learning, while not being an expert in an area, best practices are a common crutch.
In many workplaces, either they do not have the time, or at least think they do not have the time, to think things through 100% for themselves from first principles, so they depend on best practices instead.
That makes sense to me and I would expect better results on average with using best practices than rejection of best practices in the above context.
That said I try to work on things where I am not always in the above context, where thinking things through end to end provides a competitive advantage.
100%… best practices in other traditional engineering disciplines help us work within the state of the art. They're the accumulated wisdom and experience of the engineers that came before us.
There are plenty of them that help us write concurrent code that avoids common deadlock situations without having to resort to writing proofs every time. Someone already did the work and condensed it down into a rule to follow. Even if you don’t understand the underlying proof you can follow the rule and hope that everything will shake out.
What I find we struggle most with is knowing when we actually need to write the proof. Sometimes we bias ourselves towards best practices and intuition when working it out formally would be more prudent.
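As one example, a widely repeated rule of that kind is "always acquire locks in a fixed global order." A minimal pthreads sketch, with a hypothetical account type, of what following the condensed rule looks like without re-deriving the deadlock argument each time:

```c
/*
 * "Take locks in a fixed global order" as a followable rule.
 * Two opposite transfers can never deadlock, because both threads
 * acquire the lower-id lock first.
 */
#include <pthread.h>

typedef struct {
    pthread_mutex_t lock;
    long balance;
    int id; /* unique id defines the global lock order */
} account_t;

void transfer(account_t *from, account_t *to, long amount) {
    account_t *first  = (from->id < to->id) ? from : to;
    account_t *second = (from->id < to->id) ? to : from;

    pthread_mutex_lock(&first->lock);   /* lower id first, always */
    pthread_mutex_lock(&second->lock);
    from->balance -= amount;
    to->balance   += amount;
    pthread_mutex_unlock(&second->lock);
    pthread_mutex_unlock(&first->lock);
}
```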
SWEBOK seems the opposite of that. A body of knowledge is not at all the same thing as a best practice. The only unapologetic best practice in SWEBOK is that professionals should be familiar with every topic in SWEBOK. Definitely not that you _should_ do everything in the book.
The book is quite sophisticated in this. It explicitly separates the technical knowledge from the judgments of which, when, and where to apply it. Most occurrences of "best practices" in the text are quoted, are references to other works, and describe the need to choose between different best-practice libraries depending on context. Others are part of a meta-conversation about the role of standards in engineering. Very little of SWEBOK is promoted as a "best practice" in itself.
Here's a quote from SWEBOK v4, 12-5
> Foremost, software engineers should know the key software engineering standards that apply to their specific industry. As Iberle discussed [19], the practices software engineers use vary greatly depending on the industry, business model and organizational culture where they work.
In my view best practices emerge from a body of knowledge (or sometimes from the practice and wisdom of others that haven't been documented/accepted/etc yet) and are "shortcuts."
I'm not defending Postel's Law; I agree that, after years of practice and research, it leads to security issues and surprises.
However, the point is that these kinds of principles don't arise out of people's heads and become accepted wisdom for nothing; they're usually built off of an implied (or explicit) body of knowledge.
Sure. Best practices develop by choosing practices to match your context out of a defined body of knowledge.
But SWEBOK is very clear that "best practices" are context specific - they are radically different forces and solutions in video games as compared to chemical engineering control systems. There's no such thing as a "best practice" absent a context. The footnotes in SWEBOK point off in a million directions saying "go look over there for best practices for YOUR context".
To put it simply, best practices are, at best, context-dependent. Best practices for avionics software are not the same as best practices for a CRUD form on your mailing-list signup page.
And to be fair, the best practices for designing a bridge or a skyscraper are not the same ones for designing a doghouse.
This! "Best practice" depends on the circumstances. Are "micro services" a best practice? What about "monolithic architecture"? Those choices are not best practices in and of themselves but may be best practices when considering organization/dev team size, application user count, computational demands on the application/system, etc. What are the goals and vision of the future? Let's future-proof and pre-optimize for problems we don't currently have nor will likely have! (And don't get me started on the number of folks that dream about "we're going to need to be able to scale!" for a fairly simple CRUD app that will most likely be used by hundreads, maybe thousands, or users and realistically need 100's of "simple" requests per second (most likely per minute... )
> In 2016, the IEEE Computer Society kicked off the SWEBOK Evolution effort to develop future iterations of the body of knowledge. The SWEBOK Evolution project resulted in the publication of SWEBOK Guide version 4 in October 2024.
So the thing called "SWEBOK Guide" is actually the reference text for SWEBOK.
The books that encode some standardized Xbok are always named "The guide to the Xbok".
The actual BOK isn't supposed to have a concrete representation. It's not supposed to be standardized either, but standard organizations always ignore that part.
Well, literally the "state," as in what is the knowledge that everybody shares. We usually call that something closer to the "minimum common denominator."
What people usually call "state of the art" is the best knowledge that is reasonably well known. That is out of scope. If you take a look at this one, it's full of stuff that we knew not to use in the 20th century. This is typical.
> but because they’re mostly pounded by either 1) various types of zealots, idiots, and assholes who abuse these kind of “best practices” as an argument from authority, or 2) inexperienced programmers who lack the ability to judge the applicability,
The author might go on to make other points that are worth discussing, but lays out his supporting arguments clearly in the opening paragraph. Best practices do not necessarily do harm because they offer bad advice, they do harm because they are advocated for by zealots and the inexperienced.
My first reaction is how unfortunate it is that this particular developer has found himself in the presence of bad engineers and the inexperienced.
But then, the argument is automatically self-defeating. Why is the rest of the article even worth reading, if he states upfront what his arguments are and those arguments are very easy to refute?
It is deeply irrational to judge the merits of an idea based solely on who is advocating for that idea.
My advice to the author is to reflect on the types of positions that he accepts, the ones that have him so put off by the people that he works with that he is openly writing about abandoning what he admits could be sound engineering practice, solely based on who that advice is coming from and how it is being delivered.
Developing software is complicated. It is constant problem solving. When solutions to problems come about, and we abstract those solutions, it is quite easy for individuals to misapply the abstraction to an inappropriate concrete. To drop context and try to retrofit a lousy solution because that solution was appropriate to a slightly different problem. But at the end of the day, these abstractions exist to try and simplify the process. Any time you see a "best practice" or design pattern acting as a complicating force, it is not doing its job. At that point you can either be objective and exercise some professional curiosity in order to try and understand why the solution adopted is inappropriate ... or you can take the lazy way out and just assume that "best practices" are the opinions of zealots and the inexperienced who blindly follow because they don't know any better.
> Best practices do not necessarily do harm because they offer bad advice, they do harm because they are advocated for by zealots and the inexperienced.
I think the point is that blindly suggesting "best practices" often is bad advice.
It's a common form of bikeshedding—it allows someone to give their casual two cents without doing the hard work of thinking through the tradeoffs.
We don't have to give undue value to "best practices," nor do we need to judge an idea based on its presenter. We just need to have a reasonable discussion about the idea in the context in which it's being applied. This simple approach has been largely eclipsed in the industry by the fetishization of tools, and the absurd notion that whole classes of approaches can be dismissed as "antipatterns."
It's not very hard to weigh a suggestion: speculate about its costs, benefits, and risks.
I think the problem is that a lot of computer nerds are a bit OCD and like to apply one solution to everything. You see this in how they get crazy about strict typing versus loose typing, one particular language for every application, or spaces vs. tabs. I was like that when I was younger, but as I got older I realized the world is a complex place, and since programs have to deal with the real world, there is no one-solution-fits-all, or best practice that always applies. To become good at programming you need to be adaptable to the problem space. Best practices are great for juniors; once you've got a bit of experience, you should use that instead.
My job revolves around adopting software solutions to solve practical problems, and let me tell you, this mentality of one solution for everything goes beyond just the coding. I've encountered countless developers who seem to believe that the reality of a business should conform itself to how the developer believes the business/industry should operate.
Funny, my experience is the opposite. When I was younger I thought there was a time and place for everything, a right tool for the job, a need to carefully consider the circumstances. As I got older I realised that actually a lot of libraries, languages, and communities are simply bad, and an experienced programmer is better served by having a deep knowledge of a handful of good tools and applying them to everything.
The problem is that a lot of true things in the world are counter-intuitive. So insisting that all the rules "make sense" in an immediate way is clearly a non-starter. In the safety industry there are many examples of best practices that are bred from experience but end up being counter-intuitive to some. For instance, it might not make intuitive sense that a pilot who has gone through a take-off procedure thousands of times needs a checklist to remember all the steps, but we know that it actually helps.
It's hard because there is usually some information loss in summarisation, but we also have limited memory, so we can't really expect people to remember every case study that led to the distilled advice.
As a chemical engineer by training, though, I have constantly been amazed at how resistant software people are to the idea that their industry could benefit from the kind of standardisation that has improved my industry so much.
It will never happen outside of limited industries because it would appear to be a loss of "freedom." I think the current situation creates an illusory, anarchist freedom of informality that sometimes leads to proprietary lock-in, vulnerabilities, bugs, incompatibility churn, poorly-prioritized feature development, and a tyranny of chaos and tech debt.
There are too many languages, too many tools, too many (conflicting) conventions (especially ones designed by committee), and too many options.
Having systems, tools, and components that rarely change with respect to compatibility, and that are formally verifiable far beyond the rigor of seL4 so that they are (basically) without (implementation) error, would be valuable compared to tools that lack even basic testing or self-tests and lack digital signatures that would prove chain of custody. Being able to model and prove a program or library so that its behavior can be checked deeply, by both "whitebox" and "blackbox" methods, for correctness would prove that some code can stand the test of time. Choosing a smaller number of standard languages, tools, and components makes it cheaper and easier to attempt all of this.
Maybe in 100 years, out of necessity, there will be essentially 1 programming language that dominates all others (power law distribution) for humans, and it will be some sort of formal behavioral model specification language that an LLM will generate tests and machine code to implement, manage, and test against.
I disagree slightly here. There may be one (1) dominant formal language that's used as the glue code that gets run on machines and verified, but it will have numerous front-end languages that compile into it, for ease of typing and abstraction/domain fit.
Who drove that standardization in chemical engineering?
I ask, because the intra-organizational dynamics of software have been ugly for standardization. Vendor lock-in, submarine patents, embrace-and-extend, etc. have meant naive adoption of "best practices" meant a one-way ticket to an expensive, obsolete system, with an eventually insolvent vendor.
That's an interesting question. I guess it's partly the fact that chemical industry is very large-scale, often with one company in charge (think Shell or Total). The realities of having one organisation in charge of many large operations across many countries probably gives higher reward on standardisation. This is a bit like coding to "Google style guidelines" or whatever. The big organisation has more incentive to fund standardisation, but the small people can benefit from that effort, too.
The magnitude of impact also means that many industrial plants fall under government regulation, and in the safety field specifically there is a lot of knowledge sharing.
I think there is also a component about the inflexibility of real matter that factors into this. It's much harder to attach two incorrectly sized pipes together than it is to write a software shim, so the standardisation of pipe sizes gets pushed up to the original manufacturers, where it also happens to be more economical to produce lots of exact copies than individually crafted parts.
The advantage of best practices is that you have something you can follow without having to analyze the situation in depth. The disadvantage of best practices is that you may have to analyze the situation in depth to notice that they maybe aren’t the best choice in the specific situation. The harm that best practices can do are lessened by viewing them as a rule of thumb conditioned on certain premises rather than as a dogma.
“Dogma” is the key word in this situation, I believe (and in a lot of similar situations). There are very few examples for when dogmatic obedience is healthy, helpful, or appropriate. Sadly, the trends seem to be heading the wrong way, with more tribalism than pragmatism.
I like to use the term "golden path" instead of best practices.
In a golden path, lots of others have gone before you and figured out all the nuance. This doesn't mean the path is the best one for you, but it does mean you should have a good reason for straying from it.
Maybe the problem is the word Best. Perhaps Standard Practice is better — suggests "the usual way we do things" without presuming it is the best and only way to do things.
How do other engineering industries deal with this phenomenon? Why do those approaches not work for programming? I feel silly sometimes, because software development is a huge industry and we don't have consensus on basics.
For example I think that strict formatting is a good thing. Since I tried to use Prettier I'm using it and similar tools everywhere and I like it. I can't do vertical alignment anymore, it eats empty lines sometimes, but that's a good compromise.
May be there should be a good compromise when it comes to "best practices"? Like "DRY" is not always best, but it's always good enough, so extract common stuff every time, even if you feel it's not worth it.
I often deal with this dilemma when writing Java with default IDEA inspections. They highlight duplicated code, and now I need to either disable the inspection in some way or extract the chunk of code that I don't really think should be extracted; but I can just do it and move on...
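To make the dilemma concrete, here is a sketch of the kind of duplication such an inspection flags (in C rather than Java, and with hypothetical names):

```c
/*
 * Two handlers that duplicate a validation chunk a duplicate-code
 * inspection would flag.
 */
#include <stdio.h>
#include <string.h>

typedef struct { char name[32]; char email[64]; } user_t;

static void handle_create(const user_t *u) {
    if (u->name[0] == '\0') { fprintf(stderr, "missing name\n"); return; }
    if (!strchr(u->email, '@')) { fprintf(stderr, "bad email\n"); return; }
    printf("created %s\n", u->name);
}

static void handle_update(const user_t *u) {
    if (u->name[0] == '\0') { fprintf(stderr, "missing name\n"); return; }
    if (!strchr(u->email, '@')) { fprintf(stderr, "bad email\n"); return; }
    printf("updated %s\n", u->name);
}

/*
 * Extracting the two checks into a validate() helper silences the
 * inspection, and is probably fine. The cost is coupling: if updates
 * later tolerate a blank name, the helper grows a flag parameter and
 * ends up worse than the duplication it replaced.
 */

int main(void) {
    user_t u = { "Ada", "ada@example.com" };
    handle_create(&u);
    handle_update(&u);
    return 0;
}
```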
Those approaches do work with programming, but they don't make use of what makes programming different from other disciplines.
Software is usually quick to write, update and deploy. And errors usually have pretty low impact. Sure, your website may be down for a day and people will get grumpy, but you can hack together a quick fix and have it online with the push of a button.
Compare that to, say, electrical engineering, where there's often a long time between finishing a design and getting a manufactured prototype (let alone mass production.) And a fault could mean damage to equipment (or people) and the cost of having to replace everything. So you'll find that there's a lot more work done up-front and the general way of working tends to be more cautious.
There's also the idea of best practices as a form of communication. This also helps for programmers, as code that looks and acts the way you expect it is easier to follow. But code is primarily shared with other programmers. Other engineering disciplines (more) frequently need to collaborate with people from other domains. For example, a civil engineer's work could be shared with architects, government bureaucrats and construction managers, and best practices often provide a common familiar standard.
Compared to other engineering disciplines, software is a big unorganized mess. But it's also incredibly fast and cheap to make because of that.
You can destroy rockets, lethally irradiate people, fly planes upside down, or financially ruin a company because of software bugs, so avoiding faults can be critical for software as well.
It is just that high-velocity low-reliability web and consumer application development is a very large niche. A lot of our best-practices are about attempting to maintain high velocity (often with questionable results), more than increasing reliability.
> It is just that high-velocity low-reliability web and consumer application development is a very large niche
And most of them have no care about the user experience of the end user at all.
Almost every piece of software I have to interact with on a daily basis is absolute garbage, just full of frustrating bugs that make most of my day, when I'm forced to use a computer, absolutely miserable. To each of the devs it's just a small annoyance in their particular app, but to the end user it's one more annoyance in a death by a thousand cuts.
> How other engineering industries deal with this phenomena? Why those approach do not work with programming?
A lot of engineering discipline is a way to prevent engineered works from causing unintentional injury, physical or fiscal.
Most software development is far away from physical injury. And fiscal injury from software failure is rarely assigned to any party.
There's no feedback loop to push us to standardized process to cover our asses; we'd all prefer to do things our own way. It's also pretty hard to do convincing studies to determine which methods are better. Few people are convinced by any of the studies; and there's not been a lot of company X dominates the industry because of practice Y kinds of things, like you see with say Toyota's quality practices in the 80s and 90s.
Other engineering disciplines have certification, codes and regulations for specific domains, which are enforced by law.
DRY is a perfect example though of something which in moderation is a good idea but as the article says is vulnerable to ‘inexperienced programmers who lack the ability to judge the applicability’ and if over-eagerly applied leads to over-abstraction and premature abstraction which does more harm than good.
Before regulations, other engineering disciplines have far more objective decisions and calculations than software engineering.
Consider a mechanical analogue of DRY: choosing between reusing identical parts to make design, assembly and repairs simpler or designing similar but different parts because they are worth optimizing (e.g. a whole IKEA cabinet with interchangeable screws or with short and long ones).
Unlike next month's shifting software requirements the cost and performance of this kind of alternative can be predicted easily and accurately, without involving gut feelings or authority.
Well I think the point is you can’t legislate on things like style, or at least it is pointless to do so and other disciplines don’t try to. DRY is a style guideline.
What you can legislate/codify are procedures, safety, and outcomes. So, for example, building designs must be signed off by a structural engineer and an architect, both of whom are liable if the building collapses and kills someone. There are standards that materials must meet, and rules for which materials can be used. Buildings must meet standards for fire protection, air flow, heat loss, etc.
I’m not sure software is at the stage where we even know what to codify or what is good and what is not good.
>> inexperienced programmers who lack the ability to judge the applicability
In other words, the author knows better than you.
The author could have put forward precedent, principles, or examples. But instead he chose to make it about the people (inexperienced), not his arguments.
I think one thing the industry does not do properly is applying different practices and standards depending on context.
A retailer website is not the same as a trading platform, the same way that a house is not the same as a railway station. But we blindly try to apply the same "good practices" everywhere.
We also have another interesting phenomenon, our products can mutate in their lifetime, and our practices should follow (they often don't) an MVP can become a critical system, a small internal project can become a client-facing application, we can re-platform, re-write, etc. That's very rare in other industries.
What makes you so sure they do? Go to the hardware store and behold how many fasteners there are. Go down the rabbet hole of pipe fittings. Consider the optimal size of lumber, someday.
And then get ready for the horrors of electrical connections. Not necessarily in how many there are; the real horror is how many think there is a "one true answer" there.
You can find some solace in learning of focusing effects. But, focus isn't just getting harder for individuals. :(
In the end, other engineering areas also have lots of "it depends" situations, where often there are multiple correct answers, depending on availability, legislation, safety, physical constraints, etc.
Perhaps in software engineering people are just too quick or immature to judge.
A nit; DRY is probably not what you think it is. DRY is basically the same as SRP, framed differently. In SRP, it's totally valid to have the code twice if it has different meaning from a user pov.
The problem with that definition is that it's subjective and cannot be checked automatically. So my definition was about mechanistic DRY, which is objective and checked by a linter.
I think we're more like Pythagoras: some useful theory about numbers, taken way too far until it became an actual religion[0] listening to the Delphic[1] Oracle[2].
[0] Tabs or spaces? Vi or emacs? Is the singularity the rapture of the nerds, with Roko's Basalisk as the devil and ${insert name here according to personal taste} as the antichrist? SOLID or move fast and break things?
Often the models and equations rely on making assumptions in order to simplify the problem (cue the joke about physicist and the spherical cow). This is one of the reasons thing are designed with tolerances and safety factors.
Software like CAD and particularly Computational Fluid Dynamics (CFD) packages can simulate the problem but at least with CFD you would typically perform other types of verification such as wind tunnel tests etc.
I'm not sure that's analogous to "best practices" like "do not repeat yourself (DRY)" or "don't use GOTO". These are little more than stylistic choices that claim to offer more maintainable code. Comparable "best practices" in other engineering fields would be along the lines of "do not chamfer/fillet until the end of modelling" (one I have heard before).
Analyzing a CAD model as you describe is more like running a compiler or type checker on code already written, which is the norm in software too, but is not within in the vein of the topic of discussion.
- I think SW needs much more creativity than other industries.
- Typically SW is not mission critical (in mission critical things, it IS pretty much regulated to uncomfortable extremes)
You could regulate it to death, and would probably have some positive impact by some metric, but you would be easily overtaken by FOSS, where for sure there will be less restrictions.
"Engineering industry" is a misnomer. Other engineering areas have the same issues with best practices we have, industries apply best practices with a great amount of success (but not to totality).
Usually, engineering creates best practices for the industries to follow.
So outside of coworkers that have a hard time collaborating in general, is it a problem for others that their coworkers will not apply context? That has not been my experience.
I don't understand the question... If someone has a strong opinion, and they have arguments for their opinion, but don't recognize the significance of the context in which they've formed their opinions, they have blind spots they aren't aware of. Is that a problem? I dunno, that's up to you and your environment.
> Only software engineers pretend best practices exist outside of any useful context.
I am not seeing this issue with programmers in general or with my coworkers, with the exception of those who in general have a hard time collaborating with others.
So my question was/is: if you discount the above exception, are people seeing a problem with programmers/coworkers not taking context into account? I have not noticed a widespread issue, and I am interested in how prevalent you, and others, perceive the issue to be.
Aren't these discussions the evidence? The fact that the author wrote a blog post and we are here discussing it. I might be missing the point of your question. This is everywhere around us in the development world. Anytime people compare react to htmx, redis to postgres, TDD vs BDD.
I'd like to point out I never called it a problem. I said that was a judgement call for you to make. We all have harmless biases.
But yeah, it can be a problem. If I have an engineer derailing my team because of his insistence on svelte, and he can't read the room, i.e., can't take any of the context of the business, stack, domain, team, into his consideration, then yeah, it becomes a problem. Time is money.
(svelte isn't a good example, it's not a best practice per se. s/svelte/TDD/)
> But yeah, it can be a problem. If I have an engineer derailing my team because of his insistence on svelte, and he can't read the room, i.e., can't take any of the context of the business, stack, domain, team, into his consideration, then yeah, it becomes a problem. Time is money.
I would describe this someone who does not know how to collaborate, maybe they don't know the balance they need between give and take, maybe they do not know how to format their ideas so they are understood by the group, maybe there is some fundamental misunderstanding. Since the tool of collaboration is not working for them, they reach for other tools to leverage and achieve their goals, like argument by authority via a convenient best practice.
The best practice/standard was not the issue; the lack of context for the best practice was the issue; the lack of collaboration, or ability therein, is the issue.
It sounds like the author, even though they have read extensively about various "best" practices, did not really gain an understanding of the tradeoffs involved with each one.
> “Don’t Repeat Yourself” (DRY) is basically good advice, but sometimes just copy/pasting things is just the more pragmatic thing to do, and not really a big deal.
Duplicating code on purpose is not about being pragmatic, it's about recognizing when DRY would violate the single responsibility principle.
The ability to weigh tradeoffs in context is what makes some engineers better than others.
The problem with DRY is that it is expressed incorrectly and misleadingly. There is little inherent benefit in avoiding code duplication, and it can do harm when done for the wrong reasons. The actual context is change management. Will the copy of the code likely have to change in the same way as the original? Only then should duplication be avoided. The rule taken literally fails to convey this important precondition and the reasoning behind it. (And so does WET.)
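A small sketch of that precondition, with hypothetical names: two predicates that are identical today but owned by different concerns, so deduplicating them would couple unrelated reasons to change.

```c
/*
 * Textually identical today, but they will change for different
 * reasons, so merging them is DRY taken literally and change
 * management done wrong.
 */
#include <stdbool.h>

/* Finance owns this threshold; it moves with invoicing policy. */
bool invoice_needs_review(long amount_cents) {
    return amount_cents > 100000;
}

/* The risk team owns this one; next quarter it may also look at the
 * country or the card's history. Identical by coincidence, not design. */
bool payment_is_suspicious(long amount_cents) {
    return amount_cents > 100000;
}
```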
> The author sounds like even though they have read extensively about various "best" practices, they did not really gain an understanding of the tradeoffs involved with each one.
It sounds to me like they did understand the tradeoffs. But that they were being brow-beaten to apply "best practices" that were inapplicable because of the tradeoffs.
I have an on-going topic with friends at work about what accessibility "means".
It annoys me to no end when devs talk about some specific technical change "increasing accessibility." The accessibility best practices are used as a checklist where more checks = more accessibility points = better. It results in people gaming the score with meaningless repetitive metadata or low-impact tweaks, rather than actually improving the descriptive/alternative/touch/visual interface, usually without ever trying any alternative method of interacting with the interface.
The best practice is "always include metadata", but it leaves off "... that adds context about the element rather than noise, and integrates with a surrounding application that uses consistent metadata labelling. Remember, this is a portion of a complete descriptive interface someone has to use."
These best practices being driven into people's brains verbatim means conversations devolve into inane on-or-off taxonomy discussions like "is this colour accessible?" or "have we added accessibility? Do you need a ticket for that?" where pushing back isn't seen as caring about users, it's seen as being "against accessibility".
IME "best practices" is a demand problem, not a supply problem. IE far more people want best practices than there are actual best practices.
Prospects and customers desperately want to know our "best practices" and then complain when we say "it depends" or that some experimentation is required, as if we are hiding secret teachings from them.
For me this is more a personality test: people who just want solutions on a silver platter vs DIYers who want to tinker and distrust black boxes.
That's an interesting point, which leads me to my main reason for coming to these comments and leaving my 2 cents: there are far fewer best practices out there than one would believe by looking at all the places, people, and firms offering some set of "best practices."
One thing I learned after many years working in consulting is that, more often than one would believe, best practices are just a compilation of whatever could be found (hopefully at least common practices; more often "things I could find that were minimally documented enough to be reusable"), with no serious analysis of their claim of superiority other than them being common.
So, first thing: learn to challenge the claim of "best". Best for whom? Under what context? What other not-so-good practices are out there, and why is this the best?
Second: if it's documented and evident enough to be treated as a best practice, it's probably fairly common knowledge already. Barring the commonality of really bad things being done out there, don't expect that you'll become much more than mediocre by adopting best practices. By the time they get to be called best practices, they are no longer any competitive advantage; they are more a basic thing you should be doing already, assuming they are indeed best practices (as per my previous point).
It's not that I'm against best practices as a concept, or compiled bodies of knowledge. Just don't expect them to do more than keep you somewhere in the middle. True leadership and innovation lies where best practices have not been established yet - together with all the dangers and mistakes you can make on uncharted waters.
I don't think that the issue is with "best practices," or any other type of dogma.
I think the main issue is that companies tend to hire folks who aren't especially skilled at what they do and rely on excessive structure to compensate, or who don't stay around long enough to get comfortable with the structure.
This can apply to both newer folks, who don't understand the system well enough to effectively deviate, and ones with a lot of experience, who have allowed themselves to get so hidebound, they are afraid to deviate.
As I have gotten older, wiser, and more battle-scarred (often, from self-inflicted injuries), I have learned that "It Depends™" is the only true mantra for my work.
Usually, best practices/dogma/structure becomes important, when the codebase is being worked on by a team, and when there's the need to coordinate work between teams.
There's some type of work that just can't be done, without structure. I've done that type of work. Other work can be killed by too much structure. I've done that kind of work, as well.
There are 2 main kinds of advice that get labeled best practices in an attempt at persuasion:
1. Advice which worked in one situation - “we tried Agile, everyone should use it!”
2. Proven stuff like code review, which you call best practices when begging your org to implement it: “please let’s do it, I can clearly see how this will improve our org.”
These 2 examples represent locations on a spectrum: let’s call it “provenness”.
The author’s problem boils down to subjectivity - everyone positions different practices in different places on the provenness axis. The upshot of that is when one person says “we should do this, it’s obvious and/or it’ll definitely help” another person hears “we tried it once, you should try it too!” and then everyone has a bad time.
Then it gets confounded by everyone calling everything best practices - no matter how long ago or unproven the practices might be.
What would be handy is some generally agreed-upon yardsticks for graduating practices into or out of best-practice status, plus better terminology to cover the spectrum of provenness, so more sophisticated discussions can be had that account for the nuance and we don't all talk past each other.
But then analyst companies wouldn’t get to make their glossy 2x2 charts, so it probably won’t happen.
Best practices are tough for practices where the foundations are not stable. And with programming, we have trouble defining the foundations. Much less stabilizing them.
And note I don't mean stable as in, not crashing. I mean it as not changing.
For a while, this was doable with Java. For all its warts, it gave a good foundation. Industry practice got caught up in startup enthusiasm, though, and that went out the window.
Similar could probably be said for Windows. I was not a fan of its model, but it provided a stable base for business apps for a long time.
The author sort of goes against his own advice by not diving into the specifics of why he doesn't like certain things. I get that such a piece would be longer, take more time and effort, and would draw more argumentation, but that's what he's asking for in the piece itself.
Maybe if people didn't follow best practices their software would be terrible-er? Guardrails for hard things doesn't imply no guardrails would make it easier.
The name “best practices” kind of implies that they actually are practiced somewhere. So it’s different from theoretical abstract ideas “how we should write software”, which maybe nobody follows.
Like any other discussion of this kind[1] I think this one will go nowhere because 1) the question doesn't have a black and white answer, it's a sliding scale. 2) almost no one is giving examples of what they mean; if they were, they could find that they agree with the person they are replying to. And 3) most people will discard the experiences of people they disagree with "you didn't even try! maybe the problem is you!", and this is easy because this is the internet and the other person can be (is probably?) making it up.
The longer I program the more I agree with this take. I have seen too much power and dominance from people who wield it as a cudgel against everyone else in the room who doesn't care enough to double-check the practice.
Put another way, knowledge is knowing best-practices, but wisdom is knowing where and when to apply them. Unfortunately, most building software have only the knowledge and there is too little consideration for the fact that structure is not free, and considerations must be made for when velocity is the primary objective vs safety and adherence to best-practices.
It all comes down to architecture design in the end.
“Best practices” have nothing to do with engineering. They are a marketing concept. They are a kind of marketing better known as bullying.
In every case where you want to say “best practice” there is a better alternative, which is to say “practice.” The concept of best is never needed or warranted, because practices are not subject to rigorous testing and review.
I have been an independent consultant and trainer since 1999 and not once have I taught or recommended a best practice.
I do have many opinions. I call them: opinions. I think my opinions are the best, but I can’t think of any reason that anyone else beyond my wife and dog should think so.
I agree with the sentiment of the article, but Postel's law is a good idea that has very little to do with context. Of course, the real problem is that it's a very small piece of advice that needs to be put into context each time, and here is the failure of all types of advice: they are not substitutes for intelligence and good design taste.
Perhaps it's due to the very large number of factors in a code base and/or the particular problem space.
But as I got more senior, when asked by less experienced developers the best way to do something, my answers tended to increasingly start with: "Well, it depends...".
And that's the thing, there is no universal best practice for everything, there are solutions which are more often than not good enough, but as all solutions tend to be a trade off favouring a particular scenario over another, sometimes they're also highly inappropriate to the particular circumstances.
Another term for someone trying to blindly apply supposed best practices is "cargo culting".
In summary, there is lots of nuance to software development and the particular circumstances it's being applied to, meaning that you need to understand the trade-offs of particular solutions to see which one makes the most sense for a particular situation.
> “use single point of exit”, and that kind of absolute insane bollocks
As always it's a matter of context. If you have a low level language like C without any automatic cleanup having an out label to goto which cleans up in all cases makes a lot of sense.
There's always some reasoning behind those rules, sadly rarely talked about directly. You should check whether that reasoning applies to you and weigh the benefits against the costs. A lot of the time it's just doing what everyone else is doing, which is a perfectly fine reason, as it makes onboarding new developers easier.
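A minimal sketch of that cleanup pattern, with a hypothetical function and a fixed buffer size for brevity: every error path jumps to one label that releases whatever was acquired, in reverse order.

```c
/* Single point of exit in C: all cleanup lives at the "out" label. */
#include <stdio.h>
#include <stdlib.h>

int process_file(const char *path) {
    int rc = -1;
    FILE *f = NULL;
    char *buf = NULL;

    f = fopen(path, "rb");
    if (!f) goto out;

    buf = malloc(4096);
    if (!buf) goto out;

    if (fread(buf, 1, 4096, f) == 0) goto out;

    /* ... do the real work with buf ... */
    rc = 0;

out:
    free(buf);        /* free(NULL) is a no-op */
    if (f) fclose(f);
    return rc;
}
```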
I once put the following in an e-mail to a consultant who chalked up an awful decision to "applying best practices": "best practice applied for the sake of best practice is bad practice."
While I guess most would agree that all the cited practices have been abused, what is really the alternative? Absence of practices does not make things better; history has plenty of lessons there. To run medium- to large-scale projects, one needs patterns to ease communication and integration.
The real question is how we prevent best practices from being perverted, and I fear the answer is having the right people in the right places. The one best practice to rule them all: have knowledgeable, balanced people who know when to break the rules.
The alternative is to accept that every solution has drawbacks and trade-offs. Best practices were an attempt to codify design standards that would be equivalent to "buying IBM" in the sense of the old phrase, "Nobody gets fired for buying IBM."
That was always a bad idea. Often the best choice in one context would be a bad choice in other contexts. You don't want an engineer in a 20-person startup making decisions like they're at Google, or vice-versa. You have to take responsibility for deciding what's best for a particular problem in a particular context, without looking for the cover of "everybody does it this way."
I've worked too often with people who think they know better
They do not
Straw men do not change the fact that "best practices", especially the ones quoted, are very good.
No sensible person is saying "never use globals". We caution you to think very carefully before you do, only for truly global state.
I am suffering from inherited code, written by a very good programmer who got lazy with globals and comments. Too many of the former, too few of the latter. What a nightmare.
I've been on a team where most comments I added were deleted, because they believed it was best practice to avoid them getting out of sync with the code.
So even your comment disagrees with your claim that "best practices" are very good.
"Postel's law" is an ironic name, I don't think the people who coined that term meant that it was an infallible rule of the universe, just what seemed a good rule of thumb for implementing protocols.
I still like Postel's law, but don't imagine for a second it's a 'law' with the kind of authority the article implies, and didn't enjoy the article's slime about people who like it!
- Protect data structures accessed concurrently with a mutex?
- Have backups?
I wouldn't say there isn't some imaginary situation where you'd do something different, but it's safer to fall back to doing the above in most situations.
That said many people make up "best practices" or use them without understanding the circumstances to which they apply.
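For the mutex case, the default everyone falls back to looks roughly like this; a sketch with invented names, not anyone's real code:

    #include <pthread.h>

    /* Guard shared state with one mutex; every reader and writer
       goes through the lock. */
    static pthread_mutex_t counter_lock = PTHREAD_MUTEX_INITIALIZER;
    static long counter;    /* shared between threads */

    void counter_increment(void) {
        pthread_mutex_lock(&counter_lock);
        counter++;
        pthread_mutex_unlock(&counter_lock);
    }

    long counter_get(void) {
        pthread_mutex_lock(&counter_lock);
        long v = counter;
        pthread_mutex_unlock(&counter_lock);
        return v;
    }

There are real reasons to deviate (atomics, lock-free structures), but those are exceptions you argue for, not the default.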
Best practices and such shouldn't be obeyed as "laws" but treated as guidelines for general work. Once in a while there's good reason to skip them.
The bigger issue is that developers (me included) are usually making the decisions in their own heads. Usually the reasons are quite ok but they are not really said out loud or documented.
I've stumbled upon this both as a developer trying to get my code approved and when doing a code review.
For the developer it feels super annoying when somebody nitpicks about things the developer has probably already gone through in their head before arriving at the resulting solution. Just like Martin in the post, they complain and react passive-aggressively towards reviewers who mention these things.
For the code reviewer it feels like the developer is being sloppy or doesn't care about our common way of doing things thus increasing the need to nitpick and explain so that the developer understands how it should be done.
The solution for this is actually quite easy: document in the comments and in the pull request _why_ you're breaking the team's / company's guidelines and why you think it's warranted in this case. This seems to remove quite a lot of friction.
Oh, and of course it doesn't mean that the guidelines should be broken all the time just because one thinks this kind of stuff is for "idiots|assholes". When working as a team, we need to adhere to a common way of working most of the time.
I see "best practices" as just case studies. Here's one possible way to do it. If it intrigues you, you can do the same.
Most of these "best practices" have valid opposing camps anyway. There's DRY, but there's also "a little copying is better than a little dependency". Both are valid in different contexts.
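A contrived sketch of why both camps have a point (the business rules here are invented):

    /* These two checks are byte-for-byte identical today, but they encode
       unrelated business rules. Deduplicating them would couple a shipping
       rule to a tax rule; tomorrow one may change without the other. */
    int qualifies_for_free_shipping(double order_total) {
        return order_total >= 100.0;
    }

    int qualifies_for_tax_exemption(double order_total) {
        return order_total >= 100.0;
    }

DRY earns its keep when the duplication is semantic, not merely textual.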
Following "Best practice" isn't about doing what is "best", it's about being predictable.
In many cases, being predictable is better for future maintenance than forging your own path, even if the custom solution is "better" against all (current) metrics.
"Best practices" is a tool that you need to know how to use it. Same as any other tool. If you do use the same tool for everything, this is when a tool becomes a bad thing.
"Best practices" is a really good tool if you use it in the correct context.
I've found that "best practice" generally means "what you will not get fired for doing". Sometimes it is a good practice. Sometimes it's something the higher-ups read about in CTO magazine and decided that everybody now must do. I'd say use your individual judgement, but depending on your role and the company, using your own judgement may be beyond your pay grade.
Teaching threads on an operating systems course, I have witnessed some of the most convoluted code ever, written by students so afraid of global variables that they turned simple exercises into monstrosities. So yes, f*ck the rigid rules. They have to unlearn best-practice guidelines that were handed to them as blood-written sacred rules.
Well, in class environment, they've probably figured out that it's not a sane or correct answer that gets the points. It's the answer that follows all the rules that have been set in the material for that class. It's extremely typical for teachers to grade based on the "best practices" and "sacred rules" that the classes teach. Showing ideas, outside knowledge or solutions that are simpler than the course material just gets a student punished.
most "best practices" are just trends that got way out of hand. Microservices, long variable names, XML, TDD, SCRUM, OOP were just marketing concepts that turned into a cult.
Books like Code Complete are useful tools but not bibles.
Just like everything on the internet: it's just another person's opinion. What matters is what works (i.e. makes you money).
As someone at Google, I hardly believe they deserve to be used at Google. We have like 20% the efficiency we could have if we were more selective about procedure.
>There is something wrong with people who enjoy telling other people what to do.
Telling other people what to do is humanity's favorite thing. Give a man an opportunity to make and/or enforce rules, and you have a very happy man. People dedicate their whole lives to reaching stage after stage in their rule-making game.
Great article! I feel like I am dealing with so much of this behaviour at my job recently. It feels like most of the people espousing Best Practice never learned that "appeal to authority" was not a reasonable argument.
It occurs to me that the author is explaining his best practice is to look at someone else's best practice with a gimlet eye and take what you want from it. So I wonder how to apply his best practice to his own best practice?
I want engineers to think about what's best for the project, but I let engineers talk about what's best in the implementation, because it's better than doing irrelevant things.
> “Don’t use globals” is obviously good, but “never use globals under any circumstances” is just silly
I'm actually not even convinced that this is a good rule. When it's explained it makes sense on the surface. I also think that since beginners to programming use global variables, they come to the conclusion that it must be universally bad since that's what their beginner self did.
Having worked on codebases with most state stored in global variables (in a single namespace, at that), it is manageable at worst and easy to develop on at best, assuming certain conventions are followed.
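One such convention, sketched here in C with invented names: keep the "global" private to one file and route all access through functions, so every mutation site is greppable.

    /* File-scope state, visible only in this translation unit. */
    static int g_log_level = 1;

    void set_log_level(int level) { g_log_level = level; }
    int  log_level(void)          { return g_log_level; }

You keep the convenience of shared state while retaining a single, auditable point of change.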
Living through the 2000s was enough to give me a permanent aversion to "best practices." So many things that we look back on in horror were smugly and haughtily justified using those words.
There was a culture of best practices zealots that had an uncanny resemblance to organized religion. All the answers have been written down for us, and everything that goes wrong is because of arrogant, misguided people who decided to trust their own fallible thoughts instead. Their faith was strong enough to survive any project, no matter the outcome.
Ralph Waldo Emerson: "A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has simply nothing to do. He may as well concern himself with his shadow on the wall. Speak what you think now in hard words, and to-morrow speak what to-morrow thinks in hard words again, though it contradict every thing you said to-day. — 'Ah, so you shall be sure to be misunderstood.' — Is it so bad, then, to be misunderstood? Pythagoras was misunderstood, and Socrates, and Jesus, and Luther, and Copernicus, and Galileo, and Newton, and every pure and wise spirit that ever took flesh. To be great is to be misunderstood."
Follow best practices unless you can give a reason not to. "Best practice" is a shorthand for a lot of accumulated knowledge that you don't want to go over again every time. Also following BP makes the code more consistent and thus easier to understand. But when an argument arises, go back to the underpinnings of the best practice and work from there.
‘The reasonable man adapts himself to the world; the unreasonable man persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.’
> Therefore all progress depends on the unreasonable man.
The unfortunate corollary to this is that all retrogression also depends on the unreasonable man. The reasonable person (as defined here) maintains the status quo, for good or ill.
I would reverse this: if you can explain to me exactly why your so called "best practice" applies here and now, good. Otherwise it's a nice input to have in the discussion, but nothing more.
It depends on your risk appetite. A best practice likely keeps you away from pitfalls known to the community of the practice but not to you. It also may keep you away from crafting a solution that is optimal for your specific circumstances not known or applicable more widely.
In a high-reward / low-risk environment, such as building an indie turn-based retro-style game, go with your gut feeling unless you have a good reason not to.
In a high-risk / dubious-reward environment, such as implementing cryptography, follow the best practices to a T, unless you know intimately how things work and maybe have codified some of the practices yourself.
> A best practice likely keeps you away from pitfalls known to the community of the practice but not to you.
In my experience, many "best practices" are the pitfalls you should be wary about, as they can easily translate into hundreds or thousands of lost hours of work and derail and doom entire projects. (The most annoying part of this is that the real causes won't be found, precisely because "best practices have been followed". Therefore the reputation of the best practice will stay untarnished).
Cryptography on the other hand is a well known example of something you should not touch at all unless you are an absolute expert- that's not even a "best practice" but probably the only reasonable practice.
It's standard practice to install outlets with NEMA connectors in North American buildings. Sure, you could technically swap those out with a more optimal connector that is "better" (one that prevents electricity from flowing while the plug is partially exposed, for example), but using the standard practice is best practice for human-shaped reasons that are often not apparent to early-career engineers.
They wouldn't, but about half of the developers commenting here would do the equivalent of switching from NEMA to something else on the grounds that the something else is better.
There are many like that. Every practice is a trade off.
However, there are many where the cost/benefit ratio is so large that you can default to "you should just do this".
I don't think I'd ever look at a company that e.g. had no CI or versioning for a large project and think "they might have had a good reason for this". They didn't.
I don't think so. For example, if I'm writing something in a programming language I don't know by heart, I start by following the best practices recommended by the language authors, then start to flex them when they become less useful, until I hit a nice pitfall which those best practices were designed to avoid.
This allows me to start faster and be in better shape in the short term. Then I can flex the advice more and more as I understand how the language works under the hood.
I also read the language tutorial first. But if they don't explain their best practices (and often the explanation is simple), I don't care much for them.
What if the "best practice" was invented by a VC funded company (like, just for example, Vercel) desperate to maintain their growth trajectory by any means necessary, including huge marketing pushes to convince everybody the "best practice" is the "best way to do things"?
It's downright dangerous to assume a "best practice" in software development somehow automatically means it's some super distilled wisdom juice. A whole lot of it, in my experience, is just hype and rapid, unquestioning cargo culting slamming in right behind the hype.
Use your fucking brain. If the "best practice" is objectively a good solution to your problem then use it. If not, think of something better.
A lot of the time best practice can also mean “we did it this way last time and it was ok”. I don’t think anyone is saying don’t find improvements in “what is currently defined as best practice” and if they are then that’s your problem.
Best practices are a way to avoid building everything up from first principles every time. They are more, “here are rules that are generally applicable so you don’t have to waste time thinking,” vs “here are laws of god which are always applicable.”
It is an error to throw out every best practice and reconsider everything just as it is to blindly follow the best practices.
IMO it’s best to apply the best practices by default and question them as they are used. Essentially trust but verify. Assuming they are right most of the time you get the benefit of speed. Assuming some of them are wrong, it still leaves room to course correct.
I've seen (and started using) "current common practices" instead of "best practices" where it makes sense.
In an always-evolving technology landscape, it feels like a better representation of what we're doing, and it helps prevent the kind of dogmatic stubbornness that forms around the word "best".
Ragebait post unless you enjoy playing semantics games with verysmarts. If you don't understand the purpose of the term "best practices", they were probably made for someone like you.
I can sense the hackles rising. The arcane magi of the Guidance Council have scried a rebel wizard in their orb. Every cantrip and charm must be uttered at once, with maximal urgency! Invoke the spirits of the tomes!!!
Dan Morena, CTO at Upright.com, made the point that every startup was unique and therefore every startup had to find out what was best for it, while ignoring whatever was considered "best practice." I wrote what he told me here:
https://respectfulleadership.substack.com/p/dan-morena-is-a-...
My summary of his idea:
No army has ever conquered a country. An army conquers this muddy ditch over here, that open wheat field over there and then the adjoining farm buildings. It conquers that copse of lush oak trees next to the large outcropping of granite rocks. An army seizes that grassy hill top, it digs in on the west side of this particular fast flowing river, it gains control over the 12 story gray and red brick downtown office building, fighting room to room. If you are watching from a great distance, you might think that an army has conquered a country, but if you listen to the people who are involved in the struggle, then you are aware how much "a country" is an abstraction. The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses. When a person talks in abstractions, it only shows how little they know. The people who have meaningful information talk about specifics.
Likewise, no one builds a startup. Instead, you build your startup, and your startup is completely unique, and possesses features that no other startup will ever have. Your success will depend on adapting to those attributes that make it unique.
I get the analogy but I think it can be made a lot better, which will decrease people who dismiss it because they got lost in where the wording doesn’t make sense. I’m pretty confident most would agree that country A conquered country B if country B was nothing but fire and rubble. It’s pretty common usage actually. Also, there’s plenty of examples of countries ruled by militaries. Even the US president is the head of the military. As for army, it’s fairly synonymous with military, only really diverting in recent usage.
Besides that, the Army Corp of engineers is well known to build bridges, roads, housing, and all sorts of things. But on the topic of corp, that’s part of the hierarchy. For yours a battalion, regiment, company, or platoon may work much better. A platoon or squad might take control of a building. A company might control a hill or river. But it takes a whole army to conquer a country because it is all these groups working together, even if often disconnected and not in unison, even with infighting and internal conflicts, they rally around the same end goals.
By I’m also not sure this fully aligns with what you say. It’s true that the naive only talk at abstract levels, but it’s common for experts too. But experts almost always leak specifics in because the abstraction is derived from a nuanced understanding. But we need to talk in both abstractions and in details. The necessity for abstraction only grows, but so does the whole pie.
https://en.wikipedia.org/wiki/Military_organization
It's a cute analogy, but like all analogies it breaks after inspection. One might try and salvage it by observing that military "best practice" in the field and Best Practice at HQ need not be, and commonly are not, the same, either for reasons of scope or expediency. Moreover, lower case "practice" tends to win more, more quickly. Eg guerillas tend to win battles quickly against hidebound formal armies.
For a startup, winning "battles, not wars," is what you need, because you have finite resources and have an exit in mind before you burn through them. For a large enterprise, "winning wars not battles" is important because you have big targets on your back (regulators, stock market, litigation).
One might paraphrase the whole shooting match with the ever-pithy statement that premature optimization is the root of all evil.
The US president, a civilian, is in command of the US military. This is, in fact, the inverse of a country being run by its military.
Also true in the UK. Even in a war the UK armed forces are ultimately tasked by and report to politicians.
It's true everywhere except for military dictatorships.
> I’m pretty confident most would agree that country A conquered country B if country B was nothing but fire and rubble.
I think we can all agree that if that is the case, you’ve in fact conquered nothing.
Edit: Since we say opposite things, maybe we wouldn’t agree.
So.. how would you make it a lot better?
> If you are watching from a great distance, you might think that an army has conquered a country, but if you listen to the people who are involved in the struggle, then you are aware how much "a country" is an abstraction.
Most things of any value are abstractions. You take a country by persuading everyone you've taken a country; the implementation details of that argument might involve some grassy hill tops, some fields and farm buildings, but it's absolutely not the case that an army needs to control every field and every grassy hill top that makes up "a country" in order to take it. The abstraction is different from the sum of its specific parts.
If you try to invade a country by invading every concrete bit of it, you'll either fail to take it or have nothing of value at the end (i.e fail in your objective). The only reason it has ever been useful or even possible to invade countries is because countries are abstractions and it's the abstraction that is important.
> The real work is made up of specifics: buildings, roads, trees, ditches, rivers, bushes, rocks, fields, houses.
Specifics are important - failing to execute on specifics dooms any steps you might make to help achieve your objective, but if all you see is specifics you won't be able to come up with a coherent objective or choose a path that would stand a chance of getting you there.
The army that is conquering is carrying best-practice weapons, wearing best-practice boots and fatigues, driving best-practice tanks and trucks, etc.
They're using best-practice aiming, shooting, walking, communicating, hiring (mercs), hiding, etc...
The people that are in the weeds are just doing the most simple things for their personal situation as they're taking over that granite rock or "copse of lush oak trees".
It's easy to use a lot of words to pretend your point has meaning, but often, like KH - it doesn't.
This is frequently not true. There are examples all through history of weaker and poorer armies defeating larger ones: from the Zulus, to the American Revolution, to the great Emu War. Surely the birds were not more advanced than men armed with machine guns. But it's only when the smaller force can take advantage of and leverage what it has better than the other side. It's best practices, but what's best is not universal, it's best for who, best for when, best for under what circumstances
That doesn't defeat my point: is the smaller/poorer army using best practices?
When all things are the same, the army with more will win.
When all things are not the same, there are little bonuses that can cause the smaller/poorer, malnourished army to win against those with machine guns. Often it's just knowing the territory. Again though, these people are individually making decisions. There isn't some massively smart borg ball sending individual orders to shoot 3 inches to the left to each drone.
https://en.wikipedia.org/wiki/Emu_War
The Zulus won a pitched battle or two, but lost the war.
Sure, they (eventually) lost against the British, but they won against many of the southern African tribes before.
Occasionally something novel and innovative beats the best practice. In that case it usually gradually gets adopted as best practice. More often it doesn't, and falls by the wayside.
> It’s best practices, but what’s best is not universal, it’s best for who, best for when, best for under what circumstances.
I’m pretty sure building an organization on a free for all principle is anathema to the idea of an organization.
That's a straw man. The actual argument is about the danger of applying "best practices" uncritically, not about doing away with leadership.
"Do X because it's best practice" is very different than "do X because you were commanded by your rightful authority to do so."
Often not true. Often they are just "good enough" weapons, etc.
Wow what a fantastic little article. Thanks for writing and sharing that.
my irony detector is going off, but it's feeble. do I need a better irony detector?
I was being genuine.
More people should be.
I think the word you're looking for is "nation", not "country". A country is the land area and would be conquered in that example, while a nation is the more abstract entity made of the people. It's why it makes sense to talk about countries after the government falls, or nations without a country.
Likewise, people do business with people, not with companies. Assert that “society” is merely an abstraction invoked for political gain to become an individualist.
> people do business with people, not with companies
Many of my interactions are with electronic systems deployed by companies or the state. It's rare that I deal with an actual person a lot of the time (which is sad, but that's another story).
Seriously? From whom do I buy a computer or a car or a refrigerator?
It took me way too long to realize this, but my experience is that "zealots, idiots, and assholes" (as the author says) are going to abuse something and wield it as a bludgeon against other people. This appears to be a fixed, immutable property of the universe.
If you take it as a given that some number of people are going to get an idea lodged in their head, treat it like gospel, and beat as many other people in the head with it as they can... the best strategy you can adopt is to have the ideas in their head be at least somewhat useful.
Yes, reasonable people understand that "best practices" come with all sorts of context and caveats that need to be taken into account. But you won't always be dealing with reasonable people, and if you're dealing with an asshole, zealot, or idiot, I'd sure as hell prefer one who blindly believes in, say, test-first development versus believing that "test code isn't real code, you should spend all of your time writing code that ships to users" or some other even worse nonsense.
In any given culture (software development) you will have a set of traditions that may be used by the members of that culture to signify status, experience, pedigree and authority, and often some combination of all these and other descriptors. And, by virtue of creating a culture, you will also necessarily create a counter-culture, which rejects those things (performatively or otherwise) in an effort to do the exact same thing, but in the other direction. If you decide that proper software is developed with command-line in mind first, and UI second, you will in that effort necessarily create software devs who believe the exact opposite. This is core to our being as humans.
In my mind, this author is merely signaling software counter-culture, some of which I agree with, others I don't. And the people whom you describe above are signaling software culture, in a hostile and domineering way.
And of course, these two sides are not impermeable, forever persistent: culture and counter-culture shift constantly, often outright reversing from one another on a roughly 30 year timeline. And, they both have important things to offer. Best practices are best practices for a reason, and also telling stuffy people to chill out when they're focused so hard on best practices that they lose the plot of what the org is actually attempting to accomplish is also important.
Hippies create Cops / Cops create Hippies
The one thing that gives me pause, is that I have seen stages of mastery where the base stage is repetition and adherence to rules to internalize them before understanding them and knowing when to break them.
If much of our industry is new, evangelizing these rules as harder and faster than they really are makes a lot of sense, to get people ready for the next stage. Then they learn the context and caveats over time.
That made me want to look up some link about Shu Ha Ri. Turns out that's actually been made popular in some corners of sw dev already. E.g. https://martinfowler.com/bliki/ShuHaRi.html
I think the rejection is too strong in this article. The idea of, “best practices,” comes from an established Body of Knowledge. There is one published for software development called the SoftWare Engineering Body of Knowledge or SWEBOK; published by the IEEE.
The author seems to be arguing for nuance: that these “laws,” require context and shouldn’t be applied blindly. I agree.
However they shouldn’t be rejected out of hand either and people recommending them aren’t idiots.
Update: one problem with “best practices,” that I think the article might have unwittingly implied is that most software developers aren’t aware of SWEBOK and are repeating maxims and aphorisms they heard from others. Software development is often powered by folklore and hand waving.
I think it is best to strongly reject the idea "best practices will always benefit you".
Most best practices that I have been told about were low local maxima at best, and very harmful at worst.
If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
It might still be a good idea, but you shouldn't seriously consider it until you hear an actually convincing reason (not a "just so" explanation that skips several steps).
> It might still be a good idea, but you shouldn't seriously consider it until you hear an actually convincing reason (not a "just so" explanation that skips several steps).
If everyone follows that then every decision will be bikeshedded to death. I think part of the point of the concept of "best practices" is that some ideas should be at least somewhat entrenched, followed by default, and not overturned without good reason.
Ideally your records of best practices would include a rationale and scope for when they should be reexamined. But trying to reason everything out from first principles doesn't work great either.
It strikes me that, if a decision can be bikeshedded to death, it's not, generally speaking, an important decision.
Well calling something a bikeshed is implicitly claiming that it's not so important. Often the specific choice is not very important, but making a choice rather than not making one is important. And while an effective organisation would not allow important decisionmaking to get derailed, many organisations are ineffective.
> Most best practices that I have been told about were low local maxima at best, and very harmful at worst.
This matches my experience, though sometimes they indeed will be helpful, at least after some consideration.
> If someone quotes a best practice to you and can't cite a convincing "why", you should immediately reject it.
In certain environments this will get you labeled someone who doesn't want to create quality software, because obviously best practices will lead to good code and not wanting to follow those practices or questioning them means that you don't have enough experience or something. Ergo, you should just apply SOLID and DRY everywhere, even if it becomes more or less a cargo cult. Not that I agree with the idea, but that way of thinking is prevalent.
(not that I agree with that, people just have that mindset sometimes)
I agree with you, I think you're describing the same sort of person I was thinking of in the second paragraph of this post: https://joshduff.com/2022-02-07-eschatology-of-software.html
Hmhm, just like “AWS is recommending serverless, so we should serverless everything!”
Never mind that AWS recommends what is good for AWS, not us.
I definitely sympathize with the thrust of the article. I think the reality is somewhere in the middle: best practices are useful shortcuts and people aren't always idiots for suggesting them. I've worked with folks who insist on Postel's law despite security research in recent years suggesting parsers should be strict to prevent langsec attacks, for example. In those cases I would refute leniency...
Although I also do work in fintech and well... card payment systems are messy. The legal framework covers liability for when actors send bad data but your system still has to parse/process/accept those messages. So you need some leniency.
It does drive me up the wall sometimes when people will hand-wave away details and cite maxims or best-practices... but those are usually situations where the details matter a great deal: security, safety, liveness, etc. People generally have the best intentions in these scenarios and I don't fault them for having different experience/knowledge/wisdom that lead them to different conclusions than I do. They're not idiots for suggesting best practices... it's just a nuisance.
That's what I mean about the rejection being too strong. It should be considered that best practices are often useful and helpful. We don't have to re-develop our intuitions from first principles on every project. It would be tedious to do so. But a healthy dose of skepticism should be used... especially when it comes to Postel's Law which has some decent research to suggest avoiding it.
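As an illustration of the strict end of that spectrum, a hedged sketch in C (the function name is mine): instead of atoi(), which silently tolerates garbage, reject anything that isn't exactly a number.

    #include <errno.h>
    #include <stdlib.h>

    /* Strict integer parsing: no digits, trailing junk, and out-of-range
       values are all errors rather than being silently accepted. */
    int parse_long_strict(const char *s, long *out) {
        char *end;
        errno = 0;
        long v = strtol(s, &end, 10);
        if (end == s || *end != '\0')
            return -1;    /* nothing parsed, or trailing garbage */
        if (errno == ERANGE)
            return -1;    /* overflow or underflow */
        *out = v;
        return 0;
    }

Postel's law would accept more; the langsec argument is that every bit of leniency is surface two parsers can disagree about.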
I don't think anyone has ever thought that best practices will always benefit you. Nothing always works every single time in every single case.
This whole thing is really silly and obvious.
Of course you shouldn't blindly follow advice without thinking. But not following advice just because it might not always be right is also a bad idea.
My advice: In general, you should follow good advice from experienced people. If enough experts say this is the best way to do something, you should probably do that, most of the time.
But that advice will never trend on HN because it isn't clickbait or extreme, and requires using your noggin.
> I don't think anyone has ever thought that best practices will always benefit you.
Whenever a "best practice" or "convention" has been presented to me, that is how it has been framed. (...it is best practice, therefore, it will definitely benefit you to follow it)
I do not know what context this happened to you in, but in the context of building something quickly and learning, while not being an expert in an area, best practices are a common crutch.
In many workplaces they either do not have time, or at least think they do not have time, to think things through 100% for themselves from first principles, so they depend on best practices instead.
That makes sense to me, and in the above context I would expect better results on average from using best practices than from rejecting them.
That said I try to work on things where I am not always in the above context, where thinking things through end to end provides a competitive advantage.
100%… best practices in other traditional engineering disciplines help us work within the state of the art. They're the accumulated wisdom and experience of the engineers who came before us.
There are plenty of them that help us write concurrent code that avoids common deadlock situations without having to resort to writing proofs every time. Someone already did the work and condensed it down into a rule to follow. Even if you don’t understand the underlying proof you can follow the rule and hope that everything will shake out.
What I find we struggle most with is knowing when we actually need to write the proof. Sometimes we bias ourselves towards best practices and intuition when working it out formally would be more prudent.
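One such condensed rule, sketched in C with invented names: when you must hold two locks, always acquire them in a single global order (ordering by address is a common convention), which rules out the classic AB/BA deadlock without redoing the proof at every call site.

    #include <pthread.h>

    /* Acquire two mutexes in a fixed global order so no pair of threads
       can each hold one lock while waiting on the other. */
    void lock_pair(pthread_mutex_t *a, pthread_mutex_t *b) {
        if (a < b) {
            pthread_mutex_lock(a);
            pthread_mutex_lock(b);
        } else {
            pthread_mutex_lock(b);
            pthread_mutex_lock(a);
        }
    }

The rule is only sound if everyone follows it, which is exactly the discipline a best practice is supposed to buy.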
In general that is true, I think. Even if it doesn’t apply in all circumstances, it’ll apply in most.
It'd be ideal if you could identify when it doesn't work. But in the absence of that, applying it everywhere is still a net positive.
I could tell you the moon is made of cheese. If I'm wrong about it being made of cheese, does that mean the moon doesn't exist?
I like SWEBOK, but I don't understand your point.
SWEBOK seems the opposite of that. A body of knowledge is not at all the same thing as a best practice. The only unapologetic best practice in SWEBOK is that professionals should be familiar with every topic in SWEBOK. Definitely not that you _should_ do everything in the book.
The book is quite sophisticated in this. It explicitly separates the technical knowledge from the judgments of which, when, and where to apply it. Most occurrences of "best practices" in the text are quoted, and are references to other works describing the need to choose between different best-practice libraries depending on context. Others are part of a meta-conversation about the role of standards in engineering. Very little of SWEBOK is promoted as a "best practice" in itself.
Here's a quote from SWEBOK v4, 12-5
> Foremost, software engineers should know the key software engineering standards that apply to their specific industry. As Iberle discussed [19], the practices software engineers use vary greatly depending on the industry, business model and organizational culture where they work.
> I don't understand your point
In my view best practices emerge from a body of knowledge (or sometimes from the practice and wisdom of others that haven't been documented/accepted/etc yet) and are "shortcuts."
I'm not defending Postel's Law; I agree that, after years of practice and research, it leads to security issues and surprises.
However, the point is that these kinds of principles don't arise out of people's heads and become accepted wisdom for nothing; they're usually built off of an implied (or explicit) body of knowledge.
Does that make sense?
Sure. Best practices develop by choosing practices to match your context out of a defined body of knowledge.
But SWEBOK is very clear that "best practices" are context specific - they are radically different forces and solutions in video games as compared to chemical engineering control systems. There's no such thing as a "best practice" absent a context. The footnotes in SWEBOK point off in a million directions saying "go look over there for best practices for YOUR context".
While I like the idea of SWEBOK, the actual SWEBOK book is not very useful though, quite incomplete, biased, and not up to date.
There was recently a HN thread about it: https://news.ycombinator.com/item?id=41907412
To put it simply, best practices are, at best, context-dependent. Best practices for avionics software are not the same as best practices for a CRUD form on your mailing-list signup page.
And to be fair, the best practices for designing a bridge or a skyscraper are not the same ones for designing a doghouse.
This! "Best practice" depends on the circumstances. Are "micro services" a best practice? What about "monolithic architecture"? Those choices are not best practices in and of themselves, but may be best practices when considering organization/dev team size, application user count, computational demands on the application/system, etc. What are the goals and vision for the future? Let's future-proof and pre-optimize for problems we don't currently have and likely never will! (And don't get me started on the folks who dream "we're going to need to be able to scale!" for a fairly simple CRUD app that will most likely be used by hundreds, maybe thousands, of users and realistically needs hundreds of "simple" requests per second (more likely per minute...))
Makes me think as well of the best practices in development & project management methodologies.
*However they shouldn’t be rejected out of hand either and people recommending them aren’t idiots.*
Also it shouldn't be taken for granted that best practice is always "best/good" - there definitely are idiots recommending best practices.
I'm one of the devs not aware of the SWEBOK. Searching the internet all I can find is links to "the guide to SWEBOK".
https://ieeecs-media.computer.org/media/education/swebok/swe...
But, you know, I want the whole ordeal. I want the SWEBOK, not the "how to read the SWEBOK". Where can I find it?
I think that is what you want. From Wikipedia:
> In 2016, the IEEE Computer Society kicked off the SWEBOK Evolution effort to develop future iterations of the body of knowledge. The SWEBOK Evolution project resulted in the publication of SWEBOK Guide version 4 in October 2024.
So the thing called "SWEBOK Guide" is actually the reference text for SWEBOK.
It looks like SWEBOK Guide, guide to the SWEBOK, and SWEBOK are all used interchangeably. I wonder if they have a chapter on naming conventions.
The confusion is because "BOK" is not "book of knowledge" but "body of knowledge". So a "guide" as a canonical source kinda makes sense.
The books that encode some standardized Xbok are always named "The guide to the Xbok".
The actual BOK isn't supposed to have a concrete representation. It's not supposed to be standardized either, but standard organizations always ignore that part.
This. They’re supposed to represent the state of the art which is constantly evolving.
Well, literally the "state" as in what is the knowledge that everybody shares. We usually call that by something closer to "minimum common denominator".
What people usually call "state of the art" is the best knowledge that is reasonably well known. That is out of scope. If you take a look at this one, it's full of stuff that we knew not to use in the 20th century. This is typical.
I’m not sure folklore and handwaving is worse than relying on some sort of bible some mysterious organisation wrote as your source of truth.
> but because they’re mostly pounded by either 1) various types of zealots, idiots, and assholes who abuse these kind of “best practices” as an argument from authority, or 2) inexperienced programmers who lack the ability to judge the applicability,
The author might go on to make other points that are worth discussing, but lays out his supporting arguments clearly in the opening paragraph. Best practices do not necessarily do harm because they offer bad advice, they do harm because they are advocated for by zealots and the inexperienced.
My first reaction is how unfortunate it is that this particular developer has found himself in the presence of bad engineers and the inexperienced.
But then, the argument is automatically self-defeating. Why is the rest of the article even worth reading, if he states upfront what his arguments are and those arguments are very easy to refute?
It is deeply irrational to judge the merits of an idea based solely on who is advocating for that idea.
My advice to the author is to reflect on the types of positions that he accepts, the ones that have him so put off by the people that he works with that he is openly writing about abandoning what he admits could be sound engineering practice, solely based on who that advice is coming from and how it is being delivered.
Developing software is complicated. It is constant problem solving. When solutions to problems come about, and we abstract those solutions, it is quite easy for individuals to misapply the abstraction to an inappropriate concrete. To drop context and try to retrofit a lousy solution because that solution was appropriate to a slightly different problem. But at the end of the day, these abstractions exist to try and simplify the process. Any time you see a "best practice" or design pattern acting as a complicating force, it is not doing its job. At that point you can either be objective and exercise some professional curiosity in order to try and understand why the solution adopted is inappropriate ... or you can take the lazy way out and just assume that "best practices" are the opinions of zealots and the inexperienced who blindly follow because they don't know any better.
> Best practices do not necessarily do harm because they offer bad advice, they do harm because they are advocated for by zealots and the inexperienced.
I think the point is that blindly suggesting "best practices" often is bad advice.
It's a common form of bikeshedding—it allows someone to give their casual two cents without doing the hard work of thinking through the tradeoffs.
we don't have to give undue value to 'best practices', nor do we need to judge an idea based on its presenter. We just need to have a reasonable discussion about the idea in the context in which it's being applied. This simple approach has been largely eclipsed in the industry by the fetishization of tools, and by the absurd notion that whole classes of approaches can be dismissed as 'antipatterns'.
It's not very hard to weigh a suggestion: speculate about its costs, benefits, and risks.
Some of those fetishized tools are themselves anti patterns.
I think the problem is that a lot of computer nerds are a bit OCD and like to apply one solution to everything. You see this in how they get crazy about strictly typed versus not strictly typed, one particular language for every application, or spaces vs tabs. I was like that when I was younger, but as I get older I realize the world is a complex place, and since programs have to deal with the real world there is no one-size-fits-all solution, no best practice that always applies. To become good at programming you need to adapt to the problem space. Best practices are great for juniors; once you've got a bit of experience you should use that experience instead.
My job revolves around adopting software solutions to solve practical problems, and let me tell you, this mentality of one solution for everything goes beyond just the coding. I've encountered countless developers who seem to believe that the reality of a business should conform itself to how the developer believes the business/industry should operate.
Funny, my experience is the opposite. When I was younger I thought there was a time and place for everything, a right tool for the job, a need to carefully consider the circumstances. As I got older I realised that actually a lot of libraries, languages, and communities are simply bad, and an experienced programmer is better served by having a deep knowledge of a handful of good tools and applying them to everything.
Rules travel further than reasons.
The problem is that a lot of true things in the world are counter-intuitive. So insisting that all the rules "make sense" in an immediate way is clearly a non-starter. In the safety industry there are many examples of best practices that are bred from experience but end up being counter-intuitive to some. For instance, it might not make intuitive sense that a pilot who has gone through a take-off procedure thousands of times needs a checklist to remember all the steps, but we know that it actually helps.
It's hard because there is usually some information loss in summarisation, but we also have limited memory, so we can't really expect people to remember every case study that led to the distilled advice.
As a chemical engineer by training, though, I have constantly been amazed at how resistant software people are to the idea that their industry could benefit from the kind of standardisation that has improved my industry so much.
It will never happen outside of limited industries because it would appear to be a loss of "freedom". I think the current situation creates an illusory, anarchist freedom of informality that sometimes leads to proprietary lock-in, vulnerabilities, bugs, incompatibility churn, poorly-prioritized feature development, and a tyranny of chaos and tech debt.
There are too many languages, too many tools, too many (conflicting) conventions (especially ones designed by committee), and too many options.
Having systems, tools, and components that rarely change with respect to compatibility, and that are formally verifiable far beyond the rigor of seL4 so that they are (basically) free of (implementation) error, would be more valuable than tools that lack even basic testing or self-tests and lack the digital signatures that would prove chain-of-custody. Being able to model and prove a program or library so that its behavior can be checked deeply, by both "whitebox" and "blackbox" methods, would show that some code can stand the test of time. Choosing a smaller number of standard languages, tools, and components makes it cheaper and easier to attempt all of this.
Maybe in 100 years, out of necessity, there will be essentially 1 programming language that dominates all others (power law distribution) for humans, and it will be some sort of formal behavioral model specification language that an LLM will generate tests and machine code to implement, manage, and test against.
I disagree slightly here. There may be one (1) dominant formal language that's used as the glue code that gets run on machines and verified, but it will have numerous font-end languages that compile into it, for ease of typing and abstraction/domain fit.
Who drove that standardization in chemical engineering?
I ask, because the intra-organizational dynamics of software have been ugly for standardization. Vendor lock-in, submarine patents, embrace-and-extend, etc. have meant naive adoption of "best practices" meant a one-way ticket to an expensive, obsolete system, with an eventually insolvent vendor.
That's an interesting question. I guess it's partly the fact that chemical industry is very large-scale, often with one company in charge (think Shell or Total). The realities of having one organisation in charge of many large operations across many countries probably gives higher reward on standardisation. This is a bit like coding to "Google style guidelines" or whatever. The big organisation has more incentive to fund standardisation, but the small people can benefit from that effort, too.
The magnitude of impact also means that many industrial plants fall under government regulation, and in the safety field specifically there is a lot of knowledge sharing.
I think there is also a component about the inflexibility of real matter that factors into this. It's much harder to attach two incorrectly sized pipes together than it is to write a software shim, so the standardisation of pipe sizes gets pushed up to the original manufacturers, where it also happens to be more economical to produce lots of exact copies than individually crafted parts.
Yeah. The capital structure is radically different. And physical interop is a different game.
I suspect we would have a defined Software Engineering profession if there were only a few dozen vertically integrated firms.
The advantage of best practices is that you have something you can follow without having to analyze the situation in depth. The disadvantage of best practices is that you may have to analyze the situation in depth to notice that they maybe aren’t the best choice in the specific situation. The harm that best practices can do are lessened by viewing them as a rule of thumb conditioned on certain premises rather than as a dogma.
“Dogma” is the key word in this situation, I believe (and in a lot of similar situations). There are very few examples for when dogmatic obedience is healthy, helpful, or appropriate. Sadly, the trends seem to be heading the wrong way, with more tribalism than pragmatism.
I'm proud of my little joke on dogmas: "The only dogma I believe is that all dogmas are wrong".
I like to use the term "golden path" instead of best practices.
On a golden path, lots of others have gone before you and figured out all the nuance. This doesn't mean the path is the best one for you, but it does mean you should have a good reason for straying from it.
To which they will respond with something from their VC deck, like, “because we’re Uber for <whatever>”
Buddy that’s not a reason, that’s a rationalization.
That's why I think Best Practice is a bad term. I'd rather call it something like Best Toolkit or Best Elements.
Maybe the problem is the word Best. Perhaps Standard Practice is better — suggests "the usual way we do things" without presuming it is the best and only way to do things.
How do other engineering industries deal with this phenomenon? Why don't those approaches work for programming? I feel silly sometimes, because software development is a huge industry and we don't have consensus on the basics.
For example, I think strict formatting is a good thing. Since I first tried Prettier I've been using it and similar tools everywhere, and I like it. I can't do vertical alignment anymore, and it eats empty lines sometimes, but that's a good compromise.
Maybe there should be a good compromise when it comes to "best practices"? Like: "DRY" is not always best, but it's always good enough, so extract common stuff every time, even if you feel it's not worth it.
I often deal with this dilemma when writing Java with the default IDEA inspections. They highlight duplicated code, and then I need to either disable the inspection in some way or extract a chunk of code that I don't really think should be extracted; but at least I can just do it and move on...
Those approaches do work with programming, but they don't make use of what makes programming different from other disciplines.
Software is usually quick to write, update and deploy. And errors usually have pretty low impact. Sure, your website may be down for a day and people will get grumpy, but you can hack together a quick fix and have it online with the push of a button.
Compare that to, say, electrical engineering, where there's often a long time between finishing a design and getting a manufactured prototype (let alone mass production.) And a fault could mean damage to equipment (or people) and the cost of having to replace everything. So you'll find that there's a lot more work done up-front and the general way of working tends to be more cautious.
There's also the idea of best practices as a form of communication. This also helps for programmers, as code that looks and acts the way you expect it is easier to follow. But code is primarily shared with other programmers. Other engineering disciplines (more) frequently need to collaborate with people from other domains. For example, a civil engineer's work could be shared with architects, government bureaucrats and construction managers, and best practices often provide a common familiar standard.
Compared to other engineering disciplines, software is a big unorganized mess. But it's also incredibly fast and cheap to make because of that.
You can destroy rockets, lethally irradiate people, fly planes upside down, or financially ruin a company because of software bugs, so avoiding faults can be critical for software as well.
It is just that high-velocity low-reliability web and consumer application development is a very large niche. A lot of our best-practices are about attempting to maintain high velocity (often with questionable results), more than increasing reliability.
> It is just that high-velocity low-reliability web and consumer application development is a very large niche
And most of them have no care about the user experience of the end user at all.
Almost every piece of software I have to interact with on a daily basis is absolute garbage, full of frustrating bugs that make most of my day miserable when I'm forced to use a computer. But to each of the devs it's just a small annoyance in their particular app; none of them care that, for the end user, it's one more cut in a death by a thousand cuts.
Software is just atrocious nowadays.
> How other engineering industries deal with this phenomena? Why those approach do not work with programming?
A lot of engineering discipline is a way to prevent engineered works from causing unintentional injury, physical or fiscal.
Most software development is far away from physical injury. And fiscal injury from software failure is rarely assigned to any party.
There's no feedback loop to push us toward standardized process to cover our asses; we'd all prefer to do things our own way. It's also pretty hard to do convincing studies to determine which methods are better. Few people are convinced by any of the studies, and there hasn't been much in the way of "company X dominates the industry because of practice Y", like you saw with Toyota's quality practices in the 80s and 90s.
Other engineering disciplines have certification, codes and regulations for specific domains, which are enforced by law.
DRY is a perfect example, though, of something which in moderation is a good idea but, as the article says, is vulnerable to 'inexperienced programmers who lack the ability to judge the applicability', and which, if over-eagerly applied, leads to over-abstraction and premature abstraction that does more harm than good.
Before regulations, other engineering disciplines have far more objective decisions and calculations than software engineering. Consider a mechanical analogue of DRY: choosing between reusing identical parts to make design, assembly and repairs simpler or designing similar but different parts because they are worth optimizing (e.g. a whole IKEA cabinet with interchangeable screws or with short and long ones). Unlike next month's shifting software requirements the cost and performance of this kind of alternative can be predicted easily and accurately, without involving gut feelings or authority.
Well I think the point is you can’t legislate on things like style, or at least it is pointless to do so and other disciplines don’t try to. DRY is a style guideline.
What you can legislate/codify are procedures, safety and outcomes. So for example building designs must be signed off by a structural engineer and an architect, both of whom are liable if the building collapses and kills someone. There are standards that materials must meet and rules for which materials can be used. Buildings must meet standards for fire protection, air flow, heat loss, etc.
I’m not sure software is at the stage where we even know what to codify or what is good and what is not good.
>> inexperienced programmers who lack the ability to judge the applicability
In other words, the author knows better than you.
The author could have put forward precedent, principles, or examples. But instead he chose to make it about the people (inexperienced), not his arguments.
The point is that there are no rules that are always applicable. DRY is right sometimes, and sometimes it's not.
I think one thing the industry does not do properly is applying different practices and standards depending on context.
A retailer's website is not the same as a trading platform, the same way that a house is not the same as a railway station. But we blindly try to apply the same "good practices" everywhere.
We also have another interesting phenomenon: our products can mutate over their lifetime, and our practices should follow (they often don't). An MVP can become a critical system, a small internal project can become a client-facing application, we can re-platform, re-write, etc. That's very rare in other industries.
What makes you so sure they do? Go to the hardware store and behold how many fasteners there are. Go down the rabbet hole of pipe fittings. Consider the optimal size of lumber, someday.
And then get ready for the horrors of electrical connections. Not necessarily in how many there are; the real horror is how many think there is a "one true answer" there.
You can find some solace in learning of focusing effects. But, focus isn't just getting harder for individuals. :(
That's a great point.
In the end, other engineering areas also have lots of "it depends" situations, where often there are multiple correct answers, depending on availability, legislation, safety, physical constraints, etc.
Perhaps in software engineering people are just too quick or immature to judge.
> rabbet hole
Nice pun ;)
I'd love to claim the pun was intended! :)
A nit: DRY is probably not what you think it is. DRY is basically the same as SRP, framed differently. In SRP, it's totally valid to have the code twice if it has a different meaning from a user's POV.
Quoted: "You can still not repeat yourself but violate single responsibility"
(https://softwareengineering.stackexchange.com/questions/2207...)
Quoted: "Likewise, you can repeat yourself but classes only have a single (duplicated, but subtly different) responsibility."
I think it's the same thing, but I usually postulate DRY as semantically identical code, not merely syntactically identical.
"Byte for byte equivalent" doesn't necessarily mean it's a copy, if the semantics of it are different.
The problem with that definition is that it's subjective and cannot be checked automatically. So my definition was about mechanistic DRY, which is objective and can be checked by a linter.
I think we're still in the "phlogiston" era of software development.
That good?
I think we're more like Pythagoras: some useful theory about numbers, taken way too far and became an actual religion[0] listening to the Delphic[1] Oracle[2].
[0] Tabs or spaces? Vi or emacs? Is the singularity the rapture of the nerds, with Roko's Basilisk as the devil and ${insert name here according to personal taste} as the antichrist? SOLID or move fast and break things?
vs. really a real religion: https://en.wikipedia.org/wiki/Pythagoreanism
[1] Not https://en.wikipedia.org/wiki/Delphi_(software)
but rather https://en.wikipedia.org/wiki/Delphi
[2] Not https://en.wikipedia.org/wiki/Oracle_Corporation
but rather https://en.wikipedia.org/wiki/Oracle
> How do other engineering industries deal with this phenomenon?
They don't. CAD, the "programming language" of most other engineering disciplines, is just as much of a Wild West.
I'd say yes and no: there are standardized ways to analyze common engineering problems, for example beam deflection equations https://en.wikipedia.org/wiki/Deflection_(engineering)
or Heat Exchanger efficiency calculations (https://en.wikipedia.org/wiki/Logarithmic_mean_temperature_d...) etc.
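To make the contrast concrete, the LMTD figure is a single closed-form expression (writing $\Delta T_1$ and $\Delta T_2$ for the temperature differences between the two streams at each end of the exchanger):

$$\Delta T_{\mathrm{lm}} = \frac{\Delta T_1 - \Delta T_2}{\ln\left(\Delta T_1 / \Delta T_2\right)}$$

Once the assumptions are accepted there is nothing left to argue about; software-design guidelines almost never reduce to anything like this.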
Often the models and equations rely on making assumptions in order to simplify the problem (cue the joke about the physicist and the spherical cow). This is one of the reasons things are designed with tolerances and safety factors.
Software like CAD and particularly Computational Fluid Dynamics (CFD) packages can simulate the problem, but at least with CFD you would typically also perform other types of verification, such as wind tunnel tests.
I'm not sure that's analogous to "best practices" like "do not repeat yourself (DRY)" or "don't use GOTO". These are little more than stylistic choices that claim to offer more maintainable code. Comparable "best practices" in other engineering fields would be along the lines of "do not chamfer/fillet until the end of modelling" (one I have heard before).
Analyzing a CAD model as you describe is more like running a compiler or type checker on code already written, which is the norm in software too, but is not in the vein of the topic of discussion.
Some reasons I would say:
- I think SW needs much more creativity than other industries.
- Typically SW is not mission critical (in mission critical things, it IS pretty much regulated to uncomfortable extremes)
You could regulate it to death, and it would probably have some positive impact by some metric, but you would be easily overtaken by FOSS, where there will surely be fewer restrictions.
"Engineering industry" is a misnomer. Other engineering areas have the same issues with best practices we have, industries apply best practices with a great amount of success (but not to totality).
Usually, engineering creates best practices for the industries to follow.
Other engineering industries use standards that are good for stability but bad for efficiency and innovation.
Only software engineers pretend best practices exist outside of any useful context.
- small localized team vs big distributed team
- bug fixes and incremental improvements vs green field poc
- saas vs system scripts
Context matters, and if people aren't giving you the context in which they deem practices to be "best", they are myopically wrong
So, setting aside coworkers who have a hard time collaborating in general, do others find it a problem that their coworkers won't apply context? That has not been my experience.
I don't understand the question... If someone has a strong opinion, and they have arguments for their opinion, but don't recognize the significance of the context in which they've formed their opinions, they have blind spots they aren't aware of. Is that a problem? I dunno, that's up to you and your environment.
> Only software engineers pretend best practices exist outside of any useful context.
I am not seeing this issue with programmers in general or with my coworkers, with the exception of those who in general have a hard time collaborating with others.
So my question was/is: if you discount the above exception, are people seeing a problem with programmers/coworkers not taking context into account? I have not noticed a widespread issue and I am interested in how prevalent you, and others, perceive it to be.
Aren't these discussions the evidence? The fact that the author wrote a blog post and we are here discussing it. I might be missing the point of your question. This is everywhere around us in the development world. Anytime people compare react to htmx, redis to postgres, TDD vs BDD.
I'd like to point out I never called it a problem. I said that was a judgement call for you to make. We all have harmless biases.
But yeah, it can be a problem. If I have an engineer derailing my team because of his insistence on svelte, and he can't read the room, i.e. can't take any of the context of the business, stack, domain, or team into consideration, then yeah, it becomes a problem. Time is money.
(svelte isn't a good example, it's not a best practice per se. s/svelte/TDD/)
> But yeah, it can be a problem. If I have an engineer derailing my team because of his insistence on svelte, and he can't read the room, i.e. can't take any of the context of the business, stack, domain, or team into consideration, then yeah, it becomes a problem. Time is money.
I would describe this as someone who does not know how to collaborate: maybe they don't know the balance they need between give and take, maybe they don't know how to present their ideas so the group understands them, maybe there is some fundamental misunderstanding. Since the tool of collaboration is not working for them, they reach for other tools of leverage to achieve their goals, like argument by authority via a convenient best practice.
The best practice/standard was not the issue, and the lack of context for the best practice was not quite the issue either; the lack of collaboration, or the ability to collaborate, is the issue.
It sounds like even though the author has read extensively about various "best" practices, they did not really gain an understanding of the tradeoffs involved with each one.
> “Don’t Repeat Yourself” (DRY) is basically good advice, but sometimes just copy/pasting things is just the more pragmatic thing to do, and not really a big deal.
Duplicating code on purpose is not about being pragmatic, it's about recognizing when DRY would violate the single responsibility principle.
The ability to weigh tradeoffs in context is what makes some engineers better than others.
The problem with DRY is that it is expressed incorrectly and misleadingly. There is little inherent benefit in avoiding code duplication, and it can do harm when done for the wrong reasons. The actual context is change management. Will the copy of the code likely have to change in the same way as the original? Only then should duplication be avoided. The rule taken literally fails to convey this important precondition and the reasoning behind it. (And so does WET.)
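A hypothetical C sketch of that precondition: these two functions are identical today, but they track different policies and will diverge, so merging them would couple unrelated changes.

```c
/* Byte-for-byte duplicates today, but governed by different policies:
   tax law vs. a marketing promotion. They will not change together,
   so the duplication is deliberate. */
double sales_tax(double price)      { return price * 0.10; }
double promo_discount(double price) { return price * 0.10; }
```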
> It sounds like even though the author has read extensively about various "best" practices, they did not really gain an understanding of the tradeoffs involved with each one.
It sounds to me like they did understand the tradeoffs. But that they were being brow-beaten to apply "best practices" that were inapplicable because of the tradeoffs.
I have an on-going topic with friends at work about what accessibility "means".
It annoys me to no end when devs talk about some specific technical change "increasing accessibility". The accessibility best practices are used as a checklist, where more checks = more accessibility points = better. It results in people gaming the score with meaningless repetitive metadata or low-impact tweaks rather than actually improving the descriptive/alternative/touch/visual interface, usually without ever trying any alternative method of interacting with the interface.
The best practice is "always include metadata", but it leaves off "... that adds context about the element rather than noise, and integrates with a surrounding application that uses consistent metadata labelling. Remember, this is a portion of a complete descriptive interface someone has to use."
These best practices being driven into people's brains verbatim means conversations devolve into inane on-or-off taxonomy discussions like "is this colour accessible?" or "have we added accessibility? Do you need a ticket for that?" where pushing back isn't seen as caring about users, it's seen as being "against accessibility".
https://graypegg.com/2023/11/25/the-private-definition-of-ac...
IME "best practices" is a demand problem, not a supply problem. IE far more people want best practices than there are actual best practices.
Prospects and customers desperately want to know our "best practices" and then complain when we say "it depends" or something experimentation is required, as if we are hiding secret teachings from them.
For me this is more a personality test: people who just want solutions on a silver platter vs DIYers who want to tinker and distrust black boxes.
That's an interesting point, which leads me to my main reason for coming to these comments and leaving my 2 cents: there are way fewer best practices out there than one would believe by looking at all the places, people and firms offering some set of "best practices".
One thing I learned after many years working in consulting is that, more often than one would believe, best practices are just a compilation of whatever could be found (hopefully at least common practices, more often "things I could find that were minimally documented to be reusable"), with no serious analysis of their claim of superiority other than them being common.
So, first thing: learn to challenge the claim of "best". Best for whom? Under what context? What other not-so-good practices are out there, and why is this the best?
Second: if it's documented and evident enough to be treated as a best practice, it's probably fairly common knowledge already. Barring the commonality of really bad things being done out there, don't expect to become much more than mediocre by adopting best practices. By the time they get to be called that, they are no longer any competitive advantage, more a basic thing you should be doing already - assuming they are indeed best practices (as per my previous point).
It's not that I'm against best practices as a concept, or compiled bodies of knowledge. Just don't expect them to do more than keep you somewhere in the middle. True leadership and innovation lies where best practices have not been established yet - together with all the dangers and mistakes you can make on uncharted waters.
That's some weird ligature, on that font...
I don't think that the issue is with "best practices," or any other type of dogma.
I think the main issue is that companies tend to hire folks who aren't especially skilled at what they do and rely on excessive structure to compensate, or who don't stay around long enough to get comfortable with the structure.
This can apply both to newer folks, who don't understand the system well enough to deviate effectively, and to highly experienced ones, who have allowed themselves to become so hidebound that they are afraid to deviate.
As I have gotten older, wiser, and more battle-scarred (often, from self-inflicted injuries), I have learned that "It Depends™" is the only true mantra for my work.
Usually, best practices/dogma/structure becomes important, when the codebase is being worked on by a team, and when there's the need to coordinate work between teams.
There's some type of work that just can't be done, without structure. I've done that type of work. Other work can be killed by too much structure. I've done that kind of work, as well.
There are 2 main kinds of advice that get labeled best practices in an attempt at persuasion:
1. Advice which worked in one situation - “we tried Agile, everyone should use it!”
2. Proven stuff like code review, which you call best practices when begging your org to implement it: “please let’s do it, I can clearly see how this will improve our org.”
These 2 examples represent locations on a spectrum: let’s call it “provenness”.
The author’s problem boils down to subjectivity - everyone positions different practices in different places on the provenness axis. The upshot of that is when one person says “we should do this, it’s obvious and/or it’ll definitely help” another person hears “we tried it once, you should try it too!” and then everyone has a bad time.
Then it gets confounded by everyone calling everything best practices - no matter how long ago or unproven the practices might be.
What would be handy is some generally agreed-upon yardstick for graduating practices into or out of best-practices status, plus better terminology to cover the spectrum of provenness, so more sophisticated discussions can be had that account for the nuance and we don’t all talk past each other.
But then analyst companies wouldn’t get to make their glossy 2x2 charts, so it probably won’t happen.
"Don't do X" is bad. "Don't do X because it's not a best practice" is less bad, but still bad. "Don't do X because Y" is good.
Only with the last one can you either understand the reason or ignore the rule because it doesn't apply to your situation.
Best practices are tough to establish where the foundations are not stable. And with programming, we have trouble defining the foundations, much less stabilizing them.
And note I don't mean stable as in, not crashing. I mean it as not changing.
For a while, this was doable with Java. For all its warts, it gave a good foundation. Industry practice got caught up in startup enthusiasm, though, and that went out the window.
Similar could probably be said for Windows. I was not a fan of its model, but it provided a stable base for business apps for a long time.
The author sort of goes against his own advice by not diving into the specifics of why he doesn't like certain things. I get that such a piece would be longer, take more time and effort, and would draw more argumentation, but that's exactly what he's asking for in the piece itself.
Software around us is quite terrible, so it’s kind of obvious that one shouldn’t follow “best practices” that led to this state of things.
Maybe if people didn't follow best practices their software would be terrible-er? Guardrails on hard things don't imply that removing the guardrails would make those things easier.
That implies they were created with best practices in the first place.
If not, then what was created with best practices in the first place?
If we can agree that most large, financially successful software projects are of questionable quality, then either
- they used best practices and yet they still suck, OR
- they did not use best practices, but are wildly successful anyway.
So no matter how you look at it, software best practices just haven't panned out.
"All hardware sucks, all software sucks."
The name “best practices” kind of implies that they actually are practiced somewhere. So it’s different from theoretical abstract ideas “how we should write software”, which maybe nobody follows.
Like any other discussion of this kind [1], I think this one will go nowhere because 1) the question doesn't have a black-and-white answer; it's a sliding scale; 2) almost no one gives examples of what they mean - if they did, they might find that they agree with the person they're replying to; and 3) most people will discard the experiences of people they disagree with ("you didn't even try! maybe the problem is you!"), and this is easy because this is the internet and the other person may well be making it up.
[1]https://www.joelonsoftware.com/2009/09/23/the-duct-tape-prog...
The longer I program the more I agree with this take. I have seen too much power and dominance from people who wield it as a cudgel against everyone else in the room who doesn't care enough to double-check the practice.
Put another way, knowledge is knowing best practices, but wisdom is knowing where and when to apply them. Unfortunately, most people building software have only the knowledge, and there is too little consideration for the fact that structure is not free: you have to weigh when velocity is the primary objective versus safety and adherence to best practices.
It all comes down to architecture design in the end.
“Best practices” have nothing to do with engineering. They are a marketing concept - a kind of marketing better known as bullying.
In every case where you want to say “best practice” there is a better alternative, which is to say “practice.” The concept of “best” is never needed or warranted, because practices are not subject to rigorous testing and review.
I have been an independent consultant and trainer since 1999 and not once have I taught or recommended a best practice.
I do have many opinions. I call them: opinions. I think my opinions are the best, but I can’t think of any reason that anyone else beyond my wife and dog should think so.
I agree with the sentiment of the article, but Postel's law is a good idea that has very little to do with context. Of course the real problem is that it's a very small piece of advice that needs to be put into context each time, and here is the failure of all advice of this kind: it is no substitute for intelligence and good design taste.
Perhaps it's due to the very large number of factors in a code base and/or the particular problem space.
But as I got more senior, when asked by less experienced developers the best way to do something, my answers tended to increasingly start with: "Well, it depends...".
And that's the thing: there is no universal best practice for everything. There are solutions which are more often than not good enough, but as all solutions tend to be trade-offs favouring a particular scenario over another, sometimes they're also highly inappropriate to the particular circumstances.
Another term for someone trying to blindly apply supposed best practices is "cargo culting".
In summary, there's lots of nuance to software development and the particular circumstances it's being applied to, meaning that you need to understand the trade-offs of particular solutions to see which one makes the most sense for a particular situation.
Someone once told me something like, “rules work best when human judgement is used when applying them”.
I think that’s what this article is basically saying. And I agree.
> “use single point of exit”, and that kind of absolute insane bollocks
As always, it's a matter of context. If you have a low-level language like C without any automatic cleanup, having an out label to goto that cleans up in all cases makes a lot of sense.
There's always some reasoning behind those rules, sadly rarely talked about directly. You should check whether that reasoning applies to you and weigh the benefits against the costs. A lot of the time it's just doing what everyone else is doing, which is a perfectly fine reason, as it makes onboarding new developers easier.
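A minimal sketch of that cleanup-label idiom (simplified, with an invented function name; free(NULL) is a defined no-op):

```c
#include <stdio.h>
#include <stdlib.h>

/* Single point of exit: every failure path jumps to one label that
   releases whatever was acquired up to that point. */
int process_file(const char *path) {
    int ret = -1;
    FILE *fp = NULL;
    char *buf = NULL;

    fp = fopen(path, "rb");
    if (!fp) goto out;

    buf = malloc(4096);
    if (!buf) goto out;

    if (fread(buf, 1, 4096, fp) == 0) goto out;

    /* ... use buf ... */
    ret = 0;

out:
    free(buf);            /* safe even when buf is still NULL */
    if (fp) fclose(fp);
    return ret;
}
```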
I once put the following in an e-mail to a consultant who chalked up an awful decision to "applying best practices": "best practice applied for the sake of best practice is bad practice".
Very handwavy. Just like the aforementioned zealots, idiots and assholes.
While I guess most would agree that all the cited practices have been abused, what is really the alternative? Absence of practices does not make things better - history has plenty of lessons there. To run medium-to-large-scale projects one needs patterns to ease communication and integration.
The real question is how we prevent best practices from being perverted, and I fear the answer is having the right people in the right places. The one best practice to rule them all: have knowledgeable, balanced people who know when to break the rules.
The alternative is to accept that every solution has drawbacks and trade-offs. Best practices were an attempt to codify design standards that would be equivalent to "buying IBM" in the sense of the old phrase, "Nobody gets fired for buying IBM."
That was always a bad idea. Often the best choice in one context would be a bad choice in other contexts. You don't want an engineer in a 20-person startup making decisions like they're at Google, or vice-versa. You have to take responsibility for deciding what's best for a particular problem in a particular context, without looking for the cover of "everybody does it this way."
No.
I've worked too often with people who think they know better
They do not
Straw men do not change the fact that "best practices", especially the ones quoted, are very good.
No sensible person is saying "never use globals". We caution you to think very carefully before you do, only for truly global state.
I am suffering from inherited code, written by a very good programmer, who got lazy with globals and comments. Too many of the former, too few of the latter. What a nightmare.
This article is arrant nonsense
I've been on a team where most comments I added were deleted, because they believed it was best practice to avoid comments getting out of sync with the code.
So even your comment disagrees with your claim that "best practices" are very good.
> No sensible person is saying "never use globals".
Maybe so, but still, plenty of people are saying it.
"Postel's law" is an ironic name, I don't think the people who coined that term meant that it was an infallible rule of the universe, just what seemed a good rule of thumb for implementing protocols.
I still like Postel's law, but don't imagine for a second it's a 'law' with the kind of authority the article implies, and didn't enjoy the article's slime about people who like it!
So we shouldn't:
- Use source control?
- Have build automation?
- (at least some) automated testing?
- Protect data structures accessed concurrently with a mutex? (see the sketch after this comment)
- Have backups?
I wouldn't say there isn't some imaginary situation where you'd do something different, but it's safer to fall back to doing the above in most situations.
That said, many people make up "best practices" or use them without understanding the circumstances to which they apply.
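Taking the mutex item as the one concrete example here, a minimal pthreads sketch (hypothetical counter):

```c
#include <pthread.h>

/* Shared counter guarded by a mutex: without the lock, concurrent
   increments can be lost; with it, each increment is atomic with
   respect to other threads taking the same lock. */
static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;

void increment(void) {
    pthread_mutex_lock(&lock);
    counter++;
    pthread_mutex_unlock(&lock);
}
```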
Best practices and such shouldn't be obeyed as "laws" but treated as guidelines for general work. Once in a while there's good reason to skip them.
The bigger issue is that developers (me included) are usually making the decisions in their own heads. Usually the reasons are quite ok but they are not really said out loud or documented.
I've stumbled upon this both as a developer trying to get their code approved and as a code reviewer.
For the developer, it feels super annoying when somebody nitpicks things the developer has probably already gone through in their head before arriving at the resulting solution. Just like Martin in the post, they complain and react passive-aggressively towards reviewers who mention these things.
For the code reviewer, it feels like the developer is being sloppy or doesn't care about the common way of doing things, which increases the need to nitpick and explain how it should be done.
The solution is actually quite easy: document in the comments and in the pull request _why_ you're breaking the team's/company's guidelines and why you think it's warranted in this case. This seems to remove quite a lot of friction.
Oh, and of course it doesn't mean that the guidelines should be broken all the time just because one thinks this kind of stuff is for "idiots|assholes". When working as a team, we need to adhere to a common way of working most of the time.
I see "best practices" as just case studies. Here's one possible way to do it. If it intrigues you, you can do the same.
Most of these "best practices" have valid opposing camps anyway. There's DRY, but there's also "a little copying is better than a little dependency". Both are valid in different contexts.
For me I will always try to seek out best practice as an inspiration - to find out what the most knowledgeable people in this field do.
But if I can't understand why I should do that instead of something else I've thought of, then I'll do it my way thank you.
Following "Best practice" isn't about doing what is "best", it's about being predictable.
In many cases, being predictable is better for future maintenance than forging your own path, even if the custom solution is "better" against all (current) metrics.
"Best practices" is a tool that you need to know how to use it. Same as any other tool. If you do use the same tool for everything, this is when a tool becomes a bad thing.
"Best practices" is a really good tool if you use it in the correct context.
"Because it's best practices" can not be used as an argument.
I've found that "best practice" generally means "what you will not get fired for doing". Sometimes it is a good practice. Sometimes it's something the higher-ups read about in CTO magazine and decided that everybody now must do. I'd say use your individual judgement, but depending on your role and the company, using your own judgement may be beyond your pay grade.
Teaching threads in an operating systems course, I have witnessed some of the most convoluted code ever, written by students so afraid of global variables that they turned simple exercises into monstrosities. So yes, f*ck the rigid rules. They had to unlearn best-practice guidelines that had been taught to them as blood-written sacred rules.
Well, in a class environment, they've probably figured out that it's not a sane or correct answer that gets the points. It's the answer that follows all the rules set in the material for that class. It's extremely typical for teachers to grade based on the "best practices" and "sacred rules" that the classes teach. Showing ideas, outside knowledge or solutions that are simpler than the course material just gets a student punished.
most "best practices" are just trends that got way out of hand. Microservices, long variable names, XML, TDD, SCRUM, OOP were just marketing concepts that turned into a cult.
Books like Code Complete are useful tools but not bibles.
Just like everything on the internet: it's just another person's opinion. What matters is what works (i.e. makes you money).
Best practices for a Google-scale company, which is where most best practices come from, are NOT best practices for everyone else.
As someone at Google, I hardly believe they deserve to be used at Google either. We have like 20% of the efficiency we could have if we were more selective about procedure.
The first paragraph of this is the truth. There is something wrong with people who enjoy telling other people what to do.
>There is something wrong with people who enjoy telling other people what to do.
Telling other people what to do is humanity's favorite thing. Give a man an opportunity to make and/or enforce rules, and you have a very happy man. People dedicate their whole lives to reaching stage after stage in their rule-making game.
Great article! I feel like I am dealing with so much of this behaviour at my job recently. It feels like most of the people espousing Best Practice never learned that "appeal to authority" was not a reasonable argument.
It occurs to me that the author is explaining his best practice is to look at someone else's best practice with a gimlet eye and take what you want from it. So I wonder how to apply his best practice to his own best practice?
Best Practice is to not care about the stupid asshole who came up with a brilliant method that works.
I tried to have the grammar checked by chatgpt but it was too challenging
I want engineers to think about what's best for the project, but I let engineers talk about the best implementation because it's better than doing irrelevant things.
> “Don’t use globals” is obviously good, but “never use globals under any circumstances” is just silly
I'm actually not even convinced that this is a good rule. When it's explained, it makes sense on the surface. I also think that since beginners to programming use global variables, people conclude that globals must be universally bad since that's what their beginner selves did.
Having worked on codebases where most state is stored in global variables (in a single namespace at that), it is manageable at worst and easy to develop on at best, assuming certain conventions are followed.
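One such convention, sketched in C (the names are invented): hang all global state off a single struct so every access is explicit and greppable.

```c
/* Hypothetical convention: one struct, one instance, one prefix.
   Grepping for "g_app." finds every read and write of global state. */
struct app_state {
    int  verbose;          /* logging verbosity */
    long requests_served;  /* runtime counter */
};

static struct app_state g_app = {0};
```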
The real problem is dogma and cargo culting.
I like to phrase this kind of thought as: "The only dogma I believe, is that all dogmas are wrong".
Living through the 2000s was enough to give me a permanent aversion to "best practices." So many things that we look back on in horror were smugly and haughtily justified using those words.
There was a culture of best practices zealots that had an uncanny resemblance to organized religion. All the answers have been written down for us, and everything that goes wrong is because of arrogant, misguided people who decided to trust their own fallible thoughts instead. Their faith was strong enough to survive any project, no matter the outcome.
There is a whole book on this topic. I think it is called Street Coder.
It is an artificial standard for review. You can break it, but only do so if you can justify it.
Ralph Waldo Emerson: "A foolish consistency is the hobgoblin of little minds, adored by little statesmen and philosophers and divines. With consistency a great soul has simply nothing to do. He may as well concern himself with his shadow on the wall. Speak what you think now in hard words, and to-morrow speak what to-morrow thinks in hard words again, though it contradict every thing you said to-day. — 'Ah, so you shall be sure to be misunderstood.' — Is it so bad, then, to be misunderstood? Pythagoras was misunderstood, and Socrates, and Jesus, and Luther, and Copernicus, and Galileo, and Newton, and every pure and wise spirit that ever took flesh. To be great is to be misunderstood."
https://en.wikipedia.org/wiki/Wikipedia:Emerson_and_Wilde_on...
(This is relevant to the extent that programming is as much art as science/engineering.)
Follow best practices unless you can give a reason not to. "Best practice" is a shorthand for a lot of accumulated knowledge that you don't want to go over again every time. Also following BP makes the code more consistent and thus easier to understand. But when an argument arises, go back to the underpinnings of the best practice and work from there.
Basically if you know exactly why the best practice/rule is in place, and know for sure it does not apply, just skip it. But not before.
https://en.wiktionary.org/wiki/Chesterton%27s_fence
That's very reasonable.
‘The reasonable man adapts himself to the world; the unreasonable man persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man.’
> Therefore all progress depends on the unreasonable man.
The unfortunate corollary to this is that all retrogression also depends on the unreasonable man. The reasonable person (as defined here) maintains the status quo, for good or ill.
I would reverse this: if you can explain to me exactly why your so called "best practice" applies here and now, good. Otherwise it's a nice input to have in the discussion, but nothing more.
It depends on your risk appetite. A best practice likely keeps you away from pitfalls known to the community of the practice but not to you. It also may keep you away from crafting a solution that is optimal for your specific circumstances not known or applicable more widely.
In high-reward / low-risk environment, such as building an indie turn-based retro-style game, go with your gut feeling unless you have a good reason not to.
In a high-risk / dubious-reward environment, such as implementing cryptography, follow the best practices to a t, unless you know intimately how things work and maybe codified some of the practices.
There is a wide gamut between these two extremes.
> A best practice likely keeps you away from pitfalls known to the community of the practice but not to you.
In my experience, many "best practices" are the pitfalls you should be wary about, as they can easily translate into hundreds or thousands of lost hours of work and derail and doom entire projects. (The most annoying part of this is that the real causes won't be found, precisely because "best practices have been followed". Therefore the reputation of the best practice will stay untarnished).
Cryptography on the other hand is a well known example of something you should not touch at all unless you are an absolute expert- that's not even a "best practice" but probably the only reasonable practice.
I could agree. It boils down to “you have to use your brain, not invent some rules and follow them blindly”.
Often what is one developers "best practice" is another's "anti-pattern" because a lot of this is just arbitrary.
If it is arbitrary, it’s “standard practice”.
Which still has immense value.
It's standard practice to install outlets with NEMA connectors in North American buildings. Sure, you could technically swap those out with a more optimal connector that is "better" (one that prevents electricity from flowing while the plug is partially exposed, for example), but using the standard practice is best practice for human-shaped reasons that are often not apparent to early-career engineers.
I’m a bit confused with the analogy here. Would the non NEMA outlets work with my existing things or is the implication that they wouldn’t?
They wouldn't, but about half of the developers commenting here would do the equivalent of switching from NEMA to something else on the grounds that the something else is better.
There’s usually nothing “best” about it.
There are many like that. Every practice is a trade off.
However, there are many where the cost/benefit ratio is so large that you can default to "you should just do this".
I don't think I'd ever look at a company that, for instance, had no CI or versioning for a large project and think "they might have had a good reason for this". They didn't.
Hot take: the biggest advantage to following «best practices» is that when someone else stumbles over your project, they can follow along more easily.
> Follow best practices unless you can give a reason not to.
Cargo culting much?
I'd say, follow best practices only if you can say exactly why it is best practice.
I don't think so. For example, if I'm writing something in a programming language I don't know by heart, I start by following the best practices recommended by the language authors, then start to flex them when they become less useful, until I hit a nice pitfall which those best practices were designed to avoid.
This allows me to start faster and be in better shape in the short term. Then I can flex the advice more and more as I understand how the language works under the hood.
I also read the language tutorial first. But if they don't explain their best practices (and often the explanation is simple), I don't care much for them.
I didn't mean tutorials, I meant guides. For example, Go has guides on how to do things, e.g. How to Organize a Go Module [0].
[0]: https://go.dev/doc/modules/layout
What if the "best practice" was invented by a VC funded company (like, just for example, Vercel) desperate to maintain their growth trajectory by any means necessary, including huge marketing pushes to convince everybody the "best practice" is the "best way to do things"?
It's downright dangerous to assume a "best practice" in software development somehow automatically means it's some super distilled wisdom juice. A whole lot of it, in my experience, is just hype and rapid, unquestioning cargo culting slamming in right behind the hype.
Use your fucking brain. If the "best practice" is objectively a good solution to your problem then use it. If not, think of something better.
I mean, consuming anything without vetting the source of the information is a bad practice in general.
...or,
The best practice of best practices is vetting the source of the best practice to verify its authenticity.
No?
Yeah exactly.
A lot of the time best practice can also mean “we did it this way last time and it was ok”. I don’t think anyone is saying don’t find improvements in “what is currently defined as best practice” and if they are then that’s your problem.
I hear the phrase "best practices" not from the best practitioners, but from Dunning-Kruger types.
Recently I was told that Hungarian notation was "best practice" and I must use it.
I think the problem is that the definition of "best practices" is a little muddy.
for example:
- use source control
- don't put spaces in file names
- camelcase is better than underscores in variables
- vi not emacs
some are really best practices, some are controversial choices someone is pushing.
Some places I've worked had good policies that deliberately left some of these open, and made choices early on the troublesome ones, like spaces over tabs.
I think choosing which to define and which to leave open is how companies should define themselves.
Best practices are a way to avoid building everything up from first principles every time. They are more, “here are rules that are generally applicable so you don’t have to waste time thinking,” vs “here are laws of god which are always applicable.”
It is an error to throw out every best practice and reconsider everything just as it is to blindly follow the best practices.
IMO it’s best to apply the best practices by default and question them as they are used. Essentially trust but verify. Assuming they are right most of the time you get the benefit of speed. Assuming some of them are wrong, it still leaves room to course correct.
Maybe "good practice" is a better term.
I've seen (and started using) "current common practices" instead of "best practices" where it makes sense.
In an always-evolving technology landscape, it feels like a better representation of what we're doing, and it helps prevent the kind of dogmatic stubbornness that forms around the word "best".
Ragebait post unless you enjoy playing semantics games with verysmarts. If you don't understand the purpose of the term "best practices", they were probably made for someone like you.
The problem is when one does understand, but is being subjected to them by someone who doesn't.
yikes
I can sense the hackles rising. The arcane magi of the Guidance Council have scried a rebel wizard in their orb. Every cantrip and charm must be uttered at once, with maximal urgency! Invoke the spirits of the tomes!!!