People sometimes overstate the impact of length extension attacks. If your format has a length prefix (really common), then you may well be "vulnerable" in the sense that an attacker can append arbitrary data and the result still hashes "validly" - but a canonical form without the appended data is trivial to construct, and indeed most software would likely ignore that extra data completely.
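A rough sketch of why the length prefix defuses the extension - this assumes a hypothetical 4-byte big-endian length-prefix format, not any particular protocol:

```python
import struct

def encode(payload: bytes) -> bytes:
    # 4-byte big-endian length prefix followed by the payload
    return struct.pack(">I", len(payload)) + payload

def decode(message: bytes) -> bytes:
    # Read exactly the declared length; trailing bytes are ignored,
    # so an attacker's length-extension suffix never reaches the parser.
    (length,) = struct.unpack(">I", message[:4])
    return message[4:4 + length]

msg = encode(b"amount=100")
assert decode(msg) == b"amount=100"
# Appended garbage (e.g. from a length-extension attack) is simply dropped:
assert decode(msg + b"&amount=999999") == b"amount=100"
```

The attacker can still forge a tag over the extended bytes, but the parser never looks at them, which is the point above.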
HMAC is a neat trick to avoid length extension attacks (and other issues) in a generalized fashion, but that doesn't mean those risks actually apply in practice. (Some googling finds e.g. this paper: https://www.iacr.org/archive/fse2009/56650374/56650374.pdf which proposes an attack on length-and-key-prefixed messages, using some sha1 weaknesses and merely 2^84 memory and 2^154 queries - color me impressed, but not scared). Edit: just to be clear, I'm not suggesting anyone actually use LPMAC-sha1 given the current state of sha1.
For another example: in general it's unsafe to truncate a "secure" hash - hashes can be constructed that satisfy most security requirements yet are not safe when truncated (e.g. sha3 prepended by zeros is still safe at full width, but obviously not if you truncate the sha3-provided bits off). But I don't know of any mainstream hash where this theoretical risk actually applies (e.g. no merkle-damgard hash suffers from such a risk); nobody intentionally constructs hashes with more bits than entropy.
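The sha3-prepended-by-zeros pathology is easy to demonstrate - a contrived sketch, not a real construction anyone uses:

```python
import hashlib

def padded_sha3(data: bytes) -> bytes:
    # Contrived 64-byte "hash": 32 zero bytes followed by SHA3-256.
    # At full width it's as collision-resistant as SHA3-256, but the
    # first 32 bytes carry zero entropy, so truncating there is fatal.
    return b"\x00" * 32 + hashlib.sha3_256(data).digest()

a = padded_sha3(b"hello")[:32]
b = padded_sha3(b"world")[:32]
assert a == b  # trivial collision: every input truncates to 32 zero bytes
assert padded_sha3(b"hello") != padded_sha3(b"world")  # full width is fine
```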
It's probably still wise to stick with known-good constructions, but the risks seem overstated, and the difficulty is also overstated - assuming the primitives used aren't too flawed. Sure, it's cool that HMAC retains safety even with flawed primitives like MD5, but typically nobody is forcing you to stick with MD5. I guess the world is more complicated if you need to pick a protocol and are then unable to change it, but most applications can (with some effort) be changed. You need something safe now, not for all eternity.
So, I think the rule is simpler: this has little to do with crypto per se; just don't be unnecessarily clever, in general. Crypto makes the consequences particularly nasty, often. But that's about it.
This was a good meme that served its function well when it was needed - early enthusiasm for reusable cryptographic primitives and a failure to recognise the foot-shooting potential led to many easily broken schemes.
Now, however, "don't roll your own crypto" is dogma, and if anything we have the opposite problem of monoculture and slow progress. I think a more nuanced view is required, one that encourages experimentation when the stakes are low and more competing implementations when the stakes are high (or perhaps we should call them "complementing" - a standard ought to have multiple implementations).
As Wikipedia puts it, "Mathematical analysis of [security] protocols is, at the time of this writing, not mature... Protocol design is an art requiring deep knowledge and much practice; even then mistakes are common." How are programmers to practice, if they are not allowed to fail?
It's why you don't use a bare hash as authentication, but instead use an HMAC.
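A minimal Python sketch of the difference - the key and message here are placeholders, not from any real protocol:

```python
import hashlib
import hmac

key = b"secret key"
msg = b"amount=100"

# Naive MAC: sha256(key || msg) is length-extendable with Merkle-Damgard hashes
naive_tag = hashlib.sha256(key + msg).hexdigest()

# HMAC: the nested keyed construction blocks length extension
tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

def verify(key: bytes, msg: bytes, tag_hex: str) -> bool:
    expected = hmac.new(key, msg, hashlib.sha256).digest()
    # compare_digest avoids leaking the tag byte-by-byte via timing
    return hmac.compare_digest(expected, bytes.fromhex(tag_hex))

assert verify(key, msg, tag)
assert not verify(key, b"amount=999999", tag)
```

The stdlib `hmac` module also gives you `compare_digest` for free, which quietly fixes a second common bug (timing-unsafe tag comparison) at the same time.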