The main claim in the literature was that very high frequency words and very low frequency words would be immune to leveling, i.e., to regularization.
Very high frequency words are learned as isolated words and never fully integrated into a paradigm. Each form is learned separately and there's no overriding need to put them into some sort of pattern. Or at least not into the main pattern.
Very low frequency words can have quirky phonology and morphology because their quirkiness marks them (not in any Prague School sense) as low frequency.
The rule isn't so much a rule as a statistical tendency: irregularity gets innovated in some contexts and regularity in others. The UR dissertation, IIRC, employed a network model of associations between lexemes and paradigms, coupled with some sort of stylistic/frequency-based metric. Not a quick read, if you look at the details.
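Just to make the "statistical tendency" idea concrete, here's a toy sketch (mine, not the dissertation's model; the pivot frequencies and the sigmoid shape are arbitrary illustrative choices) of a score where mid-frequency words feel the most pressure to regularize, while both frequency extremes resist:

```python
import math

def regularization_pressure(freq, low_pivot=5.0, high_pivot=5000.0):
    """Toy score in [0, 1]: how strongly a word is pulled toward the
    regular (majority) paradigm.  High-frequency words resist because
    each form is memorized on its own; low-frequency words resist
    because their quirkiness itself marks them as rare.  The pivot
    values are hypothetical, chosen only to draw the curve."""
    x = math.log10(max(freq, 1))
    # pressure rises once frequency clears the low pivot...
    lo = 1 / (1 + math.exp(-(x - math.log10(low_pivot)) * 3))
    # ...and falls again once it clears the high pivot
    hi = 1 / (1 + math.exp((x - math.log10(high_pivot)) * 3))
    return lo * hi

for f in (1, 50, 500, 50000):
    print(f, round(regularization_pressure(f), 2))
```

The point of the shape is only that leveling is a tendency peaking in the middle of the frequency range, not an across-the-board rule.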
When the words became "irregular" also matters, as does the extent to which they seem to fit into some sub-phonology, some less common but still regular pattern. As long as "teeth" looked like "tooth" + an ending that triggered the regular fronting of /o/, it wasn't irregular. Words like "ride" were regular but were taken to be in the same category as "irregular" verbs whose vowel change is more or less regular for verbs of that category.
Last I heard, "children" was a linguistic basket case without a good solution, or at least without one that was really plausible and provable. Either the /r/ or the /en/ doesn't belong there. IIRC, the -en was original in the dialects that produced the core of English. My standard guess is that "hypercorrection" needs to be invoked: people heard forms ending in /r/ (parallel to German Kinder) and forms ending in /en/ and merged them in a non-regular, unpredictable way. Better to have both endings than to leave off the right one.