I was given the following homework problem: Let $x \in \mathbb{R}$ and $f, g \colon \mathbb{R} \to \mathbb{R}$ be differentiable at $x$. If $f(x) = g(x) = 0$ and $g'(x) \neq 0$, show that $$\lim_{t \to x} \frac{f(t)}{g(t)} = \frac{f'(x)}{g'(x)}.$$
We have already proven L'Hôpital's rule and have covered the theorems on limits from *Principles of Mathematical Analysis*, 3rd edition, by Walter Rudin.
The conditions for indeterminacy, a non-zero derivative, and existence of the quotient all appear to hold, but I am worried that, since there is no continuity condition on $f, g$ in a neighborhood of $x$, the question could be flawed: L'Hôpital's rule wouldn't apply. Furthermore, even if L'Hôpital's rule did apply, what would guarantee $$\lim_{t \to x} \frac{f'(t)}{g'(t)} = \frac{f'(x)}{g'(x)} ?$$
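One identity that seems relevant, though I am not sure whether it is the intended route: since $f(x) = g(x) = 0$, for $t \neq x$ (with $t$ close enough to $x$ that $g(t) \neq 0$) we can write

```latex
\[
\frac{f(t)}{g(t)}
  = \frac{f(t) - f(x)}{g(t) - g(x)}
  = \frac{\dfrac{f(t) - f(x)}{t - x}}{\dfrac{g(t) - g(x)}{t - x}},
\]
```

where each difference quotient has a limit as $t \to x$ by the differentiability of $f$ and $g$ at $x$, and the denominator's limit $g'(x)$ is nonzero. Is this rewriting the right way to sidestep L'Hôpital entirely?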