The algebraic proof that 0.999… = 1 must first justify why you can assign 0.999… to x at all.
My “proof” abuses algebraic notation in exactly that way: you cannot simply assign an infinitely long expression to a variable as if it were an ordinary number, and once that step is taken, the regular algebraic rules become meaningless.
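For reference, and assuming the standard version of that argument (the thread does not spell it out), the algebraic manipulation in question runs:

\[
\begin{aligned}
x &= 0.999\ldots \\
10x &= 9.999\ldots \\
10x - x &= 9.999\ldots - 0.999\ldots = 9 \\
x &= 1
\end{aligned}
\]

Every step here silently treats 0.999… as a real number on which multiplication and subtraction act in the usual way, which is precisely the gap being objected to.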
A proper proof would use the definition of an infinite decimal as a limit: the limit of a sequence approaching a value is exactly that value. For any epsilon > 0, 0.999… lies within the epsilon-neighborhood of 1 (the interval 1 ± epsilon); the only real number within every such neighborhood is 1 itself, so 0.999… = 1.
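As a sketch, assuming 0.999… is defined as the limit of the partial sums 0.9, 0.99, 0.999, …:

\[
\begin{aligned}
s_n &= \sum_{k=1}^{n} \frac{9}{10^k} = 1 - 10^{-n}, \\
|1 - s_n| &= 10^{-n} < \varepsilon \quad \text{whenever } n > \log_{10}\frac{1}{\varepsilon},
\end{aligned}
\]

so every epsilon-neighborhood of 1 eventually contains the s_n, and the limit, hence 0.999…, is exactly 1.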
Yes, but similar flaws exist in your proof.