Seriously, I once had to prove that multiplying a value by a number between 0 and 1 decreases it below its original value, i.e. effectively proving something that should just be an axiom.
Mathematicians like to have as few axioms as possible, because every axiom is essentially an assumption that could be wrong.
Also, proving elementary results like your example with as few tools as possible is a great exercise for learning mathematical deduction and for understanding the relations between elementary mathematical properties.
My math teacher would be angry because you started from the conclusion and derived the premise, rather than the other way around. Note also that you assumed that division is defined. That may not have been the case in the original problem.
Your math teacher is weird. But you can just turn it around:
c < 1
c < x/x   | ·x
xc < x    q.e.d.
This also shows that c ≥ 0 is not actually a requirement, but x > 0 is.
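If you really want the pedantic version, the same two-step argument can be machine-checked. Here is a minimal sketch in Lean 4, assuming Mathlib is available (the lemma names mul_lt_mul_of_pos_right and one_mul come from Mathlib):

import Mathlib

-- Claim: for real x and c, if 0 < x and c < 1 then c * x < x.
-- As noted above, c ≥ 0 is not needed; only x > 0 is.
example (x c : ℝ) (hx : 0 < x) (hc : c < 1) : c * x < x := by
  calc c * x < 1 * x := mul_lt_mul_of_pos_right hc hx
    _ = x := one_mul x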
I guess if your math teacher is completely insufferable, you need to add the definitions of the arithmetic operations, but at that point you might as well also introduce Latin letters and Arabic numerals.
This isn't a rigorous mathematical proof that would show it holds in every case. You aren't wrong, but this is a colloquial sense of proof, not a mathematical proof.