Many math professors and papers will tell you all the good reasons why we cannot multiply or divide by zero, or why there is no reason to multiply or divide by zero. There is a way to approach these equations, and it is by substitution. I will also attempt to explain why there is sometimes a reason to do it.
Does 3 times 3 not equal 3 + 3 + 3, and does 4 times 4 not equal 4 + 4 + 4 + 4? Does 9 divided by 3 not equal 9 - 3 - 3, and does 16 divided by 4 not equal 16 - 4 - 4 - 4? Could we not therefore rewrite the standard "A" is equal to "B" divided by "C" as "A" is equal to "B" minus "C"? If "B" were equal to 1, then 1"B" - 0"C" = 1"A" and 1"A" + 0"C" = 1"B". The same is true for multiplication: if "A" is equal to "B" times "C" and "B" is equal to 1, then 1"A" = 1"B" + 0"C" and 1"A" - 0"C" = 1"B".
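The repeated-subtraction view of division above can be sketched in code. This is only a minimal illustration of that framing (the function name is mine, not a standard one), and it deliberately requires a divisor of at least 1; a divisor of 0 would subtract nothing forever, which is exactly the contested case this essay is about.

```python
def divide_by_repeated_subtraction(dividend, divisor):
    # Division as repeated subtraction: count how many times `divisor`
    # fits into `dividend`. E.g. 9 / 3: 9 - 3 - 3 - 3 = 0, three subtractions.
    # (The essay writes 9 - 3 - 3 = 3; its examples are perfect squares,
    # so the quotient happens to equal the last remaining piece.)
    if divisor < 1:
        # Subtracting 0 repeatedly would never finish -- the disputed case.
        raise ValueError("only defined here for divisor >= 1")
    count = 0
    remaining = dividend
    while remaining >= divisor:
        remaining -= divisor
        count += 1
    return count

print(divide_by_repeated_subtraction(9, 3))   # 3
print(divide_by_repeated_subtraction(16, 4))  # 4
```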
Some will ask what the point of this is, and the point is this question: why does dividing or multiplying by zero make the original value or quantity vanish? If we have 10 units and we divide them among 5 people so each person gets 2 units, why do the 10 units vanish if we divide by zero? If a pie exists and we do not divide it up among a certain number of people, what causes the pie to cease to exist? Are the 10 units not already a group before we decided to perform math operations on them? Must one person always own or possess something for it to exist? How does dividing or multiplying something by "nothing" make "something" disappear? That is some magic there.
Therefore, the 0 + 0 + 0 + 0 + 0 + 0 argument is invalid, because we never make that mistake. We never attempt to divide zero by zero, since we retain the original information. Why throw away the original information or group? It was a group before, and being a group, the minimum value for the "group" would be one, not zero. What logic allows us to start with one pie and end up with 100000000000000 pies, or none at all? Should we not be able to work backwards and still have the one pie we started with? How did we get something from nothing?
Multiplication and division are derived from addition and subtraction. They are nothing more than simplifications of addition and subtraction. They exist to make math easier for us to do and write, so that instead of writing 5 + 5 + 5 + 5 + 5 = 25 we can write 5 x 5 = 25, or say that five squared equals twenty-five. Therefore, math professors are both right and wrong: you should not multiply or divide by zero, but if you did, nothing should really happen. Take any object and place it in your hand. Multiply it by zero, or "nothing". Did it vanish from your hand? Why not? Divide it by zero. Did the object multiply into infinite copies of itself? Why not?
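The claim that multiplication is shorthand for repeated addition can be sketched the same way (again, the function name is mine, purely for illustration):

```python
def multiply_by_repeated_addition(value, times):
    # Multiplication as repeated addition: 5 x 5 = 5 + 5 + 5 + 5 + 5 = 25.
    total = 0
    for _ in range(times):
        total += value
    return total

print(multiply_by_repeated_addition(5, 5))  # 25
```

Note that with `times` equal to 0 the loop adds nothing and the function returns 0, which is the standard answer; this essay's position is that the original quantity should instead be retained.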
Now take that same object and "add" nothing to it. Logic says the object should remain unchanged, which is what happens, since you added nothing. Can we "subtract" nothing from the same object and have it remain unchanged? Yes, we can. Therefore, in this theory, 1 divided by 0 is equal to 1, and 1 multiplied by 0 is equal to 1.