
Emyrssentry t1_j6k7152 wrote

Let's look at dividing by numbers very close to 0:

1/.1=10

1/.01=100

1/.0000001=10,000,000

Etc. The closer your divisor gets to 0, the more your answer tends towards infinity (and towards negative infinity if you approach 0 from the negative side).
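
If it helps to see that pattern run, here's a minimal sketch (in Python, which the original comment doesn't use) that prints 1/x for divisors shrinking towards 0 from both sides:

```python
# Minimal sketch: watch 1/x blow up as the divisor x shrinks towards 0
# from the positive and negative directions.
for exponent in range(1, 8):
    x = 10.0 ** -exponent              # 0.1, 0.01, ..., 0.0000001
    print(f"1/{x:g} = {1 / x:g}")      # 10, 100, ..., 1e+07 (up to float rounding)
    print(f"1/{-x:g} = {1 / -x:g}")    # -10, -100, ..., -1e+07
```

The two directions disagreeing (one heads to +infinity, the other to -infinity) is part of why no single value, 0 included, works as a limit.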

So why would you arbitrarily define 1/0 to be 0? It doesn't give you any extra predictive power, it doesn't really lead anywhere, and you'd still have to exclude it in every situation where that definition doesn't apply. It's just not a very useful definition.
