0/0

What does the value of 0/0 itself equal? "?" or "Any"? You may find 0/0 listed as an error or undefined in programming languages or wikis, but that's actually very misleading, because 0/0 is really everything we use: the numbers and symbols we put into equations and problems are values we defined ourselves. Let's take for example: x^2 + x - 2 = 0
This could be read as (0/0)^(0/0) + (0/0) - (0/0) = (0/0), or well, = 0: almost always when we use it this way to find x, each symbol works only because we defined it, replacing 0/0 with a value of our own choosing.
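The idea above can be sketched in code: treat x as a placeholder with no value of its own, and only when we substitute a concrete value can we check whether the equation holds. This is a minimal illustration, not a claim about how solvers actually work.

```python
# x starts out as a placeholder ("0/0"); f gives it meaning only
# once a concrete value is substituted in.
def f(x):
    return x**2 + x - 2

# Try candidate values for the placeholder; the ones that make the
# left-hand side equal 0 are the solutions of x^2 + x - 2 = 0.
roots = [x for x in range(-10, 11) if f(x) == 0]
print(roots)  # -> [-2, 1]
```

Until a value is plugged in, asking "what does x equal?" has the same non-answer as asking what 0/0 equals.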
Or let's take for example a coffee machine. It has two buttons: A (let's say id = 1) to make a short coffee and B (let's say id = 552) to make a long one.
When a button is pressed, the machine takes the id of the pressed button and triggers its event. But before we assigned those ids, the value of each button was 0/0, or in other words ?/ANY/UNDEFINED.
Once we add the button id, the machine can finally recognize the chosen number and proceed with the event; if the number isn't defined, it will of course throw an error, or if t
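The coffee-machine example can be sketched like this. The ids (1 and 552) come from the text above; the event names and the `press` function are hypothetical, just to show the defined-vs-undefined distinction.

```python
# Hypothetical button-to-event table; before an id is registered
# here, its meaning to the machine is the "0/0" / undefined state.
EVENTS = {
    1: "brew short coffee",    # button A
    552: "brew long coffee",   # button B
}

def press(button_id):
    # A defined id triggers its event; an undefined one is an error,
    # just like the machine in the example above.
    if button_id not in EVENTS:
        raise ValueError(f"undefined button id: {button_id}")
    return EVENTS[button_id]

print(press(1))    # -> brew short coffee
print(press(552))  # -> brew long coffee
```

Pressing with an unregistered id (say, 7) raises the error, which is exactly the "value was never defined" case the example describes.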