In computing, we have boolean values: we expect there to be a true and a false.
It lets us do things like this:
a = true
if a == true
  puts "a was true!"
else
  puts "a was false!"
end
If we run that in IRB, we can see that "a was true!" prints out. This makes sense because we set a to true.
But we also have a shorthand.
a = true
if a
  puts "a was true!"
else
  puts "a was false!"
end
This works the same way, doesn't it? But what if we change things ever so slightly?
a = "Harry Potter"
if a
  puts "a was true!"
else
  puts "a was false!"
end
What do you think prints out here? Well, it will print out "a was true!" Why? Because in Ruby, any object except for nil and false will evaluate to true.
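This rule can surprise people coming from languages where 0 or an empty string is falsey. In Ruby, only nil and false are falsey; a quick check you can run in IRB:

```ruby
# Every object except nil and false is truthy in Ruby,
# including 0, empty strings, and empty arrays.
[0, "", [], "Harry Potter"].each do |value|
  if value
    puts "#{value.inspect} is truthy"
  end
end
```

All four values print, because every one of them is truthy.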
If we do this:
a = "Harry Potter"
if a == true
  puts "a was true!"
else
  puts "a was false!"
end
"a was false!" will print out to the screen because a, the string "Harry Potter", is not the object true.
So we can say that when we do if a == true we are checking for strict equality, but when we do if a we are simply checking whether a is truthy.
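A handy idiom for seeing how Ruby judges a value's truthiness is the double bang (!!), which negates a value twice and so collapses any object down to true or false:

```ruby
a = "Harry Potter"

puts !!a        # double negation gives the value's truthiness: true
puts a == true  # strict comparison against the object true: false
```

The two checks disagree here, which is exactly the difference between "truthy" and "equal to true".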
This also works in the reverse.
a = false
if a
  puts "a was true!"
else
  puts "a was false!"
end
This we can predict fairly easily.
But what about this?
a = nil
if a
  puts "a was true!"
else
  puts "a was false!"
end
Here, nil will evaluate to false in the condition, but it doesn't equal false. nil is falsey.
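We can see all three facts at once in IRB: nil is not equal to false, it is nil, and it is falsey:

```ruby
a = nil

puts a == false  # nil is not the same object as false: prints false
puts a.nil?      # but it is nil: prints true
puts !!a         # and it is falsey: prints false
```

This is why checking if a is usually the right move: it treats both nil and false as "no", without caring which one you have.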