Why is 0 (zero) truthy?

After re-reading the tutorials, reading about truthiness in the control flow page, it is strange to me that 0 (zero) is truthy instead of falsey.

Out of curiosity, why is this so? Isn’t 0 falsey in most languages, compiled or not? Sorry if you already discussed this. :blush:

This isn’t a problem for me, but couldn’t this be a problem for other developers who are coming from other languages?

Play link

I believe it’s only false in C and C++? What other languages have this behavior?

The true answer is: because it works like that in Ruby, and I think it makes a lot of sense.
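For reference, Ruby’s actual rule is narrow: only false and nil are falsy; everything else, including 0, empty strings, and empty collections, is truthy. A quick runnable Ruby check (Crystal follows the same rule for these values):

```ruby
# In Ruby, truthiness is decided by identity, not "emptiness":
# only false and nil are falsy.
values = [0, 0.0, "", [], {}, false, nil]

truthy = values.select { |v| v ? true : false }
puts truthy.inspect  # => [0, 0.0, "", [], {}]
```

Note that 0, 0.0, "", [] and {} all survive the filter; only false and nil are dropped.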


Python definitely has false zeros and a wider concept of falsiness (empty strings too, IIRC). I’d be surprised if JavaScript didn’t automatically coerce it to false as well, considering it does all other coercions in the most surprising way possible.

Which has led to some really weird behavior, like back in Python 2.x where one of the time classes ended up false at UTC midnight…


Thanks for the answer! Since Crystal is heavily inspired by Ruby, it makes sense.

PHP and JavaScript


Doing a bit of research, this is just a language design decision, and one can base this decision on electronics, math, Boolean algebra, or anything else.

There is simply no right or wrong in this particular case.


If I remember right, Ruby treats 0 as a number, and because it’s neither false nor nil, it’s truthy. It did trip me up as well when I was doing both Ruby and JavaScript years ago.


Maybe a good final explanation could be that the integers 0 and 1 have nothing to do with the bits 0 and 1, as explained in the boolean algebra article.

Even if treating 0 as false could feel intuitive, it may be better for the programmer to understand that distinction.


In traditional C there is no bool type. Boolean values are simply expressed as integers 0 and 1. FALSE and TRUE are just macros.
That’s where the association of 0 == false for modern programmers comes from. Of course this is probably rooted in the fact that 0 and 1 are also the respective representations in binary logic.
But I think for software development a major factor is just based on an implementation detail of the C language.

Many more modern languages have a dedicated bool type: Crystal, Ruby, Java, and many more… even JavaScript. Even C99 introduced a _Bool type. The reason for that should be obvious: avoiding confusion between logic values and integers. When you do that, it’s clear that the integer 0 must not be falsey or you’d add to the confusion again (thanks, JavaScript! :man_facepalming: ).

And in semantic terms it makes very little sense that a number should be falsey. A number is a “thing”. It’s not nothing, even if it has a zero magnitude. And it certainly has no inherent logical proposition.


I think it’s a good decision, even if it may appear a bit unintuitive on the surface. Letting various things evaluate as false may occasionally allow for some shortcuts in your code, but something like if(arr) (JavaScript) to check whether an array is empty is just very unclear, bad code.
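To illustrate that point from the Ruby side (Crystal’s arrays behave the same way): an empty array is truthy, so testing the array itself tells you nothing about whether it is empty — you have to ask explicitly:

```ruby
arr = []

# An empty array is truthy, so this always takes the first branch:
state = arr ? "looks truthy" : "looks falsy"
puts state       # => looks truthy

# The explicit question is the clear one:
puts arr.empty?  # => true
```

The explicit arr.empty? reads unambiguously, which is exactly the kind of clarity the truthiness rule encourages.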


Is it possible to pin these types of thread questions for the benefit of new users in the forum? Coming from Ruby you know what truthy/falsy are, but it may seem weird coming from some other environments.

Maybe have an FAQ forum category of threads to point users to when these types of basic questions come up.


I can’t find it right now, but I seem to remember there was a list of differences from Ruby somewhere (including their Crystal equivalents). IMO it might be best to add, at that same spot, some kind of “things you should be aware of if you aren’t a Rubyist” page, listing all the stuff Crystal adopted from Ruby that is less common (and often unexpected) in other languages.

0 being falsy in JavaScript (and other languages with similar coercion in boolean contexts) was the reason they needed another operator to handle null:

?? helps differentiate null from falsy. That also means it’s useful in scenarios where we could have Bool | Nil.
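The contrast is easy to demonstrate from the Ruby/Crystal side: since 0 is truthy there, || already distinguishes a configured zero from a missing value, which is the gap ?? fills in JavaScript. A small Ruby sketch:

```ruby
# In JavaScript, 0 || 10 yields 10 because 0 is falsy, which is why
# the nullish coalescing operator (??) was added.
# In Ruby (and Crystal), || only falls through on nil and false:
configured = 0
missing    = nil

puts(configured || 10)  # => 0   (the explicit zero survives)
puts(missing    || 10)  # => 10  (only nil falls back to the default)
```

So where JavaScript needed a second operator, one operator suffices when 0 is truthy.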





Most casting systems in most languages cast any “null, empty, zero” value to false when the context allows such a conversion; everything else is true.


Thanks! I don’t think it’s that way in Rust because you probably need to do an explicit cast.


I think you’re confusing different concepts. This topic is about whether the integer values 0 and 1 are considered as the boolean values false and true in conditional expressions.
In practical terms: Does 0 ? "t" : "f" (or an equivalent statement in the respective language) evaluate to "f"?

Rust: bool is a dedicated type (bool - Rust) and there’s no implicit casting between bool and integer types. The cited docs state that you can explicitly cast a bool to an integer (via mybool as i32) to get those values. That’s equivalent to Crystal’s Bool#to_unsafe.
In fact, Rust is even stricter regarding the type of conditional expressions than Crystal. They must be of type bool: if 0 { "t" } else { "f" } is a type error.

Python: Yes, Python treats zero values as falsey, and other values too, such as empty collections. The definition of falsiness is much broader (or of truthiness narrower, rather). That has already been noted in Why 0 (zero) is truthy? - #4 by yxhuvud

PostgreSQL: The cited documentation states that 0 can serve as a literal for the bool type (0::bool). But bool is still a dedicated type and there’s no implicit casting between bool and integer types. It only means that when a bool value is requested, 0 and 1 will be interpreted as boolean literals. That’s similar to Crystal’s autocasting for number literals.
And PostgreSQL has strong typing for conditional expressions, similar to Rust. Only bool is allowed. SELECT (CASE WHEN 0 THEN 't' ELSE 'f' END); is a type error.
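Applying the same litmus test to Ruby and Crystal themselves gives the opposite answer: 0 is truthy, so the ternary takes the true branch. A runnable Ruby version:

```ruby
# The litmus test from above, in Ruby (Crystal agrees):
puts(0 ? "t" : "f")      # => t

# Compare with the values that actually are falsy:
puts(nil   ? "t" : "f")  # => f
puts(false ? "t" : "f")  # => f
```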


The Ruby behavior is unique and kind of insane because it makes the language prone to some stupid errors for any newcomer from other languages. It will mess with the minds of people coming from Boolean algebra classes, which probably every other language tries to honor at some level when values such as 0 and 1 are accepted and used in boolean expressions.

If I were involved in the Crystal design, that’s one place where I would change the behavior compared to Ruby. Rather than accepting 0 as true, maybe I would follow the Rust path and not allow any implicit conversion where booleans were expected, only explicit ones.

So a

x = 0
y = true
if x && y
  puts "what?"
end

Instead of “wrongly” printing “what?”, this would produce a compiler error like “x: not a boolean; convert the expression to proper values”. One fix for it would be

if ((x != 0) && y) then ...

and a

0.to_bool, 1.to_bool

would evaluate to false and true in Crystal.
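Neither Ruby nor Crystal actually defines Integer#to_bool — the name and semantics here are just the ones proposed above — but the helper is trivial to sketch in Ruby:

```ruby
# Hypothetical helper, matching the suggestion above:
# treat 0 as false and every other integer as true.
class Integer
  def to_bool
    self != 0
  end
end

puts 0.to_bool  # => false
puts 1.to_bool  # => true
```

With a helper like this, the conversion is always explicit and visible at the call site, which is the Rust-style discipline the post is arguing for.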

I feel like treating 0 as false would be madness.

In this example, it would clearly be wrong to treat 0 as false.

delay : Int32? = nil

# set delay from config file if present (else it is nil)

if delay
  # do stuff using the set delay
else
  delay = 10
  # do stuff using the default delay
end
In general, I don’t feel like “other languages do x” is a particularly strong argument.
I use Crystal because I agree with the majority of the choices that’s gone into it, and I don’t use the other languages because I disagree with the majority of their choices.


IMO it’s not a strong argument for adopting the same behaviour, but it’s a very good reason to mention it in the docs, maybe in a way that reflects how common or uncommon it is in other languages (so in this case: explicitly, probably more than once, and maybe even as part of example code if there is any).