@jzakiya Great reply!
As an answer, I think the main reason we make these changes is that we want Crystal to be as usable as possible without having to introduce breaking changes after 1.0.
The reason for `//` is more of a purity thing: `/` for integers does floor division, while for floats it does float division. The semantics of the operator change for different types. With the current approach (after Crystal 0.31.0) this is now consistent: `/` always means float division, and `//` always means floor division.
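To make the distinction concrete, here's a quick sketch of both operators as I understand the 0.31.0 semantics:

```crystal
10 / 4    # => 2.5  — `/` always does float division, even for integers
10 // 4   # => 2    — `//` always does floor division
10.5 / 2  # => 5.25
10.5 // 2 # => 5.0  — floor division also works for floats
-7 // 2   # => -4   — floor division rounds toward negative infinity
```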
This has been discussed a lot in the past (in the GitHub issue, in other communities, and even in Python’s PEP) but as a summary:
- `10 / 4` giving `2` is not intuitive. If you do the math, if you put that in a calculator, if you ask Google, everyone will tell you the answer is 2.5. So expecting `2` is familiar if you come from C, Java, Ruby, Go, etc., but it’s not simple (I recommend Rich Hickey’s Simple Made Easy talk).
- When programming in Ruby, many times I had to remember to add `.to_f` to the first or second argument of a math operation just to make sure I do the math right. If you are familiar with C, Ruby, Java, etc., this is something you eventually get used to, but it’s so much simpler to just write `x / y` and know you get a correct result. Ruby also has `fdiv`, because converting to a float doesn’t work nicely with complex numbers.
You can also try the exercise I commented here: write a function that computes the average of two numbers. The signature is this:

```crystal
def average(x, y)
  # fill in the body
end
```

The function should work with every number type: integers, floats and also complex numbers. Here are some examples:

```crystal
average(10.5, 2.3)                            # => 6.4
average(10, 1)                                # => 5.5
average(Complex.new(1, 2), Complex.new(4, 5)) # => (2.5 + 3.5i)
```
Do it with a previous Crystal version, for example 0.29.0, where `/` does floor division for integers. Then do it with 0.31.0 and see which one is simpler and easier to read and write. You can use https://carc.in/ to try older Crystal versions.
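For reference, here's one possible solution under the 0.31.0 semantics — just a sketch, not necessarily the intended answer:

```crystal
require "complex"

# With `/` always meaning float division, a single body works for
# integers, floats and complex numbers alike — no `.to_f`, no `fdiv`:
def average(x, y)
  (x + y) / 2
end

average(10.5, 2.3)                            # => 6.4
average(10, 1)                                # => 5.5
average(Complex.new(1, 2), Complex.new(4, 5)) # => (2.5 + 3.5i)
```

In 0.29.0 the same body would return `5` for `average(10, 1)` because `/` floored, so you'd need type-specific workarounds.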
It’s also interesting to know that in Python, Haskell, Elm and Nim, `/` is float division, like in Crystal. And Python is now geared towards math and science, so it might make sense for `/` to always do float division (the PEP explains this too).
This is also a case of something familiar vs. something simple. `Time.now` is familiar: it’s in practically every programming language. However, I remember many times having to write `Time.now.utc` (or the other way around) in Ruby after finding bugs. This is because when writing `Time.now` it’s not clear where that “now” is: is it here where the code is running, is it UTC, or what? With `Time.utc` there’s no problem anymore: it’s clear from the method name.
I think this change is mainly annoying because a lot of code that uses `Time.now` already exists. But thinking of the future, `Time.local` might be more intuitive to use.
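To make the contrast concrete, with the current API the location is always explicit in the call (a small sketch):

```crystal
utc   = Time.utc   # the current time in UTC — unambiguous
local = Time.local # the current time in the system's local time zone

puts utc
puts local
```

There's no `Time.now` to wonder about: the method name itself says which "now" you get.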
I suggest we put back a `Time.now` that only gives a compile-time error suggesting to use `Time.utc`, instead of just saying “undefined method”.
Why are these changes annoying?
At some point we introduced deprecation warnings. The idea was that you would get warnings when using deprecated methods, or even `/` for integers, so that your code would still compile and you could migrate it gradually; then, when we eventually removed or changed things, your code would keep compiling.
I suggested making warnings opt-out: you would get warnings unless you passed a flag. My reasoning was that people would never find or use warnings if they were opt-in, because it’s something extra you have to do, and we humans are lazy by nature (we do what’s simplest for us). But it was finally decided to make them opt-in to avoid annoying users with walls of warnings.
What happened? Apparently users never knew about these warnings, and eventually, when we broke things, people got really annoyed (as can be seen here). So I think part of the annoyance came from not making warnings opt-out. Luckily, starting with Crystal 0.31.0 warnings are opt-out.
In Crystal, compared to Ruby or other languages, we are not afraid of making “breaking changes” if we think carrying the baggage over is worse than trying to do something new and better. We think of Crystal as a new language, not a language that you use by copying code from another language and expecting it to work with minor changes.
For example, we have `Object#to_s(io)` instead of the classical `to_s` to avoid creating intermediate objects when converting things to strings.
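As a sketch of what that looks like (the `Point` type here is just an illustration):

```crystal
struct Point
  def initialize(@x : Int32, @y : Int32)
  end

  # Appends each piece directly to the given IO instead of building
  # intermediate String objects and concatenating them:
  def to_s(io : IO) : Nil
    io << '(' << @x << ", " << @y << ')'
  end
end

puts Point.new(1, 2) # prints (1, 2) — `puts` passes its IO to `to_s(io)`
```

The classical argument-less `to_s` still works: the default implementation builds a `String` by calling your `to_s(io)`, so you only write the one streaming version.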
Another example is detecting `nil` at compile time: we could simply make every method call on `nil` just return `nil` (like in Objective-C) so that programming becomes easier (no more fighting with the compiler!), but programs would become less reliable and more bug-prone.