Reduced, probably could go further, but wanted to keep things somewhat close to how they are in the shards:
module Reply
  private class AutoCompletion
    def initialize(&@auto_complete : String, String -> {String, Array(String)})
    end
  end

  class Reader
    @auto_completion : AutoCompletion

    def initialize
      @auto_completion = AutoCompletion.new(&->auto_complete(String, String))
    end

    def auto_complete(current_word : String, expression_before : String)
      return "", [] of String
    end
  end
end

module Docopt
  # Inheriting from Array is what triggers the compiler bug
  class Tokens < Array(String)
  end
end

Reply::Reader.new
The gist of it is docopt is doing a bad thing by inheriting from Array, which is breaking the compiler. I'm 90% sure there is an issue about this somewhere, but I'd have to try and dig it up.
The solution here is probably for docopt to not do that anymore.
Possibly, but idk if it really makes sense to do that to every stdlib type where you're not expected to do that. It's worth discussing at least. Maybe the real solution is to fix the bug, but not explicitly prevent this like String does. IIRC String was a more special type compared to Array and some of the other stdlib types.
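If I remember the String behaviour correctly, the compiler rejects subclassing it outright, e.g. (MyString is just a hypothetical name for illustration):

# This does not compile: String explicitly can't be inherited from.
class MyString < String
end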
Weird, do the official docs say we're not supposed to do this? Inheriting from an Array seems like a very common operation. How do you achieve the same goal without using inheritance?
I don't think that's a common thing. Why would you want to inherit from Array? It fully encapsulates a concept and is basically feature-complete, so I don't see any reason for inheritance.
If you want to build some custom behaviour on top of Array, you can use composition, i.e. a type with an instance variable of type Array, as sketched below. If you want it to behave roughly like Array, you may include Indexable and Indexable::Mutable.
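As a minimal sketch of that, assuming a Tokens-like wrapper (the class name mirrors docopt's, but the constructor and the << method here are illustrative, not docopt's actual code):

# Composition instead of inheritance: wrap the Array and opt into
# the stdlib's container modules.
class Tokens
  include Indexable::Mutable(String)

  def initialize(@tokens : Array(String) = [] of String)
  end

  # Indexable requires `size` and `unsafe_fetch`
  def size
    @tokens.size
  end

  def unsafe_fetch(index : Int)
    @tokens.unsafe_fetch(index)
  end

  # Indexable::Mutable additionally requires `unsafe_put`
  def unsafe_put(index : Int, value : String)
    @tokens.unsafe_put(index, value)
  end

  # Anything beyond the module APIs gets delegated explicitly
  def <<(value : String)
    @tokens << value
    self
  end
end

tokens = Tokens.new
tokens << "--help"
tokens[0] # => "--help"

This gives you [], []=, each, map!, reverse!, and friends for free via the included modules, while the compiler never sees a subclass of Array. Crystal's delegate macro (e.g. delegate pop, to: @tokens) can trim the remaining boilerplate.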
It's not really a Crystal-specific rule that says why you shouldn't; it's more of a general design principle. It's usually not good practice to inherit from types you do not own, which applies to more than just Array, and beyond the stdlib itself.
I am not saying that in a bad way, but that's why I try to avoid using any shards: I don't want to base my work on code that will no longer be maintained.
My 2 cents on this: it's a balancing act. I'm also in the camp of trying to avoid third-party dependencies if possible, but I also don't think it's a great idea to be absolute in never using a third-party shard. Writing something yourself instead of using a preexisting shard introduces its own problems, i.e. you're now responsible for maintaining this extra code, making sure it's secure, correct, etc., versus deferring that to the shard owner.
On the flip side, I think it's important to be cognizant of what shards you depend on. This includes knowing who the owner/maintainer is, the status of the shard, following the changelog to be aware of upcoming changes, having a plan for what happens if the shard disappears or goes EoL, etc.
This is getting a bit off-topic tho, so might be best to create a new thread if you want to discuss this further.
I use shards if I am willing to fix bugs in them and/or adopt them. It's part of using a "small" platform. If nobody uses shards there are no shards, and the ocean's too big to boil by myself.
For example, in a project I am using a VERY nice docker API shard called docr.
I found a couple of minor bugs and a larger one. I just fixed them, and they got merged a couple of months later.
If I had not used it everyone would either not be able to use the docker API or have a buggier shard.
If the author had not written it, that project (which I use and wrote for myself) would not exist.
So sure, a balancing act, but I try to err on the generous side.
I used this approach to create a simple Matrix class for linear algebra, which is a 2D array in my case. Due to the issue, instead of Array inheritance I used a plain class Matrix(T) with an internal @buffer object, which is exactly Array(Array(T)).
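Roughly the shape described above, as a sketch under my reading of that description (the constructor and accessors are assumptions, not the actual code):

class Matrix(T)
  # Row-major storage: an Array of row Arrays, exactly Array(Array(T))
  @buffer : Array(Array(T))

  def initialize(rows : Int32, cols : Int32, initial : T)
    @buffer = Array.new(rows) { Array.new(cols, initial) }
  end

  def [](row : Int32, col : Int32) : T
    @buffer[row][col]
  end

  def []=(row : Int32, col : Int32, value : T)
    @buffer[row][col] = value
  end
end

m = Matrix(Float64).new(2, 3, 0.0)
m[0, 2] = 4.2
m[0, 2] # => 4.2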