.to_sor_and_data
I have a configurator program and a server program.
The configurator exports an object structure to a file using .to_json,
and the server reads the file back using .from_json.
Works fine, easy to use, but:
- require "json" in the server increases the footprint
- The intermediate .json file might be a security issue
- Relatively slow parsing of JSON
So what about this crazy idea:
- The configurator will, instead of using .to_json, export source and data to a file, say config23.cr, using some smart .to_sor_and_data
- The server will then just require "config23.cr", and perhaps hand-written additional methods via require "config23servermethods.cr"
- The server will then just do Config23.new instead of Config23.from_json(File.read("config23.json"))
To sum up:

Configurator

# require "json"
# is replaced with
require "tosoranddata"

# File.write("config23.json", something.to_json)
# is replaced with
File.write("config23.cr", something.to_sor_and_data)

Server

# require "json"
# is replaced with
require "config23.cr"
require "config23servermethods.cr"

# top = Config23.from_json(File.read("config23.json"))
# is replaced with
top = Config23.new
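For illustration, a generated config23.cr could look roughly like this (a minimal sketch; the fields and values are invented, and a real export would carry the whole object structure):

# config23.cr: hypothetical output of .to_sor_and_data.
# The exported data lives in the initializer defaults, so a bare
# Config23.new rebuilds exactly the state the configurator produced.
class Config23
  getter kind : String
  getter limits : Array(Int32)

  def initialize(@kind = "optimized", @limits = [1, 2, 3])
  end
end

The server only needs this file at compile time (require "./config23"); at runtime Config23.new returns the fully populated object without any parsing.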
Well. Stupid or not?
Are there ready-to-use solutions?
Extend the pretty-printer, use some inspector, or write some new macro?
It seems to me that this would work, so long as your configurator runs before your server is compiled. However, I don’t see this being very helpful given the work required to set it up. You do potentially decrease the executable size of your server by doing this, but your generated .cr file is just as much of a security issue as a JSON file would be, and you’d only be parsing your JSON once. Unless the JSON configuration is gigabytes in size, parsing it would take at most a few seconds, which seems acceptable as a one-time startup cost. So unless server executable size is a huge concern for you, I don’t recommend generating a .cr file instead of just having JSON configuration.
Thanks for your comments.
About security: the .cr is generated at the build site (a secure site) where the server is also built.
The server is then distributed without any source, neither .cr nor .json.
The server is rebuilt very seldom.
Any reason to not just do a pure code-based method of configuration? I.e. instead of #to_sor_and_data, just create a config23.cr file and do like:

class Config
  # ...
end

Then your server would be able to require it already w/o writing anything to a file, or having to parse anything from a file.
Just for the record:
The things exported via .to_json are actually 17 classes in a hierarchy structure.
Perhaps one can extend the Object class with a to_sor_and_data, and then invoke the top object with that method, which will produce a covering module (config23.cr) with up to 17 classes in it.
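A very rough sketch of what that extension could look like (all names here are invented; it assumes every class in the hierarchy exposes an initializer taking its instance variables in declaration order):

# Sketch only. Leaf values fall back to #inspect, which already prints
# valid Crystal literals for String, Int32, Bool, Nil, ...
class Object
  def to_sor_and_data : String
    String.build { |io| to_sor_and_data(io) }
  end

  def to_sor_and_data(io : IO) : Nil
    io << inspect
  end
end

class Array(T)
  def to_sor_and_data(io : IO) : Nil
    io << '['
    each_with_index do |elem, i|
      io << ", " if i > 0
      elem.to_sor_and_data(io)
    end
    io << ']'
  end
end

# One hypothetical class from the hierarchy. Its emitted expression is
# a constructor call the server can compile, e.g. Conf1.new("abc", [1, 2]).
class Conf1
  property name : String?
  property values : Array(Int32)?

  def initialize(@name, @values)
  end

  def to_sor_and_data(io : IO) : Nil
    io << "Conf1.new("
    {% for ivar, i in @type.instance_vars %}
      io << ", " unless {{ i }} == 0
      @{{ ivar.name }}.to_sor_and_data(io)
    {% end %}
    io << ')'
  end
end

The per-class body could presumably be shared via an included module, since @type inside a method body resolves to the concrete including class, which would bring this down to one definition covering all 17 classes.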
I guess what I’m getting at is: why export the classes at all? If you already have the configuration types defined in code, just require them directly and skip serializing their state.
I’m grateful for all comments. That forces me to be more concrete.
As mentioned, the configuration is a hierarchy of 17 classes, instantiated into perhaps 30 objects.
It can be the result of an optimization of a mathematical expression which took minutes to calculate. The result is some basic calculation stored in this configuration. It is secret for some reason, and it will be executed very frequently with some parameters in the server.
This configuration must be instantiated on the server side.
One outline of the classes in the module.cr:
class Some20
  property kind : String?
  property other : String?
  property array_of_some_str : Array(String)?
  property array_of_obj21 : Array(ClassSome21)?

  def initialize(@kind,
                 @other,
                 @array_of_some_str,
                 @array_of_obj21)
  end
end
The initialization, seen from the outside, will look like the following. This is generated depending on the actual instance values. This will be the tricky part, I think:
@array_of_obj21 = [
  [ArcSin, p1], [Pi], [Div],
  [ArcSin, p2], [Pi], [Div],
].map { |params| ClassSome21.new(params) }
Of course, today's solution using .to_json and .from_json works fine.
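For reference, that round-trip presumably looks something like this (a sketch; it assumes the classes include JSON::Serializable, and ClassSome21 is stubbed down to one field):

require "json"

class ClassSome21
  include JSON::Serializable
  property name : String?

  def initialize(@name)
  end
end

class Some20
  include JSON::Serializable

  property kind : String?
  property other : String?
  property array_of_some_str : Array(String)?
  property array_of_obj21 : Array(ClassSome21)?

  def initialize(@kind, @other, @array_of_some_str, @array_of_obj21)
  end
end

# Configurator side: write the intermediate file.
top = Some20.new("demo", nil, ["a", "b"], [ClassSome21.new("x")])
File.write("config23.json", top.to_json)

# Server side: read it back.
top = Some20.from_json(File.read("config23.json"))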
Ahh okay, so it’s not really configuration in the normal sense, but more like pre-calculated data. One thing you could do if you really wanted to avoid the JSON is to try and leverage Crystal::Macros (see the Crystal 1.6.2 API docs).
Like, it would in theory be possible to have some Crystal code that creates a file and writes Crystal syntax code into it, i.e. autogenerate the file and its contents. Basically a manual version of your #to_sor_and_data. From there you could use the read_file macro to embed the contents of that file into the source code of the main program. This would allow your built binary to not depend on the extra JSON file. Though it really comes down to whether it’s worth the effort.
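For illustration, that embedding could look roughly like this on the server side (a sketch only; read_file yields the file contents as a string literal at compile time, .id splices them back in as code, and the paths are assumptions):

# server.cr: the generated config23.cr only needs to exist on the
# build machine; its contents are compiled straight into the binary.
{{ read_file("#{__DIR__}/config23.cr").id }}

top = Config23.new

A plain require "./config23" at build time gives much the same result, since either way the generated file only has to be present when the server is compiled.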
There is another macro function that allows calling another Crystal script that produces the data to embed; however, if it takes minutes, it's probably better to have it be its own script that just runs before building the main binary in a pipeline.
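The macro function in question is presumably run, which compiles and executes another Crystal script at build time and pastes whatever it prints to stdout at the call site. A minimal sketch (the script name is made up):

# gen_config.cr would print Crystal source (or data) to stdout;
# that output is embedded here during compilation.
{{ run("./gen_config") }}

top = Config23.new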
A middleground solution could be to use read_file to embed the JSON string into the binary instead of the raw code. That gives the benefit of not needing the extra file, without dealing with stringifying source code. Of course it wouldn't allow restarting the server with a new data file, since the whole thing would need to be re-built. So I'd say it ultimately depends on your requirements and whether any of it is really worth changing.
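A sketch of that middleground (same hedges as above; config23_classes is a made-up name for wherever the serializable classes live):

require "json"
require "./config23_classes"

# read_file runs at compile time, so config23.json only needs to exist on
# the build machine; the deployed binary carries the JSON text embedded.
CONFIG_JSON = {{ read_file("#{__DIR__}/config23.json") }}

top = Config23.from_json(CONFIG_JSON)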