> MicroHs binaries are ~100× smaller and ~5–10× slower for this workload; for many data-wrangling tasks that’s a great swap.
Under which conditions is that a great swap? A 5× increase in processing time is absolutely huge, and even for moderate data volumes it could make a data processing pipeline completely non-viable.
digdugdirk 4 days ago [-]
Local environments, embedded applications, client-side processing via wasm... It's a cool project! We can figure out what to do with it later.
rowanG077 4 days ago [-]
This is what I'm thinking. There are still use cases, I would say, where small binaries really matter. But then you're really choosing the wrong tool for the job with Haskell, and I say this as a Haskell stan. I expect an optimized C binary is still much, much smaller.
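If you want to sanity-check the size gap yourself, here's a rough sketch (the file name is made up, and the mhs invocation is from memory, so check the MicroHs README for the exact flags):

    -- Hello.hs: minimal test program
    main :: IO ()
    main = putStrLn "hello"

    $ ghc -O2 Hello.hs -o hello-ghc && strip hello-ghc   # GHC, stripped for a fair comparison
    $ mhs Hello -ohello-mhs                              # MicroHs; -o flag syntax assumed, not verified
    $ ls -l hello-ghc hello-mhs                          # compare on-disk binary sizes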
kreetx 4 days ago [-]
Well, this is only "a great swap" in cases where the time taken is already so low that you won't notice a 10× slowdown.
But this tradeoff would actually pay off if compile times saw a similar improvement to the binary size.
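That part is easy enough to measure too; a sketch, reusing the same hypothetical file and assumed mhs flags as above:

    $ time ghc -O2 Hello.hs -o hello-ghc   # GHC compile time
    $ time mhs Hello -ohello-mhs           # MicroHs compile time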