Plans for v2 #10
davdroman
announced in Announcements

---
Another update on this. I believe expression macros might be a better fit for shipping a solution to decimal literals.

Instead of the proposed:

```swift
let a: StrictDecimal = 2.56 // error: 'init(floatLiteral:)' is unavailable: Initialization via floating-point literals loses precision past the 6th decimal place, so it is directly unavailable and has been made explicitly unsafe. Use string literals instead or, if you must, use StrictDecimal.unsafe(_:).
let b: StrictDecimal = .unsafe(2.56) // ok, but you're made aware of the risks at the point of use (and with extensive documentation)
let c: StrictDecimal = "2.56" // ok
```

it'd be something like this:

```swift
let a: StrictDecimal = 2.56 // error: 'init(floatLiteral:)' is unavailable: Initialization via floating-point literals loses precision past the 6th decimal place, so it is directly unavailable and has been made explicitly unsafe. Use #decimalLiteral instead.
let b: StrictDecimal = #decimalLiteral(2.56) // ok
```

The timeline on Swift macros is quite unclear, so it might be worth releasing a v2 with the […]
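For context, here's a minimal sketch of how such a freestanding expression macro could be declared and used, assuming Swift 5.9 macro syntax. The module and type names (`StrictDecimalMacros`, `DecimalLiteralMacro`) are placeholders rather than anything that exists in the library today; the key idea is that the macro implementation would read the literal's source text and expand it into the exact string-based initializer, so no `Double` round trip ever happens.

```swift
// Hypothetical declaration (Swift 5.9 macro syntax); every name here is a placeholder.
@freestanding(expression)
public macro decimalLiteral(_ value: Double) -> StrictDecimal =
    #externalMacro(module: "StrictDecimalMacros", type: "DecimalLiteralMacro")

// Usage: the macro receives the literal as source text, so it could expand to
// something along the lines of `StrictDecimal("2.56")`, never touching Double.
let price: StrictDecimal = #decimalLiteral(2.56)
```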

---
I'm writing down a few thoughts on what v2 will probably look like, both so that I have a trail of reasoning when implementation time comes, and so that anyone interested can let me know what they think.
The initial premise of this library was "A Decimal type that plays nicely with literals and Decodable". Unfortunately, this is not entirely the case.
The word "precise" implies absolute precision, which the library does not actually offer to the full extent. These limitations are out of anyone's control but Apple's. There's only so much that can be done to work around it, and as far as I've researched, this library already does the very best that can be done.
Recently, however, I've been pondering whether there's a different angle from which this problem can be approached.
Currently, the library tries to mirror `Decimal`'s own API by allowing initialization via double literals and parsing doubles as `Decimal` with as much precision as possible.

In an ideal world, we'd have true `Decimal` literals, as opposed to `Double` literals that bridge onto `Decimal`, losing precision along the way. So whilst that solution isn't realized, using the literal initializer is actually harmful and counterintuitive.

This is why I think the library should take a stance against `Decimal`'s API surface, and encourage absolutely safe ways to initialize and parse decimals instead.

It then stands to reason that v2 should break the API completely and discourage harmful usage. Think of v2 as a drop-in replacement for `Decimal` that will scream at you whenever you're "holding it wrong". In other words, not a "precise" decimal, but a "strict" decimal (sketched below).

Conversely, parsing doubles as decimals will still be supported, albeit with a Runtime Warning, in order to encourage developers to move to safer ways of parsing decimals (strings) until Apple manages to fix parsing too.
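As an illustrative sketch of what that stricter surface could look like (the `StrictDecimal` name and the `unsafe(_:)` spelling are taken from the examples discussed elsewhere in this thread and may well change before release):

```swift
// A float literal would be rejected at compile time, with an explanation attached:
let a: StrictDecimal = 2.56
// error: 'init(floatLiteral:)' is unavailable: Initialization via floating-point
// literals loses precision past the 6th decimal place. Use string literals instead
// or, if you must, use StrictDecimal.unsafe(_:).

// A string literal would be parsed exactly.
let b: StrictDecimal = "2.56"

// An explicit escape hatch would remain, with the risk visible at the call site.
let c: StrictDecimal = .unsafe(2.56)

// Parsing or decoding a Double would still work, but would surface a Runtime
// Warning instead of failing, nudging callers towards string-based parsing.
```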
One would then think of this library not as a complete solution, but as a watchful guide that alerts, educates, and encourages the use of the more correct APIs until the root problem is fixed.[^1]
And so, for v2, the library will not only be receiving a new API surface, but also an entirely new name and tagline to reflect its newfound, more honest purpose:
[^1]: Ironically enough, all of this is subject to another Swift compiler bug being fixed.