
Overload resolution behavior of == permits type inference to "ignore" default float literal type if Foundation is imported #78365

Open
xwu opened this issue Dec 26, 2024 · 1 comment
Labels
bug: A deviation from expected or documented behavior. Also: expected but undesirable behavior.
triage needed: This issue needs more specific labels

Comments

@xwu
Collaborator

xwu commented Dec 26, 2024

Description

As we all know very well, in binary floating-point math, 0.1 + 0.2 != 0.3.

And as users are taught and rightly expect, in the absence of other type context, a float literal defaults to FloatLiteralType (aka Double, unless shadowed).
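
For reference, a quick check of that default (a minimal sketch):

let x = 0.1
print(type(of: x))           // Prints "Double"
print(FloatLiteralType.self) // Prints "Double"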

Therefore, the following code shows the expected behavior:

func sum<T: Numeric>(_ numbers: [T]) -> T {
    return numbers.reduce(0, +)
}
print(sum([0.1, 0.2]) == 0.3)
// Prints "false"

However:

Foundation defines a Decimal type that conforms to ExpressibleByFloatLiteral, in which (rightly) 0.1 + 0.2 == 0.3 as Decimal.
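
A minimal sketch with explicit Decimal context:

import Foundation

let a: Decimal = 0.1  // goes through Decimal's init(floatLiteral:)
let b: Decimal = 0.2
print(a + b == 0.3)   // Prints "true", the Decimal behavior described above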

On Apple platforms, Foundation also defines a RunLoop.SchedulerTimeType.Stride type which shockingly also conforms to ExpressibleByFloatLiteral.
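
A minimal sketch of that conformance (Apple platforms only):

import Foundation

// RunLoop.SchedulerTimeType.Stride is ExpressibleByFloatLiteral (and
// SignedNumeric), so it is yet another candidate when a float literal
// has to satisfy a Numeric constraint.
let interval: RunLoop.SchedulerTimeType.Stride = 0.5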

For...reasons, importing Foundation (without ever touching Decimal) breaks user expectations:

import Foundation

func sum<T: Numeric>(_ numbers: [T]) -> T {
    return numbers.reduce(0, +)
}
print(sum([0.1, 0.2]) == 0.3)
// Prints "true" on Linux, does not compile on macOS due to ambiguous overloads

Here (unless I'm mistaken), Foundation isn't getting any special treatment that a third-party library wouldn't get. And a third-party library shouldn't be able to break user code that uses float literals without further type context, or, worse yet, silently change how that code executes.
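
For what it's worth, supplying explicit type context sidesteps the problem (a minimal sketch):

import Foundation

func sum<T: Numeric>(_ numbers: [T]) -> T {
    return numbers.reduce(0, +)
}
// Pinning the literals to Double restores the expected result even with
// Foundation imported.
print(sum([0.1, 0.2] as [Double]) == 0.3)
// Prints "false"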

Reproduction

import Foundation

func sum<T: Numeric>(_ numbers: [T]) -> T {
    return numbers.reduce(0, +)
}
print(sum([0.1, 0.2]) == 0.3)
// Prints "true" on Linux, does not compile on macOS due to ambiguous overloads

Expected behavior

Regardless of what libraries are imported, as long as the user isn't shadowing FloatLiteralType, the expression sum([0.1, 0.2]) == 0.3 should always compile and evaluate to false.

Environment

All versions from at least Swift 4.2 onwards (checked on godbolt.org)

Additional information

Based on the question raised on the Swift Forums in:

https://forums.swift.org/t/why-isnt-the-typesolver-unable-to-resolve-this-without-the-implicit-assignment-first/76852

@xwu xwu added the bug and triage needed labels Dec 26, 2024
@xwu xwu changed the title on Dec 26, 2024 from:
Overload resolution behavior of == permits type inference to "ignore" default float literal type
to:
Overload resolution behavior of == permits type inference to "ignore" default float literal type if Foundation is imported
@xwu
Collaborator Author

xwu commented Dec 26, 2024

cc @xedin
