• Tenthrow@lemmy.world · 1 year ago

    The only temperature scale that isn’t arbitrary is Kelvin, because it has an actual absolute zero. So 0 in Fahrenheit is the freezing point of a water and salt solution, which does have scientific application. What difference does it make? The compression of the Celsius scale forces you to use decimal points for any kind of reasonable daily use, which sucks. I am not saying Fahrenheit is better; I am saying neither is better. They are both arbitrary. It doesn’t matter which one you prefer, it’s just what you are used to. The other metric units are objectively superior and given to greater precision.
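
    To make the “compression” concrete, here’s a quick sketch in Python (the helper name f_to_c is just illustrative): a step of 1 °F corresponds to only 5/9 of a °C, so whole-number Fahrenheit readings often land between whole Celsius values.

        # One Fahrenheit degree spans 5/9 of a Celsius degree, so
        # whole-number Fahrenheit readings often map to fractional Celsius values.
        def f_to_c(f):
            """Convert degrees Fahrenheit to degrees Celsius."""
            return (f - 32) * 5 / 9

        print(f_to_c(71))  # 21.666... — a decimal where Fahrenheit used a whole number
        print(f_to_c(72))  # 22.222...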

    • Aram855@feddit.cl · 1 year ago

      What do you mean by decimal use? I live in a country that has always used °C, and never in our lives have we had to use decimal points for everyday purposes. Boilers, ovens, kettles, thermostats, and lab equipment all use whole numbers.

      • kurcatovium@lemm.ee · 1 year ago

        Yeah, basically the only time we encounter decimals in Celsius is when measuring body temperature.

      • yata@sh.itjust.works · 1 year ago

        That sounds strange. All digital Celsius thermometers I have encountered in my Celsius-using country, whether for measuring air, food, or body temperature, have decimals.

        However, being afraid of decimals also seems to be a specifically US thing. They really aren’t that big of a deal, and the only apparent reason to dislike them is unfamiliarity.

    • Rivalarrival@lemmy.today · 1 year ago (edited)

      There are multiple temperature scales that set zero at absolute zero. The “width” of their degrees varies arbitrarily. Because of this, Kelvin is still an arbitrary scale.
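
      Rankine is the usual example: it also puts zero at absolute zero but uses Fahrenheit-sized degrees. A minimal sketch of the relationship (just the standard 9/5 factor):

          # Both scales start at absolute zero; only the degree "width" differs.
          def kelvin_to_rankine(k):
              """Convert kelvins to degrees Rankine (same zero, 9/5 as many degrees)."""
              return k * 9 / 5

          print(kelvin_to_rankine(0))       # 0.0 — absolute zero on both scales
          print(kelvin_to_rankine(273.15))  # 491.67 — water's freezing point in °R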

      “The other metric units are objectively superior and given to greater precision.”

      Eh, I think you need to rethink that. Show me a measurement in metric, and I can show you a smaller measurement in US Customary. (The reverse is also true, of course. Any measurement I give you in decimal inches, you can show me a smaller one in metric). Both systems are capable of an arbitrary degree of precision. Precision is certainly not one of the benefits of metric.
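
      A throwaway sketch of that back-and-forth (nothing here is system-specific; it just moves the decimal point):

          # Given any length in millimetres, name a strictly smaller length in
          # inches: precision comes from decimals, not from the starting unit.
          MM_PER_INCH = 25.4

          def smaller_in_inches(mm):
              """Return a length in inches that is smaller than `mm` millimetres."""
              return (mm / MM_PER_INCH) / 10  # a tenth of the equivalent inch value

          print(smaller_in_inches(1.0))  # 0.0039... in, i.e. 0.1 mm — smaller than 1 mm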

      Metric is not “objectively” superior. Metric’s superiority is based on the subjective idea that base-10 scalability is a desirable quality. There are many, many reasons supporting that idea, but there are also certain circumstances for which base-10 is not particularly well suited. Scaling by a factor of 3, for example.

      The idea of base-10 scalability effectively prohibits the metrification of angular measurements: geometry is extraordinarily ugly when you need to represent 1/6th of a circle of 1, 10, 100, or 1000 degrees, since no power of 10 is evenly divisible by 6. When you can’t even represent an equilateral triangle’s angles without repeating digits, your system won’t be adopted for that use.
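
      The arithmetic is easy to check (a throwaway sketch): no power of 10 splits evenly into sixths, while 360 does.

          # 1/6 of a decimal circle always leaves repeating digits; 360 divides cleanly.
          for whole_circle in (10, 100, 1000, 360):
              print(whole_circle, whole_circle / 6)
          # 10 1.666..., 100 16.666..., 1000 166.666..., 360 60.0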

      Now, if we had evolved with 12 fingers, and developed a number system with 2 additional digits, 6 + 6 would equal 10. (6 + 6)^2 would be 100. We’d have an entirely different multiplication table, but a duodecimal metric system would be extraordinarily elegant. With 2 more fingers, we’d have metric clocks. Instead, we are stuck with some bastardized sexagesimal compatibility layer and everyone hates trigonometry.
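
      To see the duodecimal claim worked out (to_base12 is just an ad-hoc helper for illustration):

          # In base 12, twelve is written "10" and 144 is written "100",
          # so 6 + 6 = 10 and (6 + 6)^2 = 100 in duodecimal notation.
          DIGITS = "0123456789AB"

          def to_base12(n):
              """Render a non-negative integer in duodecimal."""
              if n == 0:
                  return "0"
              out = ""
              while n:
                  n, r = divmod(n, 12)
                  out = DIGITS[r] + out
              return out

          print(to_base12(6 + 6))         # 10
          print(to_base12((6 + 6) ** 2))  # 100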

    • anchr@lemmy.world · 1 year ago

      I would say Kelvin is arbitrary too. It puts 0 K at a natural, non-arbitrary point, but each increment of 1 K is the same as 1 °C, and thus just as arbitrary as the Celsius scale.
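
      The offset is the whole difference; a quick sketch: converting is just adding 273.15, so any temperature difference comes out identical on both scales.

          # Kelvin = Celsius + 273.15: the zero point moves, the degree size doesn't.
          def c_to_k(c):
              """Convert degrees Celsius to kelvins."""
              return c + 273.15

          print(c_to_k(100) - c_to_k(0))  # 100.0 — a 100 °C span is also a 100 K span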