• soulfirethewolf@lemdro.id

    I still hate that they killed the mid-range model. Your options are the lower-end MacBook Air with no fan or the higher-end MacBook Pro; there is no in-between.

    I absolutely love the snappiness of the M1 chip in my current 2020 MBP, and how much more efficient ARM is compared to x86, but it seems really hard to justify paying an extra $300 in the future.

    I really just wish they would bring back the original MacBook (with no suffixes at the end)

    • soulfirethewolf@lemdro.id

      I kind of want to go for the Framework laptop, but I still do like ARM, and given that I want to do more stuff around machine learning in the future, that matters: it’s already kind of difficult to run large language models with only 8 gigabytes of RAM, but they at least kind of run on ARM. On my basement PC, they will barely do anything.

      • tal@lemmy.today

        There are some external GPUs that can be USB-attached. Dunno about for the Mac. There’s a latency hit, but it’s probably not as significant for current LLM use as for games, since you don’t have a lot of data being pushed over the bus once the model is up.

  • Blackmist@feddit.uk

    Just upgrade the RAM yourself.

    Oh wait, you can’t because it’s 2023 and it’s become inexplicably acceptable to solder it to the motherboard.

      • Blackmist@feddit.uk

        Ah yes, it’s the SSD that’s soldered.

        Just 300 of your English pounds to upgrade from 512GB to 1TB.

        Meanwhile, a 2TB drive at PS5 speeds is under £100.

        For unupgradable kit, the pricing is grotesque.
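
        Putting rough numbers on that, using the prices quoted above (napkin math, not exact street prices):

        ```python
        # Apple's upgrade step vs. a retail drive, per the figures above
        apple_upgrade_gbp = 300                  # 512GB -> 1TB, i.e. +0.5TB
        apple_per_tb = apple_upgrade_gbp / 0.5   # ~£600 per additional TB
        retail_per_tb = 100 / 2                  # ~£100 for 2TB -> ~£50/TB
        print(apple_per_tb / retail_per_tb)      # ~12x the per-TB retail price
        ```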

          • Kidplayer_666@lemm.ee

            Actually no. There’s some pairing trickery going on at the SoC level, so if you swap the NAND chips for higher-capacity ones without Apple’s special sauce, you’ll just get an unbootable system.

            • DaDragon@kbin.social

              I was under the impression that had been solved by third parties? Or is chip cloning not enough?

            • Skirmish@beehaw.org

              And paging in & out of RAM frequently is probably one of the quickest ways to wear out the NAND.

              Put it all together and you have a system that breaks itself and can’t be repaired. The less RAM you buy the quicker the NAND will break.

        • NattyNatty2x4@beehaw.org

          Apple has put a lot of effort into (successfully) creating a customer base that thinks overpriced goods and different-colored text bubbles put them in a special club, so I’m not surprised that an exec thought this excuse would fly.

          • monsieur_jean@kbin.social

            It’s a bit more complex than that (and you probably know it).

            When you enter the Apple ecosystem you basically sign a contract with them: they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board. Their UX is excellent. Their support is good. Things work well, applications are easy to use and pretty stable and well built. And if they violate your privacy like the others, at least they don’t make the open-bar sale of your data their fucking business model (wink wink Google).

            Of course there’s a price to pay. Overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there’s that one thing that doesn’t work, no luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It’s a nice golden cage with soft pillows.

            I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment: MacBook, iPhone, iPad… I just don’t have time to “manage” my hardware anymore. Nor the urge to do it. I need things to work out of the box in a predictable way. I don’t want a digital mental load. Just a simple UX, consistency across my devices and good apps (and no Google, fuck Google). Something I wouldn’t have with an Android + PC setup. :)

            The whole “special club” argument is bullshit, and I hope we grow out of it. Neither the Apple nor the Google/Microsoft environments are satisfactory. Not even speaking of Linux and FOSS. We must aim higher.

            • rwhitisissle@beehaw.org

              I’m gonna have to argue against a few of these points:

              When you enter the Apple ecosystem you basically sign a contract with them : they sell you overpriced goods, but in exchange you get a consistent, coherent and well thought-out experience across the board.

              Consistent: yes. Every Apple device leverages a functionally very similar UI. That said, the experience is, in my opinion, not very coherent or well thought out. Especially if you are attempting to leverage their technology from the standpoint of someone like a Linux power user. The default user experience is frustratingly warped around the idea that the end user is an idiot who has no idea how to use a terminal and who only wants access to the default applications provided with the OS.

              Things work well

              Things work…okay. But try installing, uninstalling, and then reinstalling a MySQL DB on a macbook and then spend an hour figuring out why your installation is broken. Admittedly, that’s because you’re probably installing it with Homebrew, but that’s the other point: if you want to do anything of value on it, you have to use a third party application like Homebrew to do it. The fact that you have to install and leverage a third party package manager is unhinged for an ecosystem where everything is so “bundled” together by default.

              Of course there’s a price to pay. Overpriced products, limited UI/UX options, no interoperability, little control over your data. And when there’s that one thing that doesn’t work, no luck. But your day-to-day life within the Apple ecosystem IS enjoyable. It’s a nice golden cage with soft pillows.

              I guess the ultimate perspective is one in which you have to be happy surrendering control over so much to Apple. But then again, you could also just install EndeavourOS with KDE Plasma or any given flavor of Debian distribution with any DE of your choice, install KDE Connect on your PC and phone, and get 95 percent of the experience Apple offers right out of the box, with about 100x the control over your system.

              I used to be a hardcore PC/Linux/Android user. Over the last few years I gradually switched to a full Apple environment : MacBook, iPhone, iPad… I just don’t have time to “manage” my hardware anymore.

              I don’t know of anyone who would describe themselves as a hardcore “PC/Linux user,” or what this means to you. I’m assuming by PC you mean Windows. But people who are really into Linux generally don’t like MacOS or Windows, and typically for all the same reasons. I tolerate a Windows machine for video game purposes, but if I had to use it for work I’d immediately install Virtualbox and work out of a Linux VM. For the people who are really into Linux, the management of the different parts of it is, while sometimes a pain in the ass, also part of the fun. It’s the innate challenge of something that can only be mastered by technical proficiency. If that’s not for you, totally fine.

              The whole “special club” argument is bullshit, and I hope we grow out of it.

              It’s less an argument and more a general negative sentiment people hold towards Apple product advocates. You can look up the phenomenon of “green bubble discrimination.” It’s a vicious cycle in which the ecosystem works seamlessly for people who are a part of it, but Apple intentionally makes leaving that ecosystem difficult and intentionally draws attention to those who interact with the people inside of it who are not part of it. Apple products are also often associated with a higher price tag: they’re status symbols as much as they are functional tools. People recognize a 2000 dollar MacBook instantly. Only a few people might recognize a comparably priced ThinkPad. In a lot of cases, they’ll just assume the MacBook was expensive and the non-MacBook was cheap. And you might say, “yeah, but that’s because of people, not because of Mac.” But it would be a lie to say that Apple isn’t a company intensely invested in brand recognition and that it doesn’t know it actively profits from these perceptions.

              • monsieur_jean@kbin.social

                Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)

                You are right and make good points. But you are not 99% of computer users. Just considering installing a Linux distro puts you in the top 1% of the most competent.

                (Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in, my system is half broken because of infrequent updates. I could fix it, I just don’t have the motivation to do so. Or the time. I’ll probably just reinstall Mint.)

                • rwhitisissle@beehaw.org

                  Everything you say is what past me would have answered ten years ago, thinking current me is an idiot. Yet here we are. ;)

                  Wow. Talk about coincidences…

                  you are not 99% of computer users. Just considering installing a linux distro puts you in the top 1% most competent.

                  I’m a dumbass and if I can do it anyone can. But, yes, technology is a daunting thing to most people. Intuition and experience go far. That said, it’s literally easier today than it ever has been. You put in the installation usb, click next a whole bunch, reboot, and you have a working machine. Is it sometimes more complicated than that and you have to do BIOS/UEFI bullshit? Sure, but past that hurdle it’s smooth sailing.

                  (Speaking of which, I still have a laptop running EndeavourOS + i3. Three months in my system is half broken because of infrequent updates. I could fix it, I just don’t have the motivation to do so. Or the time. I’ll probably just reinstall Mint.)

                  Ah, the joys of rolling release distros. Endeavour has been stable for me so far. I’m running it on an X1 Thinkpad. Generally works more reliably than my own vanilla Arch installs and more low-profile tiling window managers. I’ve found myself sticking to KDE Plasma for a DE because it’s so consistent and has enough features to keep me happy without having to spend all my time fine-tuning my own UX, which I just don’t care about. My realization has been that Arch distros are best suited for machines running integrated graphics and popular DEs, rather than ones with separate cards and more niche or highly customizable DEs. It prevents you from having to futz about with things like Optimus, with graphics drivers being the primary cause of headaches for that distro, per my experience. That said, I used to run an old Acer laptop with Arch and a tiling window manager called qtile. Qtile was great, but every other update completely altered the logic and structure of how it read its config file, so the damn thing broke constantly. I’m like… just decide how you want the config to look and keep that. Or at least allow for backwards compatibility. But they didn’t.

    • RickRussell_CA@beehaw.org

      It’s not “inexplicable”.

      DIMM sockets introduce significant limitations on maximum bandwidth. On-package (SoC) RAM offers huge benefits in bandwidth improvement and latency reduction. Memory bandwidth on the M2 Max is 400GB/second, compared to a max of around 64GB/sec for DDR5 DIMMs.
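
      Rough napkin math for where figures like that come from (the DDR5 numbers below are illustrative assumptions, e.g. DDR5-6400, not anything Apple publishes):

      ```python
      # Peak theoretical bandwidth ~= transfer rate x bus width in bytes
      ddr5_6400_per_channel = 6400e6 * 8 / 1e9             # ~51 GB/s per 64-bit channel
      ddr5_6400_dual_channel = 2 * ddr5_6400_per_channel   # ~102 GB/s, typical laptop setup
      m2_max_unified = 6400e6 * 64 / 1e9                   # ~410 GB/s over a 512-bit LPDDR5 bus
      print(ddr5_6400_per_channel, ddr5_6400_dual_channel, m2_max_unified)
      ```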

      It may not be optimizing for the compute problem that you have, and that’s fine. But it’s definitely optimizing for compute problems that Apple believes to be high priority for its customers.

  • AutoTL;DR@lemmings.world [bot]

    🤖 I’m a bot that provides automatic summaries for articles:


    With the launch of Apple’s M3 MacBook Pros last month, a base 14-inch $1,599 model with an M3 chip still only gets you 8GB of unified DRAM that’s shared between the CPU, GPU, and neural network accelerator.

    In a show of Apple’s typical modesty, the tech giant’s veep of worldwide product marketing Bob Borchers has argued, in an interview with machine learning engineer and content creator Lin YilYi, that the Arm-compatible, Apple-designed M-series silicon and software stack is so memory efficient that 8GB on a Mac may equal to 16GB on a PC – so we therefore ought to be happy with it.

    With that said, macOS does make use of several tricks to optimize memory utilization, including caching as much data as it can in free RAM to avoid running to and from slower storage for stuff (there’s no point in having unused physical RAM in a machine) and compressing information in memory, all of which other operating systems, including Windows and Linux, do too in their own ways.

    Given a fast enough SSD, the degradation in performance associated with running low on RAM can be hidden to a degree, though it does come at the expense of additional wear on the NAND flash modules.

    We’d hate to say that Apple has designed its computers so that they perform stunningly in the shop for a few minutes, and work differently after a few months at home or in the office.

    His comment is also somewhat ironic in that much of the focus of YilYi’s interview with Borchers centered around the use of Apple Silicon in machine-learning development, which you don’t do in a store.


    Saved 71% of original text.

  • Chemical Wonka@discuss.tchncs.de

    8GB for this price in 2023 is a SCAM. All Apple devices are a SCAM. Many pay small fortunes for luxurious devices full of spyware, over which they have absolutely no control. It’s insane. They like to be chained in their golden shackles.

    • java@beehaw.org

      All Apple devices are a SCAM.

      True. Sometimes I look at the specs and prices of Apple devices while visiting large electronics stores. I don’t understand how people who aren’t rich can rationalize buying an Apple device. While it’s true that Windows has become increasingly plagued by invasive ads recently, and macOS seems like the only alternative for many, this issue is relatively recent. On the other hand, MacBooks have been overpriced for years.

    • Echo Dot@feddit.uk

      I bought a PC the other day and it only had 6 gigabytes of RAM, which is pathetic for what I paid for it, but there you go. The thing is, for a fraction of the price Apple is asking to upgrade theirs to 16, I upgraded mine to 32 gig.

      I honestly think I could upgrade it to 64 and still come in under the Apple price. They’re charging something like a 300% markup on commercially available RAM; it’s ridiculous.

      • SuperSpruce@lemmy.ml

        On storage, the markup is about 2000%.

        And on RAM if we compare to DDR5 (not totally fair because of how Apple’s unified memory works), it’s about 800% marked up.

    • trevron@beehaw.org

      Agree with you on the price, disagree with the sentiment. Unless you’re comparing to a Linux machine it is a bad take. You can do plenty to macOS, and it isn’t constantly trying to reinstall fucking OneDrive or hijack my search bar or reset my privacy settings after an update.

      But yeah, they can fuck off with the prices.

      • Chemical Wonka@discuss.tchncs.de

        I don’t trust macOS; its proprietary code obviously hides evil spying and control functions over the user. Apple has always been an enemy of the free software community because it is not in favor of its loyal customers but only its greedy shareholders. There is no balance; Apple has always adopted anti-competitive measures. And that’s just to say the least.

        • abhibeckert@beehaw.org

          Apple has always been an enemy of the free software community

          Apple is one of the largest contributors to open source software in the world, and they’ve been a major contributor to open source since the early 1980s. Yes, they have closed source software too… but it’s all built on an open foundation and they give a lot back to the open source community.

          LLVM, for example, was a small project nobody had ever heard of in 2005, when Apple hired the university student who created it and gave him an essentially unlimited budget to hire a team of more people. Fast forward almost two decades and it’s by far the best compiler in the world, used by both modern languages (Rust/Swift/etc.) and old languages (C, JavaScript, Fortran…), and it’s still not controlled in any way by Apple. The uni student they hired was Chris Lattner; he is still president of LLVM now even though he’s moved on (currently CEO of an AI startup called Modular AI).

          • Chemical Wonka@discuss.tchncs.de

            Well, look at the annual contribution that Apple makes to the BSD team and you’ll see that Apple uses a lot of open source software in its products but makes minimal financial contributions. Even more so for a company of this size. Apple only “donates” when it is in its interest that such software is ready for it to use.

        • PenguinTD@lemmy.ca

          It took EU legislation to force them to adopt a USB-C charging port. Their consumer base is their cash cow.

          • dan@upvote.au

            And even though they now have USB-C ports, it’s not even a proper USB 3 port, as the lower-end models only support USB 2 speeds (480Mbps max)!
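
            For a sense of what that means in practice, a rough sketch (the 10 GB file size is just an assumption for illustration):

            ```python
            # Theoretical best-case time to copy a 10 GB video off the phone
            file_bytes = 10e9
            usb2 = 480e6 / 8        # 480 Mbps -> ~60 MB/s
            usb3_10g = 10e9 / 8     # 10 Gbps (what the Pro models advertise) -> ~1.25 GB/s
            print(file_bytes / usb2 / 60)   # ~2.8 minutes at USB 2 speeds
            print(file_bytes / usb3_10g)    # ~8 seconds at USB 3 10 Gbps
            ```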

              • dan@upvote.au

                Lightning was also 480Mbps so I wonder if they just changed the port but kept most of the internals the same

                • Echo Dot@feddit.uk

                  They claim that the die that they use for the M1 chip doesn’t support USB 3 standards but the die that they use for the M1 Pro chip does.

                  Which is probably true, but they also made the chip so it’s not much of a defence.

          • Echo Dot@feddit.uk

            It’s not even USB 3 it’s USB 2 delivered via USB-C. Because that’s something everybody wants isn’t it, slow charging on a modern standard that should be faster and indeed is faster on every other budget Android phone.

          • Honytawk@lemmy.zip

            They already do so with apps.

            If Apple deems the app too old, then it won’t be compatible and is as useful as a brick.

            • Stormyfemme@beehaw.org

              You know, I have software on my PC old enough that I can’t run it even in compatibility mode; I’d need to spin up a VM, or a pseudo-VM like DOSBox, to run it. It’s not unheard of; it’s not even uncommon.

    • SkepticalButOpenMinded@lemmy.ca

      That’s too simplistic. For example, the entry level M1 MacBook Air is hands down one of the best value laptops. It’s very hard to find anything nearly as good for the price.

      On the high end, yeah you can save $250-400 buying a similarly specced HP Envy or Acer Swift or something. These are totally respectable with more ports, but they have 2/3rd the battery life, worse displays, and tons of bloatware. Does that make them “not a scam”?

      (I’m actually not sure what “spyware” you’re referring to, especially compared to Windows and Chromebooks.)

      • Echo Dot@feddit.uk

        The bloatware really isn’t an argument, because it takes all of 30 seconds to uninstall it all with a script that you get off GitHub. Yeah, it’s annoying and it shouldn’t be there, but it’s not exactly going to alter my purchase decision.

        The M1 is OK value for money, but the problem is that invariably you’ll want to do more and more complex things over the lifetime of the device (if only because basic software has become more demanding), and while it might be fine at first, it tends to get in the way 4 or 5 years down the line. You can pay ever so slightly more money and future-proof your device.

        But I suppose if you’re buying Apple you’re probably going to buy a new device every year anyway. Never understood the mentality personally.

        My cousin gets the new iPhone every single year, and he was up for it at midnight as well. I don’t understand why, because it’s not better in any noticeable sense than it was last year; it’s got a good screen and a nice camera, but so did the model 3 years ago. Apple customers are just weird.

        • janguv@lemmy.dbzer0.com

          But I suppose if you’re buying Apple you’re probably going to buy a new device every year anyway. Never understood the mentality personally.

          My cousin gets the new iPhone every single year, and he was up for it at midnight as well. I don’t understand why, because it’s not better in any noticeable sense than it was last year; it’s got a good screen and a nice camera, but so did the model 3 years ago. Apple customers are just weird.

          I think you’re basing your general estimation of the Apple customer on the iPhone customer a bit too heavily. E.g., I have never had an iPhone and wouldn’t ever consider buying one, considering how locked down and overpriced it is, and how competitive Android is as an alternative OS.

          Meanwhile, I’ve been on MacOS for something like 7 or so years and cannot look back, for everyday computing needs. I have to use Windows occasionally on work machines and I cannot emphasise enough how much of an absolute chore it is. Endless errors, inconsistent UX, slow (even on good hardware), etc. It is by contrast just a painful experience at this point.

          And one of the reasons people buy MacBooks, myself included, is to have longevity, not to refresh it after a year (that’s insane). It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later. My partner has a MacBook bought in 2014 and it still handles everyday tasks very well.

          • Echo Dot@feddit.uk

            It’s a false economy buying a Windows laptop for most people, because you absolutely do need to upgrade sooner rather than later.

            I think you missed my point.

            You want to keep laptops for ages regardless of what OS they run (really not sure how that would have any bearing on spec fall-off), but the MacBook M1 is only competitive now; it won’t be competitive in 4 to 5 years. The chip is good for its power consumption, but it isn’t a particularly high-performance chip in terms of raw numbers. Yet the laptop is priced as if it is a high-performance chip.

            There’s no such thing as a Windows laptop; you just buy a laptop and those are the specs you get, so I’m not quite sure what you’re comparing the MacBook to.

      • Chemical Wonka@discuss.tchncs.de

        I’m not referring to Windows or ChromeOS (which are full of spyware too). The first generation of M1 Macs had a reasonably more “accessible” price precisely to encourage users to migrate to ARM technology and consequently also encourage developers to port their software, not because Apple was generous. Far from it. Everything Apple does, in the short or long term, is to benefit itself.

        Not to mention that it is known that Apple limits both hardware and software on its products to force consumers to pay the “Apple Idiot Tax”. There is no freedom whatsoever in these products, true gilded cages. Thank you, but I don’t need it. Software and hardware freedom are more important.

        • SkepticalButOpenMinded@lemmy.ca

          I didn’t claim that Apple is doing anything to be “generous”. That seems like it’s moving the goal posts. Say, are other PC manufacturers doing things out of generosity? Which ones?

          Even the M2 and M3 Macs are a good value if you want the things they’re good at. For just a few hundred more, no other machine has the thermal management or battery life. Very few have the same build quality or displays. If you’re using it for real professional work, even just hours of typing and reading, paying a few extra hundred over the course of years for these features is hardly a “scam”.

          You didn’t elaborate on your “spyware” claim. Was that a lie? And now you claim it’s “known” that Apple limits hardware and software. Can you elaborate?

          • Chemical Wonka@discuss.tchncs.de

            MacBooks do have excellent screens, software integration and everything else, that’s a fact and I don’t take that away from Apple. But the problem is that it’s not worth paying for this in exchange for a system that is completely linked to your Apple ID, tracking all your behavior for advertising purposes and whatever else Apple decides. Privacy and freedom are worth more. If you can’t check the source code you can’t trust what Apple says, they can lie for their own interests. Have you ever read Apple’s privacy policy regarding Apple ID, for example? If not, I recommend it.

            • SkepticalButOpenMinded@lemmy.ca

              I think that decision makes sense.

              What you said got me worried, so I looked into the claim that it is “tracking all your behavior for advertising purposes and whatever else Apple decides”. That’s a convincing concern, and you’ve changed my mind on this. I don’t see any evidence that they’re doing anything close to this level of tracking — the main thing they seem to track is your Mac App Store usage — but they may have the potential to do so in the enshittified future. That gives me pause.

              • Stormyfemme@beehaw.org

                Apple has repeatedly stressed in the past that they’re privacy focused. While a major departure from that could absolutely happen, it feels a bit like borrowing trouble to assume it will happen soon. Google is an advertising company first, Microsoft is just a mess, but Apple is a luxury hardware producer; they have minimal reason to damage their reputation in a way that would make those sorts of consumers upset.

                Please note that I’m not saying it’s impossible, just unlikely in the near future.

                • SkepticalButOpenMinded@lemmy.ca

                  That assessment sounds right. I think we just need to stay vigilant as consumers. We have defeasible reason to trust Apple right now. But we’ve seen, especially recently, what happens when we let corporations take advantage of that hard earned trust for short term gain.

      • lemillionsocks@beehaw.org

        When compared to other professional-level laptops, the MacBooks do put up a good fight. They have really high-quality displays, which accounts for some of the cost, and of course compared to a commercial-grade laptop like a ThinkPad the prices get a lot closer (when the ThinkPads aren’t on sale, as they frequently are).

        That said, even then the M1 MacBook is over a thousand dollars after tax, and that gets you just 256GB of storage and 8GB of RAM. They’re annoyingly not as easy to find as Intel offerings, but you can find modern Ryzen laptops that can still give you screen-on time into the teens of hours for less, with way more RAM and storage space. The M1 is still the better chip in terms of power per watt and battery life overall, but then getting the RAM and storage up to spec can make it $700 more than a consumer-grade Ryzen.

        • java@beehaw.org

          They have really high quality displays which accounts for some of the cost and of course compared to a commercial grade laptop like a thinkpad

          Is that important for a professional laptop? I mean, if you use it for work every day, you probably want a screen that is at least 27 inches, preferably two. It should be capable of adjusting its height for better ergonomics.

          • abhibeckert@beehaw.org

            One of the features they highlighted is that the built-in display has very similar specs to their 6K 32" professional display (which, by the way, costs more than this laptop). So when you’re not working at your desk you’ll still have a great display (and why are you buying a laptop unless you occasionally work away from your desk?)

            • Both have a peak brightness of 1600 nits (a Dell XPS will only do ~600 nits, and that’s brighter than most laptops).
            • Both have 100% P3 color gamut (Dell XPS only gets to 90% - so it just can’t display some standard colors)
            • even though it’s an LCD, black levels are better than a lot of OLED laptops
            • contrast is also excellent
            • 120Hz refresh rate, which is better than their desktop display (that only runs at 60Hz, same as the Dell XPS)
            • 245 dpi (again, slightly better than 218 dpi on the desktop display, although you sit further away from a desktop… Dell XPS is 169 dpi)

            I love Dell displays, I’ve got two on my desk. But even the Dell displays that cost thousands of dollars are not as good as Apple displays.

  • narc0tic_bird@lemm.ee

    Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.

    Even if macOS were more lightweight than Windows - which might well be true, with all the bs processes running in Windows 11 especially - third-party multiplatform apps will use similar amounts of memory no matter the platform they run on. Even for simple use cases, 8 GB is on the limit (though it’ll likely still be fine), as Electron apps tend to eat RAM for breakfast. Love it or hate it, Apple: people often (need to) use these memory-hogging apps like Teams or even Spotify, and they are not native Swift apps.

    I love my M1 Max MacBook Pro, but fuck right off with that bullshit, it’s straight up lying.

    • abhibeckert@beehaw.org

      Do they store 32-bit integers as 16-bit internally or how does macOS magically only use half the RAM? Hint: it doesn’t.

      As a Mac programmer I can give you a real answer… there are three major differences… but before I go into those, almost all integers in native Mac apps are 64 bit. 128 bit is probably more common than 32.

      First of all, Mac software generally doesn’t use garbage collection. It uses “Automatic Reference Counting”, which is far more efficient. Back when computers had kilobytes of RAM, reference counting was the standard, with programmers painstakingly writing code to clear things from memory the moment they weren’t needed anymore. The automatic version of that is the same, except the compiler writes the code for you… and it tends to do an even better job than a human, since it doesn’t get sloppy.

      Garbage collection, the norm in modern Windows and Linux code, frankly sucks. Code that, for example, reads a bunch of files on disk might store all of those files in RAM for ten seconds even if it only needs one of them in RAM at a time. That can burn 20GB of memory and push all of your other apps out into swap. Yuck.

      Second, swap, while it’s used less (due to reference counting), still isn’t a “last resort” on Macs. Rather it’s a best practice to use swap deliberately for memory that you know doesn’t need to be super fast. A toolbar icon for example… you map the file into swap and then allow the kernel to decide if it should be copied into RAM or not. Chances are the toolbar doesn’t change for minutes at a time or it might not even be visible on the screen at all - so even if you have several gigabytes of RAM available there’s a good chance the kernel will kick that icon out of RAM.
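
      (A minimal sketch of that “map it and let the kernel decide” idea, in Python rather than what a real Mac app would actually use, with a hypothetical file name:)

      ```python
      import mmap

      # The file's contents are not copied into RAM up front; pages are faulted
      # in on access and can be evicted again under memory pressure.
      with open("toolbar_icon.png", "rb") as f:        # hypothetical resource file
          icon = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
          header = icon[:8]    # touching these bytes pages in only what's needed
      ```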

      And before you say “toolbar icons are tiny” - they’re not really. The tiny favicon for beehaw is 49kb as a compressed png… but to draw it quickly you might store it uncompressed in RAM. It’s 192px square and 32 bit color so 192 x 192 x 32 = 1.1MB of RAM for just one favicon. Multiply that by enough browser tabs and… Ouch. Which is why Mac software would commonly have the favicon as a png on disk, map the file into swap, and decompress the png every time it needs to be drawn (the window manager will keep a cache of the window in GPU memory anyway, so it won’t be redrawn often).

      Third, modern Macs have really fast flash memory for swap. So fast it’s hard to actually measure it, talking single digit microseconds, which means you can read several thousand files off disk in the time it takes the LCD to refresh. If an app needs to read a hundred images off swap in order to draw to the screen… the user is not going to notice. It will be just as fast as if those images were in RAM.

      Sure, we all run a few apps that are poorly written - e.g. Microsoft Teams - but that doesn’t matter if all your other software is efficient. Teams uses, what, 2GB? There will be plenty left for everything else.

      Of course, some people need more than 8GB. But Apple does sell laptops with up to 128GB of RAM for those users.

      • rasensprenger@feddit.de

        Almost all programs use both 32-bit and 64-bit integers, sometimes even smaller ones if possible. Being memory efficient is critical for performance, as L1 caches are still very small.

        Garbage collection is a feature of programming languages, not an OS. Almost all native Linux software is written in systems programming languages like C, Rust or C++, none of which have a garbage collector.

        Swap is used the same way on both Linux and Windows, but kicking toolbar items out of RAM is not actually a thing. It needs to be drawn to the screen every frame, so it (or a pixel buffer for the entire toolbar) will kick around in VRAM at the very least. A transfer from disk to VRAM can take hundreds of milliseconds, which would limit you to like 5 fps; no one retransfers images like that every frame.

        Also your icon is 1.1Mbit not 1.1MB
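
        (Quick sanity check on that arithmetic, using the figures quoted above:)

        ```python
        bits = 192 * 192 * 32      # 192x192 pixels at 32 bits per pixel
        print(bits / 1e6)          # ~1.18 megabits
        print(bits / 8 / 1024)     # ~144 KiB uncompressed, nowhere near 1.1 MB
        ```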

        I have a Gentoo install that uses 50MB of RAM for everything including its GUI. A web browser will still eat up gigabytes of RAM; the OS has literally no say in this.

      • narc0tic_bird@lemm.ee

        My 32/16 bit integer example was just that: an example where one was half the size as the other. Take 128/64 or whatever, doesn’t matter as it doesn’t work like that (which was my point).

        Software written in non-GC based languages runs on other operating systems as well.

        I used MS Teams as an example, but it’s hardly an exception when it comes to Electron/WebView/CEF apps. You have Spotify running, maybe a password manager (even 1Password uses Electron for its GUI nowadays), and don’t forget about all the web apps you have open in the browser, like maybe GMail and some Google Docs spreadsheet.

        And sure, Macs have fast flash memory, but so do PC notebooks in this price range. A 990 Pro also doesn’t set you back $400 per terabyte, but more like … $80, if even that. A fifth. Not sure where you got that they are so fast it’s hard to measure.

        There are tests out there that clearly show why 8 GB are a complete joke on a $1600 machine.

        So no, I still don’t buy it. I use a desktop Windows/Linux machine and a MacBook Pro (M1 Max) and the same workflows tend to use very similar amounts of memory (what a surprise /s).

    • Kazumara@feddit.de

      Pied Piper middle out compression for your RAM

      But seriously it’s so ridiculous especially since he said it in an interview with a machine learning guy. Exactly the type of guy who needs a lot of RAM for his own processes working on his own data using his own programs. Where the OS has no control over precision, access patterns or the data streaming architecture.

      • Echo Dot@feddit.uk

        Apple executives haven’t actually been computer guys for years now. They’re all sales and have no idea how computers work. They’re constantly saying stupid things that make very little sense, but no one ever calls them on it because Apple.

  • kbal@fedia.io

    With Apple’s new iBits™ the 0s are so much rounder and the 1s are so smooth and shiny that they’re worth at least twice as much as regular bits.

  • Erdrick@beehaw.org

    I looked at a few Lenovo and MS laptops to see what they are charging to jump from 8 to 16 GB.
    They are very close to what Apple charges.
    So, they are ALL ripping us off!

  • Paranoid Factoid@beehaw.org

    This is why I left the Mac platform and switched to Linux on Threadripper. Apple is just not being honest.

    The M series Mac is not suitable for performance computing. Outrageous prices and small memory sizes make the computer a toy for professional workloads.

    M gets its performance from RAM proximity to the CPU by connecting RAM and CPU chips together. This proximity lets them reduce the clock divisor and thereby increase total I/O bandwidth. Good idea for phones, tablets, and even laptops. Useless for high end workstations, where you might need 128-256GB. Or more.

    Also, one of these days AMD or Intel will bolt 8GB onto their CPUs too, and then they’ll squash M. Because per clock tick, ARM still isn’t as efficient as x86 at instruction execution. Just use it for cache. Or declare it fast RAM like the Amiga did.

    Apple has really snowed people about how and where they get their performance from M. It’s a good idea on the small end. But it doesn’t scale.

    This is why they haven’t released an M series Mac Pro.

    Nope, there is in fact a Mac Pro now. I stand corrected.

    • Overzeetop@beehaw.org

      a toy for professional workloads

      [rant]

      I think this is one of those words which has lost its meaning in the personal computer world. What are people doing with computers these days? Every single technology reviewer is, well, a reviewer - a journalist. The heaviest workload that computer will ever see is Photoshop, and 98% of the time will be spent in word processing at 200 words per minute or on a web browser. A mid-level phone from 2016 can do pretty much all of that work without skipping a beat. That’s “professional” work these days.

      The heavy loads Macs are benchmarked to lift are usually video processing. Which, don’t get me wrong, is compute intensive - but modern CPU designers have recognized that they can’t lift that load in general purpose registers, so all modern chips have secondary pipelines which are essentially embedded ASICs optimized for very specific tasks. Video codecs are now, effectively, hardcoded onto the chips. Phone chips running at <3W TDP are encoding 8K60 in realtime and the cheapest i series Intel x64 chips are transcoding a dozen 4K60 streams while the main CPU is idle 80% of the time.

      Yes, I get bent out of shape a bit over the “professional” workload claims because I work in an engineering field. I run finite element models and, while sparse matrix solutions have gotten faster over the years, it’s still a CPU-intensive process, and general (non-video) matrix operations aren’t really gaining all that much speed. Worse, I work in an industry with large, complex 2D files (PDFs with hundreds of 100MP images and overlain vector graphics) and the speed of rendering hasn’t appreciably changed in several years because there’s no pipeline optimization for it. People out there doing CFD and technical 3D modeling as well as other general compute-intensive tasks on what we used to call “workstations” are the professional applications which need real computational speed - and they’re/we’re just getting speed ratio improvements and the square root of the number of cores, when the software can even parallelize at all. All these manufacturers can miss me with the “professional” workloads of people surfing the web and doing word processing.

      [/rant]

      • Paranoid Factoid@beehaw.org

        So, one point I’ll make on the hardware assist you discuss is that it’s actually limited to very specific use cases. And the best way to understand this is to read the ffmpeg x264 encoding guide here:

        https://trac.ffmpeg.org/wiki/Encode/H.264

        The x265 guide is similar, so I won’t repeat it. But there is a dizzying range of considerations to make when cutting a deliverable file. Concerns such as:

        • target display. Is the display an old-style rec709 with 8 bits per color and SDR with about six and a half stops of dynamic range? Is it a rec2020, 10 bits per color, about eight stops? Is it a movie projector in a theater, with 12 bits per color and even more dynamic range? When producing deliverables, you choose output settings for the encode specific to the target display type.

        • quality settings. Typically handled in Constant Rate Factor (CRF) settings. If you’ve burned video files, you’ll know the lower the CRF number the higher the image quality. But the higher the image quality the lower the overall compression. It’s a tradeoff.

        • compression. The more computation put to compression the smaller the video file per any CRF setting. But also the longer it takes to complete the computation.

        This is only for local playback. Streaming requires additional tweaks. And it’s only for a deliverable file. In the production pipeline you’d be using totally different files which store each frame separately rather than compressing groups of frames, retain far more image data per frame, and are much less compressed or entirely uncompressed overall.

        The point of this is to highlight the vast difference in use cases placed on encoding throughout various stages in a project. And to point out for video production you care about system I/O bandwidth most of all.

        But hardware encode limits you to very specific output ranges. This is what the preset limitations are all about for, say, NVIDIA NVENC hardware-assisted H.264 in ffmpeg. The hardware devs select what they think is the most common use case, say YouTube as an output target (which makes network bandwidth and display type presumptions), and target their hardware accel for that.

        This means most of that marketing talk about hardware assist in M series chips and GPUs etc is actually not relevant for production work. It’s only relevant for cutting final deliverable files under specific use cases like YouTube, or Broadcast (which still wants 10bit ProRes).

        If you look at just x264 settings, the hardware accel presets are so limited most times you’d still be cutting with software encode. Hardware encode comes into play with real time, like streaming and live broadcast. The rest of the pipeline? All software.
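
        To make the software-vs-hardware split concrete, a minimal sketch (assumes ffmpeg is installed, an input.mov exists, and an NVIDIA card is present for the second command; exact preset names vary between ffmpeg builds):

        ```python
        import subprocess

        # Software x264 encode: full access to the CRF/preset trade-offs described above.
        subprocess.run([
            "ffmpeg", "-i", "input.mov",
            "-c:v", "libx264", "-preset", "slow", "-crf", "18",  # lower CRF = higher quality, bigger file
            "-c:a", "copy", "out_sw.mp4",
        ], check=True)

        # Hardware (NVENC) encode: much faster, but limited to the rate-control
        # modes and tunings the encoder block exposes.
        subprocess.run([
            "ffmpeg", "-i", "input.mov",
            "-c:v", "h264_nvenc", "-b:v", "10M",
            "-c:a", "copy", "out_hw.mp4",
        ], check=True)
        ```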

      • Paranoid Factoid@beehaw.org

        Well, you’re absolutely right that they’ve released a Mac Pro. Looking it over, the machine is still a terrible deal in comparison to Threadripper. The Mac Pro maxes out at 192GB RAM and 72 GPU cores, compared to Threadripper, which maxes out at 1.5TB RAM and enough PCIe lanes for four GPUs.

        From a price/performance standpoint, you could beat this thing with a lower-end Ryzen 5950X CPU, 256GB RAM, and two Nvidia 4080 GPUs at maybe $2,500-$3,000 less than the maxed-out Mac Pro.

        But I was wrong there. Thank you for the correction.

        NOTE: A 64-core Threadripper with 512GB and four 4090 GPUs would be suitable for professional machine learning tasks. Better GPUs in the pro space cost much more, though. A 16-core 5950X, 256GB, two 4090 GPUs and PCIe SSD RAID would do to edit 8K/12K raw footage with color grading and compositing in DaVinci Resolve or Premiere/Ae. It would be a good Maya workstation for feature or broadcast 3D animation too.

        That Mac Pro would make a good editing workstation in the broadcast / streaming space, especially if you’re using Final Cut and Motion, but is not suitable in the machine learning space. And I wouldn’t choose it for Davinci Resolve as a color grading station. The Mac XDR 6k monitor is not suitable for pro color grading on the feature side, but would probably be acceptable for broadcast / streaming projects. On the flip side, a Pro color grading monitor is $25K to start and strictly PC anyway.

        • monsieur_jean@kbin.social

          The Apple M series is not ARM based. It’s Apple’s own RISC architecture. They get their performance in part from the proximity of the RAM to the GPU, yes. But not only. Contrary to ARM that has become quite bloated after decades of building upon the same instruction set (and adding new instructions to drive adoption even if that’s contrary to RISC’s philosophy), the M series has started anew with no technological debt. Also Apple controls both the hardware to the software, as well as the languages and frameworks used by third party developers for their platform. They therefore have 100% compatibility between their chips’ instruction set, their system and third party apps. That allows them to make CPUs with excellent efficiency. Not to mention that speculative execution, a big driver of performance nowadays, works better on RISC where all the instructions have the same size.

          You are right that they do not cater to power users who need a LOT of power though. But 95% of the users don’t care, they want long battery life, light and silent devices. Sales of desktop PCs have been falling for more than a decade now, as have the investments made in CISC architectures. People don’t want them anymore. With the growing number of manufacturers announcing their adoption of the new open-source RISC-V architecture I am curious to see what the future of Intel and AMD is. Especially with China pouring billions into building their own silicon supply chain. The next decade is going to be very interesting. :)

          • barsoap@lemm.ee

            The Apple M series is not ARM based. It’s Apple’s own RISC architecture.

            M1s through M3s run ARMv8-A instructions. They’re ARM chips.

            What you might be thinking of is that Apple has an architectural license; that is, they are allowed to implement their own logic to implement the ARM instruction set, not just permission to etch existing designs into silicon. Qualcomm, NVidia, Samsung, AMD, Intel all hold such a license. How much use they actually make of that is a different question; e.g. AMD doesn’t currently ship any ARM designs of their own, I think, and the platform processor that comes in every Ryzen etc. is a single “barely not a microcontroller” (Cortex A5) core straight off ARM’s design shelves. K12 never made it to the market.

            You’re right about the future being RISC-V, though; ARM pretty much fucked themselves with that Qualcomm debacle. Android and Android apps by and large don’t care what architecture they run on, RISC-V already pretty much ate the microcontroller market (unless you need backward compatibility for some reason, heck, there’s still new Z80s getting etched), and Android devices are a real good spot to grow. Still going to take a hot while before RISC-V appears on the desktop proper, though – performance-wise, server loads will be first, and for sitting in front of, office thin clients will be first. Maybe, maybe, GPUs. That’d certainly be interesting, the GPU being simply vector cores with a slim insn extension for some specialised functionality.

            • monsieur_jean@kbin.social

              Thanks for the clarification. I wonder if/when Microsoft is going to hop on the RISC train. They did a crap job trying their hand at an ARM version a few years back and gave up. A RISC Surface with a compatible Windows 13 and a proper binary translator (like Apple did with Rosetta) would shake the PC market real good!

          • skarn@discuss.tchncs.de

            The whole “Apple products are great because they control both software and hardware” argument always made about as much sense to me as someone claiming “this product is secure because we invented our own secret encryption”.

            • anlumo@feddit.de

              Here’s an example of that: Apple needed to ship an x86_64 emulator for the transition, but emulation is slow and would thus make the new machines appear much slower than their older Intel-based ones. So what they did was come up with their own private instructions that an emulator needs to greatly speed up its task, and they added them to the chip. Now most people don’t even know whether they run native or emulated programs, because the difference in performance is so minimal.

        • Exec@pawb.social

          When you’re nearing the terabyte range of RAM you should consider moving your workload to a server anyway.

          • Paranoid Factoid@beehaw.org

            It really depends on the kind and size of data you’re moving. A system bus is a whole lot faster than 10Gb networking. If your data is small and the workload heavy, say a Monte Carlo sim, clusters and cloud make sense. But reverse that, like 12K 14-bit raw footage, which makes massive files, and you want that shit local and striped across a couple of M.2 drives (or more). Close is fast.
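
            Rough numbers for why that kind of footage wants local NVMe rather than the network (the resolution and frame rate below are assumptions for illustration):

            ```python
            # Uncompressed data rate for ~12K, 14-bit raw footage (illustrative figures)
            width, height = 12288, 6480
            bits_per_photosite = 14
            fps = 24
            frame_bytes = width * height * bits_per_photosite / 8   # ~139 MB per frame
            rate_gb_s = frame_bytes * fps / 1e9                     # ~3.3 GB/s sustained
            print(frame_bytes / 1e6, rate_gb_s)
            # well beyond 10Gb Ethernet (~1.25 GB/s), but fine for a couple of striped M.2 drives
            ```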

    • Echo Dot@feddit.uk

      Also, one of these days AMD or Intel will bolt 8GB on their CPUs too, and then they’ll squash M.

      I can’t remember who it is, but somebody is already doing this. It’s primarily marketed as an AI training chip, though, so basically only Microsoft and Google are able to buy them; even if you had the money, there isn’t any stock left.

      • Tak@lemmy.ml

        At consumer prices. There’s no way Apple doesn’t pay wholesale rates for memory.

        • TonyTonyChopper@mander.xyz

          they have the memory controllers built into their processors now. So adding memory is even cheaper, it just takes the modules themselves

  • jcrm@kbin.social

    In my entirely anecdotal experience, MacOS is significantly better at RAM management than Windows. But it’s still a $1,600 USD computer, and 16GB of RAM costs nearly nothing, it’s just classic Apple greed.

      • jcrm@kbin.social

        The main metric has been with Adobe apps. 2017 Macs with 8GB of RAM are still able to run Premiere and a few other things smoothly at the same time and keep going; Windows machines with the same config were crashing constantly.

        But I’m still not defending Apple here. It’s been 6 years, and their base level MacBook still ships with the same amount of RAM.

    • meseek #2982@lemmy.ca

      It’s not anecdotal in the least. It’s been widely tested. There’s a reason an M1 Mac mini with 8GB of RAM can load and fully support over 100 tracks in Logic Pro. The previous Intel machines would buckle with just a few.

      ARM is not comparable to x86-64. The former is totally unified, the latter totally modular.

    • WashedOver@lemmy.ca

      I’m also under the impression the M-powered MacBooks are much better at thermal management and battery usage than the PC equivalents?

  • Eggyhead@kbin.social

    16gb is always better, and I usually recommend it to people looking to buy a Mac, but they aren’t wrong about Macs handling RAM more efficiently. They still sound arrogant af when using that as their excuse, though.

    • Helix 🧬@feddit.de

      they aren’t wrong about Macs handling RAM more efficiently.

      More efficiently than what other system? How did you come to that conclusion? If you open tabs in your browser, do you think MacOS will allow you to open more tabs than other operating systems?

      • JustARegularNerd@aussie.zone

        Just from my observations of owning a 2015 MBP with 8GB of memory, it is easy to be fooled into thinking memory management is much better on macOS, because you can effectively have more open than you would on an equivalent Windows laptop with 8GB of memory.

        From what I understand, though, the SSD is used as swap to compensate a lot more than on Windows, and I believe this is causing a lot of e-waste, with the M1 Macs in particular being effectively binned because their soldered SSDs are worn out from swapping.