  • stolid_agnostic · 75 points · 8 months ago

    lol it’s already out there on tens of millions of laptops, but I guess hubris is the way to go

  • conciselyverbose@kbin.social · 60 points · 8 months ago

    Apple already did, though. They even specifically replaced Intel chips because Intel’s offering was dogshit whose stupid power draw was destroying their ability to build the designs they wanted.

    The rest of ARM is behind, and Windows has done a shit job of ARM support, but that doesn’t mean that’s forever.

    • megopie@beehaw.org · 37 points · 8 months ago

      Windows also seems more concerned with going all in on cloud computing, the whole “you will own nothing and like it” paradigm. So making a faster and more efficient mobile platform probably isn’t a high priority for them.

        • megopie@beehaw.org · 8 points · 8 months ago

          Yah, I’m really not enthused with the idea of having to pay monthly rent for my computer’s ability to function.

          I wonder if Intel just values their existing experience with x86 more than any potential efficiency gains, since efficiency matters a lot less when the whole system is just a glorified screen and antenna.

          • entropicdrift@lemmy.sdf.org · 3 points · 8 months ago

            I’d say their recent trend of packing in E(fficiency)-cores alongside their previously standard P(erformance)-core design shows that they’re sensitive to, and reacting to, both the higher core counts of AMD and the greater efficiency of ARM.

          • flashgnash@lemm.ee · 1 point · 8 months ago · edited

            I’m really not sure even Microsoft could get away with that.

            The moment a subscription service comes into play for something people take for granted as free, they’ll start looking at alternatives. Chromebooks and MacBooks exist, and from what I hear Chromebooks are starting to become serious competition for Windows.

            Plus the Linux desktop is obviously getting more user friendly and is being preinstalled on laptops.

        • WetBeardHairs · 1 point · 8 months ago

          Them taking control away from me makes me not use them. Not a problem at all.

          • conciselyverbose@kbin.social · 3 points · 8 months ago

            I was never too deep in because I always hated everything about Windows UX, but I was stuck with them for gaming for a bit. Luckily Steam fixed that for pretty much everything I wanted to play but Madden (and after hours of it also not working on a separate Windows install I tried just for that purpose, I threw in the towel on that, too).

            The funny thing is I actually kind of like the idea of a thin client as a general rule. Not for gaming or anything else latency sensitive, but offloading heavy lifting is perfectly fine with me. Just not in a way I don’t have control of.

            • WetBeardHairs · 2 points · 8 months ago

              I’m stuck with it because of work. Luckily, “Industry 4.0” is completely fucking fed up with M$ and they’re abandoning Windows in droves. I’m just waiting for my vendor to finish polishing their MacOS and Linux alternatives.

    • V @beehaw.org · 22 points · 8 months ago

      Especially when it’s becoming increasingly obvious that Windows isn’t the future. Windows has maintained dominance because it is great at backwards compatibility. ARM erodes that advantage because of architectural differences, coupled with the difficulty and drawbacks of emulating x86 on ARM. Mobile is eating more and more market share, and devs aren’t making enterprise software for Windows like they used to.

      No one working on a greenfield project says “let’s develop our systems on Windows server” unless they already were doing that. Windows as a service is the more likely future, funneled by Azure.

      • catacomb@beehaw.org · 4 points · 8 months ago

        Even some shops working with Windows Server are asking “wait, why are we paying for these licenses?”

        Then it comes down to whether it’s cheaper to rewrite legacy applications or continue to pay for licenses.

        • V @beehaw.org · 2 points · 8 months ago

          My former employer made this decision recently. They moved off .NET and onto a web app with a RHEL server. Time will tell if they pull it off.

    • ripcord@kbin.social · 3 points · 8 months ago · edited

      Also, Chromebooks. The more powerful the CPUs get, the more of those will be purchased too.

      And low-end Windows laptops.

      Maybe not a giant piece of the current market, but definitely a dent as these more powerful CPUs come online.

  • cmnybo@discuss.tchncs.de · 43 points (1 downvote) · 8 months ago

    The problem with ARM laptops is all of the x86 Windows software that will never get ARM support, and all of the users who will complain about poor performance if an emulator is used to run that x86 software.

    Most Linux software already supports ARM natively. I would love to have an ARM laptop as long as it has a decent GPU with good open source drivers. It would need full OpenGL and Vulkan support, not that OpenGL ES crap, though.

      • w2tpmf@kbin.social · 19 points (1 downvote) · 8 months ago

        Windows has nothing to do with it. They are talking about software applications that were made for x86. Stuff like Adobe CC, etc.

        Windows runs on ARM (and has for a decade) and the apps available in the Windows app store run on ARM.

        • upstream@beehaw.org · 7 points · 8 months ago

          Apple has shown that the market could be willing to adapt.

          But then again, they’ve always had more leverage than the Wintel-crowd.

          But what people seem to ignore is that there is another option as well: hardware emulation.

          IIRC, old AMD CPUs, notably the K6, were actually a RISC core with a translation layer turning x86 instructions into the necessary chain of RISC instructions.

          That could also be a potential approach to making the swap. If 80% of your code runs natively and the remaining 20% passes through this hardware layer, where the energy loss is bigger than the performance loss, you might have a compelling product.
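
          As a rough illustration of that trade-off, here is a back-of-the-envelope sketch in Java. The 80/20 split comes from the comment above; the slowdown and energy factors are made-up assumptions, not measurements of any real chip.

          ```java
          // Back-of-the-envelope model: 80% of instructions run natively, 20% go
          // through a hardware translation layer. All factors below are
          // illustrative assumptions, not measurements.
          public class TranslationBlend {
              public static void main(String[] args) {
                  double nativeShare = 0.80;
                  double translatedShare = 0.20;
                  double translatedSlowdown = 1.5;    // assumed: translated code takes 1.5x the time
                  double translatedEnergyCost = 2.0;  // assumed: translated code costs 2x the energy

                  // Weighted averages over the instruction mix (native cost normalized to 1.0).
                  double avgTime = nativeShare * 1.0 + translatedShare * translatedSlowdown;
                  double avgEnergy = nativeShare * 1.0 + translatedShare * translatedEnergyCost;

                  System.out.printf("~%.0f%% slower and ~%.0f%% more energy than fully native%n",
                          (avgTime - 1.0) * 100, (avgEnergy - 1.0) * 100);
              }
          }
          ```

          With those assumed factors the blended result is only about 10% slower at about 20% more energy, which is the shape of trade-off the comment describes.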

          • DJDarren@thelemmy.club · 5 points · 8 months ago

            Apple has shown that the market could be willing to adapt.

            It’s less that they’ll adapt, and more that they don’t really care. And particularly in the case of Apple users: their apps are (mostly) available on their Macs already. The vast majority of people couldn’t tell you what architecture their computer runs on and will just happily use whatever works and doesn’t cost them the earth.

        • meseek #2982@lemmy.ca · 3 points · 8 months ago

          Except software applications like Adobe CC have supported ARM for nearly 5 years now. As does most software, because mobile exists (and mobile is exclusively ARM) and these days apps need to cover desktop, mobile, and web. ARM has essentially been forced on everyone because of mobile. Whether they like it or not, ARM is here to stay.

          But none of this is a technical limitation. It’s a political one. Companies like MS don’t care about the technology, they just care about moving in a way that gives them control so they can maintain and expand their monopoly through licensing and other restrictions.

      • anlumo@feddit.de · 6 points · 8 months ago

        Microsoft is actually pushing Windows on ARM right now, since their exclusivity deal with Qualcomm expired. This is going to get interesting.

    • anlumo@feddit.de · 14 points · 8 months ago

      Modern ARM GPUs already support OpenGL and Vulkan, so that’s not a problem. It’s just that some platforms chose to go with mobile APIs because they run Android.

      The trick Apple used for emulation was to add custom instructions to the CPU that the emulation layer uses to run x86_64 code efficiently. Nothing is stopping other CPU manufacturers from doing the same; the only issue is that they have to collaborate with the emulation developer.

    • interolivary@beehaw.org · 1 point · 8 months ago · edited

      Doesn’t Microsoft have something similar to Apple’s Rosetta 2 JIT x86 → ARM code translation kajigger? I could swear I’ve seen something like that mentioned.

      Edit: not sure whether it was WOW64 that I read about; that seems to only work for running 32-bit Intel code on ARM (although I have no idea whether that’s actually a problem when running modern Windows binaries, the last Windows I ran was Vista).

      • aard@kyu.de · 1 point · 8 months ago

        They have, and in my experience it works nicer than Rosetta.

        Windows 10 had it limited to 32bit binaries (but Windows 10 on ARM is generally very broken). Windows 11 can handle both 32 and 64bit emulation.

          • aard@kyu.de · 2 points · 8 months ago

            I don’t want to go into too much detail - from a high-level perspective, the Windows version integrates better into the overall system. In Rosetta, once you’re in the emulation layer it can be rather complicated to execute native components from there. In Windows - with some exceptions - that’s not a problem.

  • anlumo@feddit.de · 26 points · 8 months ago

    This sounds a lot like when Steve Ballmer wasn’t worried about the iPhone at all.

  • Pantherina@feddit.de · 24 points · 8 months ago

    I hope RISC-V will make it. Even though, idk? But it literally has none of the weird proprietary shit ARM has, and it actually makes sense.

    Moving away from x86_64 is important, even for the environment.

    • happyhippo@feddit.it · 6 points · 8 months ago

      My hopes for RISC-V are higher than I like to admit.

      I really hope it goes mainstream and gives us the benefits of ARM along with the awesomeness of its open nature.

    • taanegl@beehaw.org · 1 point · 8 months ago

      Much like the open source movement before it, the open hardware movement will have a slow crawl to a bare victory.

      It’ll first be used a lot by labs, embedded applications and general infrastructure, far away from the consumer space with only a little bit of overlap.

      Then, hopefully, some new Apple-like company manages to slam dunk their presentation and introduction to market, effectively disrupting the market - in a good way.

      Follow me for more hopeful divining. We’ll have the shaking of sticks, a dead goat boy, and symbols written on the floor.

      Bring candles.

  • Peter Willemsen@lemmy.emerald.show · 24 points · 8 months ago · edited

    I replaced my old Intel Core i7 HP ProLiant server with an Odroid M1 (ARM-based), and it consumes 2 watts compared to the 72 that the Intel server did.

    The only thing I can’t run on it is my Minecraft server; it runs everything else perfectly. Even the Lemmy instance of this account is powered by the same server! And what’s more, it basically runs for free, since solar generates enough power for the server to consume, even when it’s cloudy.
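
    For scale, a rough back-of-the-envelope on that 72 W → 2 W swap (the electricity price below is an assumed figure, not something stated in the post):

    $$(72 - 2)\,\mathrm{W} \times 8760\,\mathrm{h/year} \approx 613\,\mathrm{kWh/year} \;\;(\approx 184\ \text{€/year at an assumed }0.30\ \text{€/kWh})$$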

    Yes, I believe Intel should be afraid.

    • Jears@social.jears.at · 2 points · 8 months ago

      I run my Lemmy instance on a Pine64 Quartz64, which uses an RK3566. It runs really well and power consumption is totally negligible; I haven’t noticed any increase in my power bill since it’s been running.

      • beefcat@beehaw.org · 4 points · 8 months ago · edited

        that must be the reason seeing as java is available for just about everything these days

        modern arm socs are impressive but i seriously doubt that 2 watt chip is beating the 75 watt chip it replaced

        • Peter Willemsen@lemmy.emerald.show · 1 point · 8 months ago

          The server was second-hand, with 32GB of RAM and 2 i7 CPUs; it was made in 2015, so quite old. The Odroid has only 8GB of RAM, but for my purposes that’s enough, and given the power it saves it’s absolutely a bargain!

          If I ever need that much memory again I can just temporarily spin up something more powerful; for all other 24/7 tasks I can keep the efficient ARM server running.

          • beefcat@beehaw.org · 3 points · 8 months ago · edited

            it’s great that the new machine suits your needs with so little power. whatever gets the job done with the least energy and cost is almost always the best option.

            we are just questioning whether its performance is truly comparable with the old one, because arm cannot replace x86 on performance per watt alone; many applications need more performance regardless of wattage. i think your old machine was overkill for your use case

            • Peter Willemsen@lemmy.emerald.show · 1 point · 8 months ago

              Yeah, makes sense! It probably doesn’t, although I have no benchmarks to prove it; it’s just enough for me. I know this much though: even if the x86 server had the same specs (RAM, GHz) as the ARM version, it would likely still draw more power.

        • Dark Arc@social.packetloss.gg · 1 point · 8 months ago · edited

          Well, Java can call into native code. I’m pretty sure Mojang isn’t doing that sort of thing, but I wasn’t entirely sure they weren’t depending on a subset of the JVM or a native library that’s a de facto standard in the x86 world.
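
          As a minimal sketch of what that dependency looks like (the library name “fastnoise” is hypothetical, not something Mojang actually ships): a class that loads a native library is tied to whatever architectures that library was compiled for, even though the rest of the program is plain bytecode.

          ```java
          // Why "it's Java" doesn't automatically mean "it runs on ARM":
          // System.loadLibrary resolves a platform- and architecture-specific binary
          // (libfastnoise.so / fastnoise.dll here; the name is made up for illustration).
          // If that library only ships x86_64 builds, this class throws
          // UnsatisfiedLinkError on an ARM JVM even though the bytecode is portable.
          public class NativeDependencyExample {
              static {
                  System.loadLibrary("fastnoise"); // hypothetical native dependency
              }

              // Implemented in the native library, not in Java bytecode.
              public static native double noise(double x, double z);

              public static void main(String[] args) {
                  System.out.println(noise(1.0, 2.0));
              }
          }
          ```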

            • Dark Arc@social.packetloss.gg · 2 points · 8 months ago

              I don’t remember Minecraft server edition relying on native binaries, but it’s been a while since I last ran it; maybe Mojang changed it.

              Same, and it sounds like it doesn’t… I just wasn’t sure. You can also run into things like “I never realized this was using… glibc… which is on every x86 Linux computer.” I don’t think that’s happened either, though.

              Programs like Box86 and Box64 can efficiently make native calls work out if there are native equivalents available, and there’s always qemu-static if that fails.

              Interesting, good to know.

      • Peter Willemsen@lemmy.emerald.show · 2 points · 8 months ago

        Yeah, it’s mostly performance related. I have like 10 different websites running all at once, and while CPU and RAM aren’t at 100% all the time, under a heavy load I don’t have enough free to handle it.

  • melroy@kbin.melroy.org · 23 points · 8 months ago · edited

    Well, they should be afraid. I want an ARM Linux laptop as well. Or even better, RISC-V! Yes plz… THE WORLD NEEDS RISC-V, yesterday.

  • flatbield@beehaw.org · 19 points · 8 months ago

    I think I remember Intel saying that 64-bit on the desktop was not needed. They are great at making meaningless predictions, it seems.

  • figaro@lemdro.id · 14 points · 8 months ago

    My M1 MacBook Air is hands down the most incredible laptop I’ve ever owned. I’ve had it for 3-ish years now and it just doesn’t fucking stop. Battery life is still amazing and it runs just as fast as it did on day 1.

    I’ve NEVER had that experience with any Intel/PC laptop, ever. Honestly I’m never going back.

    • Paranoid Factoid@beehaw.org · 13 points · 8 months ago

      If Intel or AMD ever bolted RAM straight onto the CPU the way Apple does with ARM, their CPUs would offer similar performance and battery life. Apple gets its performance gains from increased I/O bandwidth, not radical design. But there is a tradeoff in reduced expandability, which may well be worth it in the laptop space, but not for desktops and workstations.

      • beefcat@beehaw.org · 8 points · 8 months ago · edited

        that is a big part of the performance, but the battery life savings also come from clever chip design and the fact that TSMC has been ahead of Intel on feature size for years now.

  • lemillionsocks@beehaw.org · 11 points · 8 months ago

    They’re of course exaggerating a little and speaking confidently because they’re in the business of selling a product, not in the business of trash talking what they sell or reducing confidence in their product.

    That said, the M1/M2 silicon battery life gains were a huge leap forward when they first launched, but in terms of battery efficiency and power AMD has been nipping at their heels, and in due time Intel will likely get its stuff together and join them. You can already get Ryzen laptops efficient enough and cool-running enough that the fan is off during most light usage, and they can get hours into the mid to high teens on some models.

    Likewise, even Macs will start to drain quite a bit when, say, watching an HD video at 1.75x speed, playing a video game, or encoding something using max CPU power. So while the Macs do have a performance-per-watt advantage, you’ll still need to be plugged in.

    And that’s the BEST of ARM versus Intel and AMD as they catch up. Samsung, Google, and Qualcomm don’t really have anything like the M2 at play, and while Qualcomm is rumored to be close, the Samsung-fabbed chips definitely aren’t.

    So as things are, the death of Intel and AMD has been greatly exaggerated, in part due to a combination of the usual Apple hype and that hype being VERY VERY justified this go around.

    • abhibeckert@beehaw.org · 3 points · 8 months ago · edited

      Likewise even macs will start to drain quite a bit when say watching an hd video 1.75x speed, or playing a video game

      That’s not my experience. I can play demanding games (CPU/GPU flat out) for several hours on battery on my Mac, and it only has a 50Wh battery.

      With “normal” use I get about 18 hours on a charge.

      I generally charge it overnight, like a phone, except I don’t do it every night. I often don’t even have access to a charger for days at a time; a laptop charger isn’t part of my normal travel kit. If I notice the battery “running low”, that means I need to find a charger in, like, five hours’ time.

      The high-end MacBook Pros, with a 12-core CPU and 38-core GPU… yeah, those can draw a lot of power. In fact they can even drain the battery while plugged into a charger if you really push them. But I don’t think of those as “proper” laptops. They’re more like a portable desktop.

      • lemillionsocks@beehaw.org · 3 points · 8 months ago

        A demanding game on a MacBook Air M2 will still draw close to 30 watts, and while that is actually still good for a laptop relative to the output, and you can probably improve it by tweaking in-game settings, it’s still going to suck power out of a 50Wh battery.

        Steam Decks also run an efficient Ryzen APU that lets them play games for 2-8 hours depending on how things are tweaked. Likewise, on my 39Wh Ryzen ThinkPad (the Intel one got a 59Wh battery, don’t get me started on that nonsense) I can get 8-12 hours of normal browsing depending on usage.

        This isn’t to take down the M1 and M2. They are definitively more powerful, they’re definitively more efficient; I’m not disputing that. But the gap isn’t as huge as it was when the M1 launched.

        • abhibeckert@beehaw.org · 2 points · 8 months ago · edited

          I’m on an M1 MacBook Air - Anandtech measured between 11 and 17 watts with an M1 Mac Mini.

          However, the Mac Mini has an excessively large cooling system for the chipset it runs (before Apple Silicon, they sold the same Mac with an Intel i7 that turbo boosted to 4.6GHz).

          The MacBook Air has basically no cooling at all and it definitely throttles under high load. It’s still fast enough to get 60fps with good graphics settings while throttled for the games I play - I’d say it’s about on par with my gaming PC that has an entry-level Nvidia GPU - but there’s no way it’s drawing as much power as in Anandtech’s testing of an actively cooled chip.

          Based on the battery life I’m getting, I’d guess it’s drawing somewhere around 8 watts on average while playing games. It’s a very efficient chip… it draws 0.2 watts while idle according to Anandtech testing. Remember, this family of chips started life on devices with a 10Wh battery and the MacBook Air isn’t much faster than an iPhone.
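
          As a sanity check, those figures are at least self-consistent with the 50 Wh battery and the roughly 18 hours of normal use mentioned above (simple arithmetic, not a measurement):

          $$\frac{50\,\mathrm{Wh}}{8\,\mathrm{W}} \approx 6\,\mathrm{h}\ \text{of gaming}, \qquad \frac{50\,\mathrm{Wh}}{18\,\mathrm{h}} \approx 2.8\,\mathrm{W}\ \text{average for light use}$$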

          • mudeth@lemmy.ca · 1 point · 8 months ago

            You are absolutely right about efficiency. Even the (less efficient) M2 is way better than the 6800U, for example, under single-threaded load: it’s ~5W vs ~15W, with the 6800U around 3 times as power hungry as the M2 while performing slightly worse.

            The M1 is around 25% more efficient than the M2.

            • abhibeckert@beehaw.org · 1 point · 8 months ago · edited

              The M1 is around 25% more efficient than the M2.

              No, that’s not right. The M2 is far more efficient. Third-party tests report the M2 MacBook Air lasts up to 3 hours longer than the M1.

              Yes, it draws more power under peak load… but it more than makes up for that with better performance, allowing it to return to an idle state more quickly. Give an M1 and an M2 the same task, and the M2 will draw less power to get the task done.

              • mudeth@lemmy.ca · 1 point · 8 months ago

                Your original discussion with @lemillionsocks@beehaw.org was about power usage while gaming, and the corresponding worst-case battery life. I was referring to this as efficiency.

                I understand now that the term was misleading. The M1 is 25% more frugal than the M2 under worst-case load.

    • SenorBolsa@beehaw.org · 1 point · 8 months ago · edited

      Yeah, I hope so, but as a public company they also can’t just lie about the direction they think they’re headed like that. With the kind of progress translation has made, it just seems inevitable that the switch will happen for lower-power consumer devices at least (lower power being relative to a high-end workstation). It will be interesting to see if maybe this means a pivot to commercial-only products.

  • thingsiplay@kbin.social · 10 points · 8 months ago

    My hope, no… dream, is that we get both ARM and x86 compatible chips on the same motherboard one day. Of course the operating system would need to support both architectures. Then they could run ARM binaries directly without any major compatibility or performance hit, and without the need for recompilation.

    A man can only hope. Is this something that could happen? Technically it should be possible, but realistically, probably not.

    • u_tamtam@programming.dev · 6 points · 8 months ago

      But then you end up with the downsides of having both and none of the upsides? Wouldn’t that incur an enormous effort on the software side to make it all possible, just so you could run a less efficient chip in the end (practically two instead of one)?

      • thingsiplay@kbin.social · 5 points · 8 months ago

        Having compatibility with legacy software is a pretty big upside. Either you use an application that runs power efficiently, or maybe the entire operating system uses the power-efficient ARM cores by default, and then for compatibility or for faster calculation (games?) the x86 cores could be used. Intel already does two different kinds of cores, performance and efficiency cores, and smartphones have something similar too. I imagine this would be expensive and it is not for everyone. And who knows what other cutbacks and drawbacks it would require.

    • Pantherina@feddit.de · 5 points · 8 months ago

      I think that’s a pretty unmotivated approach. Imagine every invention not replacing previous ones, just getting piled on top of each other?

    • Dark Arc@social.packetloss.gg · 2 points · 8 months ago

      This isn’t all that different from having a coprocessor. I don’t think it’s very useful to have an ARM or x86 coprocessor though, because the major benefit of ARM is lower power consumption… adding in a whole coprocessor is just going to increase power consumption.

      Things like Rosetta are probably the better way.

      Or maybe we see Java/the JVM make a comeback. This is the exact sort of world Java was built for. It just turned out that right around the time Java was taking off, everyone basically went for Windows and x86 chips… which became the de facto standard.

      Granted, at this point, folks would probably be going for more WASM (in browser or not) than JVM.

      • thingsiplay@kbin.social · 1 point · 8 months ago

        But Java and WASM don’t solve the compatibility issue on ARM. Games and other programs built for x86 are still something people want to run on ARM machines. That’s what compatibility layers and emulators are built for. And having a dedicated CPU would help with that. And if you do not use the x86 “extension”, then you won’t pay for its power consumption. And if you aren’t interested in x86, then you simply don’t buy a dual-architecture motherboard.

        I’m not looking at this from the perspective of laptops or handhelds BTW, but from the perspective of desktop PCs. Overall I think it’s not practical to have them both on a single motherboard. But you know, the industry is full of impractical ideas, so it’s not unimaginable this could be reality someday. Maybe just for a small audience.

        • Dark Arc@social.packetloss.gg · 1 point · 8 months ago

          I’m not looking at this from the perspective of laptops or handhelds BTW, but from the perspective of desktop PCs.

          I guess I don’t see ARM taking off on the desktop anytime soon. Everything is still going to be released with x86 binaries for the foreseeable future.

          And having a dedicated CPU would help with that. And if you do not use the x86 “extension”, then you won’t pay for its power consumption. And if you aren’t interested in x86, then you simply don’t buy a dual-architecture motherboard.

          You’d still have power draw, just not as much. Maybe for desktop it could be worth it, but I’d say a good emulator/translator would be a better option for most people.

          Rosetta, from what I understand, can do ahead-of-time translation, which should get you pretty close to a usable result for the vast, vast majority of software. The exception would be things like games, of course.

  • Dark Arc@social.packetloss.gg · 9 points · 8 months ago

    It’s possible this is a result of improvements Intel is planning for their x86 chips. They’ve already mirrored the efficiency and performance core designs that AFAIK originated in ARM.

    In a way, this might be Intel making a prediction based on how, years ago, Intel launched an x86 replacement and AMD launched x86-64… and AMD won because people didn’t want to rebuild all their software / couldn’t get their software rebuilt.

    • sanzky@beehaw.org · 7 points · 8 months ago

      yeah, but back then it was not 90% web apps. also, programming languages are way better at supporting both platforms now. ARM is far from being a little player anymore

      • Dark Arc@social.packetloss.gg · 1 point · 8 months ago

        That’s true, but Windows ARM and Linux desktop ARM are still pretty niche.

        The web apps thing definitely makes this a lot easier for ARM to takeoff in the PC segment. Though, a lot of those devices are pretty well served by Chromebooks … of which, I think many are already ARM.

    • erwan · 1 point · 8 months ago

      Their market share is up but still less than 10%.