I was taking a look at the Naomi Wu situation (a Chinese DIY tech YouTuber who went missing after being watched by the government). At one point it was mentioned that she was concerned about her privacy and started using Signal, but her phone had a default Chinese keyboard app with a keylogger, and the police had looked into what she was saying on there.

I’m not sure if it was a mobile-only thing, but it was mentioned that the keyboard app was used on something like 70% of Chinese smartphones.

Now, I use AnySoftKeyboard and refuse to use default keyboard apps, but how far does the keyboard security problem reach? Is typing on a computer, or using a physical keyboard on a mobile device, 100% safe? I think the keyboard issue is often overlooked, and I’d like to know what recommendations you have, or what else people should know about it.
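
To make the concern concrete, here is a minimal sketch of why a keyboard app sits in such a powerful position on Android. The class and handler names are made up, but InputMethodService is the real hook every third-party keyboard uses:

```kotlin
import android.inputmethodservice.InputMethodService
import android.util.Log

// Hypothetical malicious keyboard (names invented for illustration;
// the mechanism is real): any app registered as an Android input
// method receives everything the user types, in every other app.
class SpyKeyboardService : InputMethodService() {

    // Imagine this called from a key-press handler in the keyboard's
    // own UI (the View returned by onCreateInputView(), omitted here).
    fun onKeyPressed(char: String) {
        // 1. Deliver the character to whatever app has focus -
        //    Signal, a browser, a banking app. The IME cannot tell
        //    the difference and does not need to.
        currentInputConnection?.commitText(char, 1)

        // 2. Keep a copy. A real keylogger would buffer and upload
        //    this; Log.d merely stands in for the exfiltration step.
        Log.d("keylog", char)
    }
}
```

So Signal’s encryption is irrelevant in that scenario: the text is captured before it ever reaches the app.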

  • 7heo
    1 year ago

    So, the keyboard is important, and it is one thing; but one must not overlook that on touch devices, software keyboards run on top of an operating system, and on top of hardware, too.

    Actual keyboards made of physical keys/domes/switches, embedded controllers, and USB chips/cables, even if they seemingly present a similar attack surface (after all, they are also connected to hardware running an operating system), are in practice harder to compromise.

    Yes, in both cases the operating system has access to the information (be it via the USB subsystem, the touch/input subsystem, etc.), and so does the hardware; but computer hardware is more heterogeneous, more modular and standardized, and “auditable” operating systems are much more common and readily deployable. Heck, it is entirely possible to run open hardware and open software (including the toolchain used to build said software) on a computer; but the same for mobile devices is very much uncharted territory.
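
    To make the first point concrete: on a desktop Linux box, any process that can open the evdev device node sees every keystroke from a physical keyboard, no matter how trustworthy the keyboard itself is. A sketch (the device path, and the 24-byte struct input_event layout of 64-bit Linux, are my assumptions here):

    ```kotlin
    import java.io.DataInputStream
    import java.io.FileInputStream
    import java.nio.ByteBuffer
    import java.nio.ByteOrder

    // Reads raw key events from a Linux evdev node: anything with the
    // right permissions (root, or the 'input' group) sees every
    // keystroke from a physical keyboard, below the display server
    // and below every application.
    fun main() {
        val events = DataInputStream(FileInputStream("/dev/input/event3"))
        val buf = ByteArray(24) // sizeof(struct input_event) on 64-bit
        while (true) {
            events.readFully(buf) // blocks until the next event
            val b = ByteBuffer.wrap(buf).order(ByteOrder.LITTLE_ENDIAN)
            b.position(16) // skip the 16-byte kernel timestamp
            val type = b.short.toInt()
            val code = b.short.toInt()
            val value = b.int
            // type 1 == EV_KEY, value 1 == key press: a complete
            // keylog without touching the keyboard hardware at all.
            if (type == 1 && value == 1) println("key code $code down")
        }
    }
    ```

    The difference is not that the data is inaccessible, but that on the computer you can choose, and audit, the operating system that has this access.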

    The result is that, even with our best efforts, a mobile device is at the very best a black box with an unlocked bootloader, a community-provided recovery and operating system, often downloaded from random file hosting services, and built with toolchains of variable quality and several proprietary components.

    This is not ideal, and it is the best-case scenario. In many cases, people run the stock recovery/operating system, and simply sideload software onto a device they do not even have root access to.

    The takeaway here is that, while nothing is perfect and there is always an attack surface, using components an attacker cannot predict (precisely because they are standardized and modular, and can be mixed and matched), running software an attacker cannot predict (dozens of different operating system families, each with their own versions and quirks), while also having a community of technologists auditing the code (even if it is only a handful of developers, there is much less risk of them all organizing towards a harmful goal than with a team of employees from a company ultimately led by a single person), provides a significantly different environment in practice.
    And unless you’re an embedded engineer/genius able to design bug-free (g’luck) PCBs from open-hardware ICs (so you can have them made by your vendor of choice) and simple components, and then use software that you, or people you trust, have audited, you have to recognize that you can only do “less bad”.

    “Less bad” being a dev-friendly device (old Google Pixels come to mind, not sure if they’re still like that), installing LineageOS, /e/, or another alternative system on it, and using software from F-Droid (like AnySoftKeyboard - the one I am actually using to type this).
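
    And since alternative images do often come from “random file hosting services”, the least one can do is pin the download to the checksum the project publishes (LineageOS, for instance, lists SHA-256 sums next to its builds). A generic sketch - the file name and digest below are placeholders:

    ```kotlin
    import java.io.File
    import java.security.MessageDigest

    // Generic integrity check for anything you sideload: compare the
    // file's SHA-256 against the digest published by the project.
    fun sha256Hex(path: String): String =
        MessageDigest.getInstance("SHA-256")
            .digest(File(path).readBytes()) // fine for image-sized files
            .joinToString("") { "%02x".format(it) }

    fun main() {
        val expected = "paste-the-published-digest-here" // placeholder
        val actual = sha256Hex("lineage-recovery.img")   // placeholder
        check(actual == expected) { "checksum mismatch - do not flash!" }
        println("checksum OK")
    }
    ```

    Note that this only ties the file to the published digest, it does not authenticate the publisher; for that you still rely on the project’s signing keys.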

    That, or you decide to trust the reputable (YMMV) company of your choice with your digital life, identity, and data. Many pros I know in infosec go for Apple, and at least for professional use cases, with companies in the US or in Western Europe, that arguably makes sense.