• Lvxferre · 1 year ago

    Note: long comment because I like Linguistics and spelling matters (regardless of language).

    A language is not its writing system. And as overly complex, convoluted and erratic as English’s use of the Latin alphabet might be, it’s by no means pictographic.

    HN comment worth mentioning:

    I often say you’ll have a much easier time accepting English if you consider it to be more like Mandarin; a collection of images.
    And what that user often says is nonsense.

    Neither Han characters as used by Mandarin, nor the Latin alphabet as used by English, is a “collection of images”. In both cases you have a phonetic component that is outdated, referring to an older state of the language, but that still somewhat works; and it’s foolish to pretend that it isn’t there.

    Here’s an example. If I show you the word “wug”, even if you’ve never seen it in your life, you’ll probably render it phonemically as /wʌɡ/. You won’t associate it with a /tʃ/ or a /p/, or be completely clueless about how to pronounce it, as you would be with a completely pictographic system.
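    To make that concrete, here’s a toy sketch (not a real grapheme-to-phoneme system; the letter-to-phoneme table is an invented simplification, not a description of English orthography) of why an alphabetic spelling like “wug” is decodable at all:

```python
# Toy sketch: a tiny letter-to-phoneme table. The mappings below are
# simplified illustrations chosen just for the word "wug".
LETTER_TO_PHONEME = {
    "w": "w",
    "u": "ʌ",   # "short u", as in "cup"
    "g": "ɡ",   # hard g, as in "gap"
}

def naive_phonemic_guess(word: str) -> str:
    """Decode a word letter by letter, the way a reader of an
    alphabetic script can guess a never-before-seen word."""
    return "/" + "".join(LETTER_TO_PHONEME[c] for c in word) + "/"

print(naive_phonemic_guess("wug"))  # /wʌɡ/
```

    A purely pictographic system would offer no letter-level cue like this: either you’ve memorized the sign, or you’re stuck.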

    • jet@hackertalks.com · 1 year ago

      So how do we get the written language to keep in step with the spoken language?

      Semantic versioning of the language? Do you speak UK English 2020?

      • Lvxferre · 1 year ago

        Sorry for the rather long reply.

        One of the alternatives is to simply do nothing - it’s not that broken, and the cost of switching is going to bloody hurt.

        But if we do something, IMO the starting point would be the creation of a multi-governmental, cooperative entity in charge of spelling, backed by the governments of at least a handful of English-speaking countries. Given the political landscape, it would probably look more similar to Portuguese’s CPLP than to Spanish’s RAE. That entity should:

        1. Create and update transcriptions for the English lexicon. The transcriptions should be diaphonemic, to handle dialectal differences.
        2. Use those transcriptions to find and address the current issues with the English spelling.
        3. Coin alternative spellings for words showing those issues.
        4. Suggest and encourage the usage of those new spellings to the member governments.

        When creating the diaphonemic transcriptions, there’ll be some disputes on what should be included as an “English dialect” or excluded as “another language”. For example, in my opinion it’s better for both sides if Scots is handled as its own language, while Scottish English should still be included. Same deal with acrolectal Jamaican English vs. basilectal Jamaican Patwa.

        Steps #2 and #3 are also somewhat arbitrary, since there are multiple possible principles for an orthography. I think that the ones that would work best are:

        • morphemic - one spelling per morpheme
        • graphemic - one reading per grapheme
        • diaphonemic - one spelling per diaphoneme
        • status quo - no unnecessary changes
        • etymological - spelled according to the source (for borrowings).

        When two or more principles clash, you should pick the higher-ranked one. For example, it’s fine to bend the graphemic principle to keep morphemes consistent, but it isn’t fine to do so for etymological reasons.
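        As a sketch of that precedence rule (the principle names come from the list above; the function and example spellings are invented purely for illustration):

```python
# Hypothetical sketch of the precedence rule: when principles suggest
# different spellings, pick the one backed by the highest-ranked
# principle. The ranking order follows the list in the text.
PRINCIPLE_RANK = ["morphemic", "graphemic", "diaphonemic",
                  "status_quo", "etymological"]  # highest first

def resolve(proposals: dict[str, str]) -> str:
    """proposals maps a principle name to the spelling it suggests."""
    for principle in PRINCIPLE_RANK:
        if principle in proposals:
            return proposals[principle]
    raise ValueError("no known principle proposed a spelling")

# e.g. graphemic regularity vs. keeping an etymological spelling:
print(resolve({"etymological": "debt", "graphemic": "det"}))  # det
```

        The point of the strict ordering is exactly what the paragraph above says: etymology never gets to override graphemic regularity, and graphemic regularity never gets to override morpheme consistency.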

        Do you speak UK English 2020?

        English is my L3, mind you, but I do use British Received Pronunciation as a basis.

        Still, this sort of subject applies to languages in general. And a lot of what I’m saying here is based on how CPLP, ABL and ACL do it for Portuguese (my L1) and how La Crusca does it for Italian (my L2) - especially their mistakes (such as Portuguese spelling being mostly decided by writers instead of linguists, and La Crusca’s weird obsession with linguistic purism).

        • jet@hackertalks.com · 1 year ago

          That committee would have a hard job ahead of it. I think that even if we used the IPA to phonetically write down exactly what people are saying, humans would still get lazy and start modifying the phonetic writing to make it easier, and the language would move yet again. It’s a moving target in every dimension.

          • Lvxferre · 1 year ago

            The transcription is just a planning tool for the committee itself, to guide its decisions. It is not supposed to be used directly by the speakers; IMO the orthography itself should be kept Latin-26, at most using one or two diacritics or new letters if necessary (see note*).

            And the transcription should not be phonetic, i.e. recording the “raw” sounds. If dealing with a single dialect, it would be phonemic, dealing with the abstract units of speech (aka phonemes), since they’re the ones that you need to distinguish in the first place.

            And, since you’re dealing with multiple dialects, you need to reach a compromise between them. Then instead of a phonemic transcription, you go a step further, with a diaphonemic transcription. (Wikipedia has a rather good article on diaphonemes, but in a nutshell: they’re a compromise between the phonemes of multiple dialects.)
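            A rough way to picture a diaphoneme as a data structure (the symbols, dialect labels and realizations below are simplified examples I made up for illustration, not a real analysis):

```python
# Illustrative sketch only: a diaphoneme as a compromise unit that maps
# onto different phonemes (or nothing) in different dialects.
DIAPHONEMES = {
    # diaphoneme: {dialect: local realization}
    "/aʊ/": {"RP": "aʊ", "Canadian": "ʌʊ"},   # e.g. "out" with raising
    "/r/":  {"RP": None, "GenAm": "ɹ"},       # non-rhotic vs. rhotic coda r
}

def realization(diaphoneme: str, dialect: str):
    """Look up how one dialect realizes a diaphoneme (None = dropped)."""
    return DIAPHONEMES[diaphoneme].get(dialect)

print(realization("/r/", "GenAm"))  # ɹ
```

            The committee would reason over units like these, so one proposed spelling can stay readable across dialects that realize the unit differently.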

            *note: if I were the one in charge (Bad Idea®), I’d probably consider the diaeresis; English has a lot of vowel phonemes but usually avoids vowel sequences, so it’s less intrusive to mark “this looks like a digraph, but it isn’t!” than to mark vowel quality itself.

      • cypherix93@lemmy.world · 1 year ago

        I only speak EnglishNext. And I use autism as a transpiler to support speakers of older versions.