Yup - they struggle really hard with referential ambiguity that relies on world knowledge for disambiguation. We know that "it" = "the bed" in this sentence because "it is too big" has to be the reason why "the bed does not fit in the tent", and the only reading that doesn't conflict with our world knowledge is that the bed is big and the tent is small. And we can even flip "it" to refer to the other object (the tent) simply by changing the adjective:
The bed does not fit in the tent because it is too small.
Without any sort of grammatical change.
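If you want to poke at this systematically, here's a minimal sketch of how you could turn that adjective-swap trick into a probe. `ask_model` is a hypothetical stand-in for whatever chatbot interface you're testing, not a real API:

```python
# Minimal Winograd-style probe: the same sentence frame with the adjective
# swapped, where a human reader flips the referent of "it" accordingly.
# ask_model(prompt) -> str is assumed to be provided by you (hypothetical).

SCHEMA_PAIRS = [
    # (sentence, question, expected referent)
    ("The bed does not fit in the tent because it is too big.",
     "What is too big?", "the bed"),
    ("The bed does not fit in the tent because it is too small.",
     "What is too small?", "the tent"),
]

def run_probe(ask_model):
    """Ask the model to resolve 'it' in each sentence and count correct answers."""
    correct = 0
    for sentence, question, expected in SCHEMA_PAIRS:
        answer = ask_model(f"{sentence} {question} Answer with one noun phrase.")
        if expected in answer.lower():
            correct += 1
    return correct, len(SCHEMA_PAIRS)
```

A human gets both right trivially; a model that's just pattern-matching will often give the same referent for both versions.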
Donkey sentences are also hard for them, like:
Everyone who owns a donkey beats it.
If you’re human, this sentence implies that 1) there are multiple donkeys, owned by different people; and 2) each of those people beats their own donkey. But machines have a really hard time getting those two things right.
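For what it's worth, here's roughly why that reading is awkward to compute, in the usual first-order sketch (the predicate names person/donkey/owns/beats are just my labels):

```latex
% Intended reading: each owner beats each donkey that they own,
% so the indefinite "a donkey" ends up with universal force
\forall x\,\forall y\,\bigl[(\mathit{person}(x) \land \mathit{donkey}(y) \land \mathit{owns}(x,y)) \rightarrow \mathit{beats}(x,y)\bigr]

% Naive reading, with "a donkey" as an existential inside the if-clause:
% the existential closes off before "it", leaving the pronoun with nothing to bind to
\forall x\,\bigl[\exists y\,(\mathit{donkey}(y) \land \mathit{owns}(x,y)) \rightarrow \mathit{beats}(x,y)\bigr] \quad \text{(here } y \text{ is unbound in } \mathit{beats})
```

Humans jump straight to the first reading; the second, more "compositional" one is what you get if you translate the sentence mechanically.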
And you can exploit a lot of those quirks of real-life language to make the bots go nuts. A few of them might slip through, but these checks are low-cost for the humans, so you can pile them up.