Think about what you do to understand speech. You listen, you recognize individual words, and you combine those words in your head to grasp the meaning of each sentence. Do you always understand everything you hear? No, sometimes there is too much noise, or maybe you were daydreaming and didn't quite catch what the speaker said. The same goes for sign languages, except that you pay attention with your eyes, keeping them focused on the signer.
To teach machines sign language, they first need to be able to see: to track human movement, track the hands, and track facial expressions. The machine must also tell where one sign ends and the next begins, just like hearing separate words in a stream of speech. Once it has identified the individual signs, it has to label each one with its meaning. There is much, much more to machine understanding of sign languages. And no, it does not always work 🙂
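The steps above — watching a stream of frames, splitting it into separate signs, then labeling each sign — can be sketched in a few lines of Python. This is a toy illustration, not a real recognizer: the single `motion` score per frame, the pause threshold, and the stand-in `label_sign` classifier are all hypothetical simplifications of what a trained model would actually do.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    """One video frame, reduced to a single hypothetical feature:
    how much the hands moved since the previous frame."""
    motion: float

def segment_signs(frames: List[Frame], pause_threshold: float = 0.1) -> List[List[Frame]]:
    """Split the frame stream into candidate signs, cutting at low-motion
    pauses (the rough analogue of hearing gaps between spoken words)."""
    signs: List[List[Frame]] = []
    current: List[Frame] = []
    for f in frames:
        if f.motion < pause_threshold:
            if current:           # a pause ends the sign in progress
                signs.append(current)
                current = []
        else:
            current.append(f)
    if current:                   # stream ended mid-sign
        signs.append(current)
    return signs

def label_sign(sign: List[Frame]) -> str:
    """Stand-in classifier: a real system would run a trained model here."""
    return "LONG_SIGN" if len(sign) > 2 else "SHORT_SIGN"

# Seven frames with two low-motion pauses -> two candidate signs.
stream = [Frame(0.9), Frame(0.8), Frame(0.7), Frame(0.05),
          Frame(0.6), Frame(0.5), Frame(0.02)]
labels = [label_sign(s) for s in segment_signs(stream)]
print(labels)  # → ['LONG_SIGN', 'SHORT_SIGN']
```

Even this toy version shows why the problem is hard: a single threshold misfires whenever a signer pauses mid-sign or transitions smoothly between signs, which is exactly where real systems break too.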