We now know that natural signed languages such as American Sign Language, French Sign Language, and British Sign Language are fully independent languages. But natural signed languages are only one way of conveying language in the visual/gestural modality: signed languages also have mechanisms for representing the material of oral languages, and fingerspelling is one example of such a representational system. This book examines fingerspelling from a phonetic perspective. It reports several studies of the kinematics of the fingerspelling articulators, and from these detailed analyses of articulator timings and velocities it concludes that fingerspelling, like speech, may be explained in terms of coordinative structures and task dynamics. The thrust of the book is to explore the notion that signed and spoken languages can be compared not only as abstract linguistic systems but also, at the physical level, as dynamically structured articulations. An implication of these studies is that a common basis in gesture can be found for the production, perception, and neural organization of signed and spoken languages.