My PhD Project - Predictive Action Perception in Autism
This project aims to explore the social challenges experienced in autism, which may arise from specific neurocognitive mechanisms. Drawing on predictive coding theory, we hypothesise that autistic people use distinct predictive processes. Our study uses a 'take it/leave it' paradigm to assess perceptual biases in aligning an actor's intentions with their actions, measured via touchscreen responses. We are also correlating these perceptual differences with scores on measures of autistic traits. Differentiating such perceptual processes could serve as a non-invasive screening tool.
Interestingly, social difficulties in autism are less pronounced when interacting with others who share the same condition. Additionally, atypical fine motor movements are a characteristic feature of autism, which may affect how behaviour is perceived and interpreted by others. To investigate this, we are using motion-capture technology to record the reaching and withdrawing actions of autistic people along the x/y/z axes. By incorporating these recordings into our 'take it/leave it' perception experiments, we anticipate that autistic people will find it easier to predict each other's actions, but that prediction will be less effective in mixed-neurotype groups (autistic and non-autistic). As such, this research also sheds light on the mechanisms influencing social discrimination and affiliation within autism. Our project goes against the grain of current medical perspectives, which view social difficulties as an individual deficit requiring intervention or treatment. We instead take the view that these difficulties emerge from the environments in which we interact, including the people we interact with.
“Grant me the dignity of meeting me on my own terms. Recognize that we are just as alien to each other, and that my ways of being are not merely damaged versions of yours” (Sinclair, 2010).
Codamotion motion capture equipment to record movements.
Reaching movements from the ‘take it/leave it’ paradigm.
Motion capture of hand movements example - Valtin et al. (2017).