The question is the extent to which the mind can be used to control a computer. Neural signals are already used to control artificial limbs, for instance: if you "think" walking, you generate those control signals whether or not you physically walk.
I gather that this is the case for speech, too. It would therefore seem feasible (if not currently practicable) to "think" words onto a computer screen by monitoring the neural signals that control speech; the software would perform a statistical analysis similar to that used in conventional speech recognition, deriving the words while having no sense of their meaning.
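To make the idea of statistical decoding concrete, here is a minimal, purely illustrative sketch. Everything in it is an assumption: the data is synthetic, the vocabulary, feature dimension, and noise level are invented, and the nearest-centroid classifier stands in for the far richer models (HMMs, neural networks) that real speech-decoding systems use. It only shows the shape of the approach: learn a statistical signature per word from labeled trials, then pick the best-matching word for a new trial, with no model of meaning anywhere.

```python
# Illustrative sketch of statistical neural decoding (synthetic data).
# A hypothetical "imagined word" produces a noisy feature vector near a
# word-specific mean pattern; decoding is nearest-centroid matching.
import numpy as np

rng = np.random.default_rng(0)
words = ["yes", "no", "stop"]   # toy vocabulary (assumption)
dim = 16                        # hypothetical neural feature dimension

# Each word gets a fixed underlying mean activity pattern.
true_means = {w: rng.normal(size=dim) for w in words}

def simulate_trial(word):
    """Synthetic neural features for one imagined word: mean + noise."""
    return true_means[word] + 0.3 * rng.normal(size=dim)

# "Training": estimate each word's mean pattern from labeled trials.
learned = {w: np.mean([simulate_trial(w) for _ in range(50)], axis=0)
           for w in words}

def decode(features):
    """Return the word whose learned pattern is closest to the input.
    Purely statistical: the decoder has no notion of word meaning."""
    return min(words, key=lambda w: np.linalg.norm(features - learned[w]))

for w in words:
    print(w, "->", decode(simulate_trial(w)))
```

The design point the sketch illustrates is the one in the paragraph above: the system maps signal statistics to word labels and nothing more, exactly as conventional speech recognizers map audio features to words without understanding them.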