Interaction with the external world requires the ability to perceive dynamic changes in complex sensory input and to react promptly. Here we show that the perception of dynamic stimuli in the visual and tactile sensory modalities shares fundamental psychophysical aspects that can be explained by similar computational models. In vision, optic flow provides information on the relative motion between the observer and the perceived scene. For instance, radial patterns of optic flow are used to estimate the time to contact with an approaching object4. Similarly, in the tactile modality, radial patterns of stimulation provide information on the softness of probed objects3. Optic flow has also been invoked to explain several visual illusions, including the well-known "barber-pole" effect10. Here we introduce a computational model of tactile flow that is intimately related to existing models of its visual counterpart. The model accounts for psychophysical aspects of dynamic tactile perception and predicts illusory phenomena in the tactile domain analogous to the barber-pole effect. When subjects touched translating pads bearing differently oriented gratings, they perceived a direction of motion that was significantly biased towards the orientation of the gratings. These findings indicate that visual and tactile flow share similarities at the psychophysical and computational levels and may serve similar perceptual goals. These results bear on the engineering of better haptic and multimodal interfaces for human-computer interaction.
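The grating-induced bias described above is the tactile analogue of the aperture problem underlying the barber-pole effect: a one-dimensional grating locally reveals only the motion component perpendicular to its stripes. The following minimal sketch (not the authors' model; function and variable names are ours) computes this normal-flow prediction for a given true translation direction and grating orientation.

```python
import math

def normal_flow_direction(true_dir_deg, grating_orient_deg):
    """Predict the perceived direction under the aperture problem.

    A 1-D grating locally reveals only the motion component
    perpendicular to its stripes, so the predicted perceived
    direction lies along the grating normal, with its sign chosen
    to agree with the true translation.
    """
    normal_deg = grating_orient_deg + 90.0  # direction perpendicular to stripes
    # Component of the unit true-motion vector along the chosen normal
    comp = math.cos(math.radians(true_dir_deg - normal_deg))
    if comp < 0:
        normal_deg += 180.0  # flip the normal to match the motion sign
    return normal_deg % 360.0

# A pad translating straight up (90 deg) under a grating tilted at
# 45 deg is predicted to be perceived as moving along the normal,
# i.e. at 135 deg rather than 90 deg.
print(normal_flow_direction(90.0, 45.0))
```

Under this sketch, the perceived direction is pulled away from the true translation toward the grating normal, qualitatively matching the reported bias; the full model in the paper would additionally weigh cues such as stripe terminations and contact-area boundaries.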