In everyday life we encounter a plethora of novel experiences in different contexts that require prompt decisions for successful actions and social interactions. Despite the seeming ease with which we perform these interactions, extracting the key information from the highly complex input of the natural world and deciding how to interpret it is a computationally demanding task for the visual system. Accumulating evidence suggests that the brain solves this problem by combining sensory information with previous knowledge about the environment. Here, we review the neural mechanisms that mediate experience-based plasticity and shape perceptual decisions. We propose that learning plays an important role in the adaptive optimization of visual functions that translate sensory experiences into decisions by shaping neural representations across cortical circuits in the primate brain.