class: left, middle, inverse, title-slide

.title[
# Pre-existing categorization diminishes attractive and repulsive temporal history effects on perception
]
.author[
### Eline Van Geert<sup>1</sup>, & Johan Wagemans<sup>1</sup>
]
.institute[
### <sup>1</sup>KU Leuven, Belgium
]
.date[
### ICPS Brussels, Symposium Categorization across Fields: Insights from Learning, Cognition, and Perception, March 9<sup>th</sup>
, 2023 ]

---
layout: true
.footer_right[] .footer_middle[<a href="https://tinyurl.com/ICPSBRU23" target="_blank">slides at tinyurl.com/ICPSBRU23</a>]

---
class: top, normal, center
### Perception depends on...
.footnote[<a href = "https://doi.org/10.3389/fnhum.2015.00594"> Snyder et al. (2015)</a>; <a href = "https://doi.org/10.1177/20416695221109300"> Van Geert, Moors, Haaf, & Wagemans (2022)</a></br></br>]
<div style="width:48%; margin-left:1%; margin-right:1%; margin-top: -30px; float: left; font-size:30px;"> <center>... what was just <b>seen</b></center> </div>
<center><img src = "figs/hyst.png" style = "width: 100%; margin-bottom:-2%;"> </center>

???
Hello everyone, today I will present my work on the relation between categorization and immediate temporal context effects on perception.

From previous research we know that perception depends not only on the stimulus shown at a given point in time, but also on what was shown and perceived just before the current stimulus was presented. Earlier research found two different temporal context effects.

First, when we consider the effect of the previous percept on the current percept, we speak of hysteresis. This refers to an attractive effect of the previous percept, or in other words, a tendency to organize the visual input in a similar way as preceding context stimuli. When we perceive an ambiguous stimulus at time 1 as a car, the chance of perceiving a car again at the second timepoint increases.

---
class: top, normal, center
### Perception depends on...
.footnote[<a href = "https://doi.org/10.3389/fnhum.2015.00594"> Snyder et al. (2015)</a>; <a href = "https://doi.org/10.1177/20416695221109300"> Van Geert, Moors, Haaf, & Wagemans (2022)</a></br></br>]
<div style="width:48%; margin-left:1%; margin-right:1%; margin-top: -30px; float: left; font-size:30px;"> <center>...
what was just <b>seen</b></center> </div>
<div style="width:48%; margin-left:1%; margin-right:1%; margin-top: -30px; float: right; font-size:30px;"> <center>... what was just <b>shown</b></center> </div>
<center><img src = "figs/hystadapt.png" style = "width: 100%; margin-bottom:-2%;"> </center>

???
Second, when we consider the effect of the previous stimulus on the current percept, we speak of "adaptation". Adaptation refers to a repulsive effect of the previous stimulus evidence, or in other words, a tendency that pushes the current percept away from the organization that was most dominant in the previous stimulus. When we are presented with a very clear example of a car at time 1 and perceive it as a car, the chance of perceiving a car again at the second timepoint decreases compared to when a less clear example of a car was presented at time 1.

---
class: top, normal
### Data collection
.pull-left[
- 283 first-year psychology students (88% female)
- 3 recognizable and 3 non-recognizable morph series

<img src="figs/line_cartotortoise.png" width="95%" /><img src="figs/line_penguintochild.png" width="95%" /><img src="figs/line_watchtoseahorse.png" width="95%" />
<img src="figs/line_set1b.png" width="95%" /><img src="figs/line_set3b.png" width="95%" /><img src="figs/line_set4a.png" width="95%" />
]

???
In the study that I will present to you today, 283 first-year psychology students participated in a categorization task, a discrimination task, and a similarity judgment task.
Each participant completed each task for a different recognizable and non-recognizable morph series.

---
class: top, normal
### Data collection
.pull-left[
- 283 first-year psychology students (88% female)
- 3 recognizable and 3 non-recognizable morph series

<img src="figs/line_cartotortoise.png" width="95%" /><img src="figs/line_penguintochild.png" width="95%" /><img src="figs/line_watchtoseahorse.png" width="95%" />
<img src="figs/line_set1b.png" width="95%" /><img src="figs/line_set3b.png" width="95%" /><img src="figs/line_set4a.png" width="95%" />
]
.pull-right[
<img src="figs/cattask.png" width="100%" />
]

???
In this presentation I will focus on the results for the categorization task (for which you see the trial structure presented here). Before starting the categorization task, participants were presented with 4 examples of category A and 4 examples of category B; that was the only information they received. They did not receive any form of feedback during the task.

---
class: top, normal
### Data collection
.pull-left[
- 283 first-year psychology students (88% female)
- 3 recognizable and 3 non-recognizable morph series

<img src="figs/line_cartotortoise.png" width="95%" /><img src="figs/line_penguintochild.png" width="95%" /><img src="figs/line_watchtoseahorse.png" width="95%" />
<img src="figs/line_set1b.png" width="95%" /><img src="figs/line_set3b.png" width="95%" /><img src="figs/line_set4a.png" width="95%" />
]
.pull-right[
<img src="figs/cattask.png" width="100%" />
</br>
<img src="figs/naminglevels.png" width="100%" />
]

???
We will indicate the stimulus evidence or morph level with a number from -5 to 5, with -5 being the stimulus that provides the most evidence for category A, 5 being the stimulus that provides the most evidence for category B, and 0 being the stimulus exactly in between categories A and B.
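To make the two temporal context effects concrete, here is a minimal sketch of how they can be written as extra terms in a logistic model of the categorization response. This is an illustration only, not the analysis used in the study; the coefficient names and values (`b_morph`, `b_hyst`, `b_adapt`) are made up.

```python
import math

def p_category_b(morph, prev_percept, prev_evidence,
                 b_morph=0.8, b_hyst=0.5, b_adapt=0.3):
    """Illustrative probability of categorizing the current stimulus as B.

    morph         : current morph level, -5 (clearest A) to 5 (clearest B)
    prev_percept  : +1 if the previous stimulus was perceived as B, -1 if as A
    prev_evidence : morph level of the previous stimulus, -5 to 5
    """
    logit = (b_morph * morph            # current stimulus evidence
             + b_hyst * prev_percept    # attractive hysteresis: pulls toward the previous percept
             - b_adapt * prev_evidence) # repulsive adaptation: pushes away from the previous evidence
    return 1.0 / (1.0 + math.exp(-logit))
```

With the signs chosen this way, a previous percept of B raises the probability of reporting B again (attraction), while stronger previous stimulus evidence for B lowers it (repulsion), matching the direction of the two effects described above.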
---
class: top, normal
### Categorization responses
.footnote[<a href = "https://doi.org/10.31234/osf.io/6x75c"> Van Geert & Wagemans (2022)</a></br></br>]
<!-- <center> -->
<!-- <div id="wrap3"> -->
<!-- <iframe id="scaled-frame3" src="interactivefigs/test_interactivefig_catresp.html"></iframe> -->
<!-- </div> -->
<!-- </center> -->
<img src="figs/catresp_plot.png" width="55%" style="display: block; margin: auto;" />

???
First I show you the overall categorization responses.
On the x axis: the morph level.
On the y axis: the probability of perceiving the current stimulus as part of category B.
The top row of graphs shows the results for the recognizable morph series; the bottom row shows the results for the non-recognizable morph series.
We see stronger categorization for the recognizable morph series than for the non-recognizable morph series.
What about the temporal context effects (the effects of the previous percept and the previous stimulus)?

---
class: top, normal
### Attractive context effect previously perceived category
<!-- <div id="wrap3"> -->
<!-- <iframe id="scaled-frame3" src="interactivefigs/test_interactivefig_cathyst.html"></iframe> -->
<!-- </div> -->
<img src="figs/cathyst_plot.png" width="55%" style="display: block; margin: auto;" />

???
For the non-recognizable morph series, we see a clear, overall attractive effect of the previously perceived category: a clear hysteresis effect. The chance of perceiving the current stimulus as category B is higher when you have just perceived the previous stimulus as B than when you have perceived it as A. This attractive effect of the previously perceived category is largely absent in the recognizable morph series.

---
class: top, normal
### Attractive context effect previously perceived category
<img src="figs/cathyst_typeplot.png" width="75%" style="display: block; margin: auto;" />

???
Here we see the same results, but averaged across the different recognizable and non-recognizable morph series.
---
class: top, normal
### Attractive context effect diminishes over time
<center> <img src="figs/cathyst_partplot.png" width="55%" /> </center>

???
When we split the data for the different blocks of the experiment, we see that the attractive effect of the previously perceived category present in the data for the non-recognizable morph series diminishes over time. This result is in line with the idea that stronger categorization diminishes the effect of the immediate context on perception.

---
class: top, normal
### Repulsive context effect previous category evidence
<!-- <center> -->
<!-- ```{r, out.width="45%"} -->
<!-- knitr::include_graphics(here("05_dissemination", "figs", "catadapt_plot1.png")) -->
<!-- ``` -->
<!-- </center> -->
<!-- --- -->
<!-- class: top, normal -->
<!-- ### Repulsive context effect previous category evidence -->
<center> <img src="figs/catadapt_plot_nonrec.png" width="85%" /> </center>

???
Let's first have a look at the results for the non-recognizable morph series.
On the x axis: the current morph level.
On the y axis: the probability of perceiving the current stimulus as category B.
In the panels: the previous percept, i.e., the previously perceived category.
In the legend (colors): the previous stimulus/category evidence.
The effect of the previous stimulus evidence on the current percept depends on the congruency between the previous percept and the current stimulus. When the previous stimulus was perceived as part of category A, and the current stimulus is in favor of category B, you see a typical repulsive adaptation effect: the chance of perceiving the current stimulus as B is then higher when the previous stimulus was in favor of A than when it was in favor of B. As we can see in the rectangle in the left plot, the light lines lie higher than the darker lines.
Similarly, in the rectangle in the right plot, when the previous stimulus was perceived as B and the current stimulus is in favor of A, the chance of perceiving the current stimulus as A is higher when the previous stimulus was in favor of B. However, when the current stimulus is in favor of the same category as the previous percept, we do not see much evidence for such a repulsive adaptation effect.
This effect is clear in the non-recognizable morph series but absent in the recognizable morph series, where the effect of the previous stimulus is not evident. We could thus conclude that these context effects of previous category evidence are present only in the non-recognizable morph series; in other words, one could say that stronger categorization diminishes the use of immediate context information in forming the current percept.
[Because there are fewer data points per condition per series, the results here are combined across the different morph series.]

---
class: top, normal
### Repulsive context effect previous category evidence
<center> <img src="figs/catadapt_plot.png" width="85%" /> </center>

???
This effect is clear in the non-recognizable morph series but absent in the recognizable morph series, where the effect of the previous stimulus is not evident. We could thus conclude that these context effects of previous category evidence are present only in the non-recognizable morph series; in other words, one could say that stronger categorization diminishes the use of immediate context information in forming the current percept.
[Because there are fewer data points per condition per series, the results here are combined across the different morph series.]

---
class: top, normal
### No indication for temporal change in repulsive context effect
<center> <img src="figs/catadapt_partplot.png" width="85%" /> </center>

???
When we plot the results for the different blocks, we do not see any indication of a temporal change in the size of the repulsive effect of the previous stimulus evidence.

---
class: top, normal
### Conclusion
**For non-recognizable morph series:**

- clear **attractive effect** of **previously perceived category** (i.e., hysteresis)
- when current stimulus incongruent with previous percept, **repulsive effect** of **previous category evidence** (i.e., adaptation)

</br></br>

--

**No context effects present for recognizable morph series** (i.e., when strong categorization is present)

???
We found a clear attractive effect of the previously perceived category: a hysteresis effect. When the current stimulus was from a different category than the previous percept, there was a repulsive effect of the previous stimulus evidence.
It is important to distinguish between the effects of the previous stimulus and the previous percept: both effects need to be taken into account to understand the pattern of results in this study on shape categorization.
We also replicated these findings for two of the non-recognizable morph series in a separate study.

---
class: top, normal
### Takeaway
</br>
<div style="font-size:26px;"> <b>→ Strong pre-existing categorization diminishes attractive and repulsive temporal history effects on perception</b> </div>

???
As a key takeaway, this study provides evidence for the idea that strong pre-existing categorization can diminish attractive and repulsive temporal history effects on perception. That is, when a clear categorization is present, we use this categorization to clarify our percept; when no clear categorization is present, we make more use of the immediate temporal context to clarify our percept.
---
class: top, normal
<center>
<img src = "img/EVG_OCTA.svg" style = "width: 50%; margin-bottom: -10px;">
<div style = "margin-top: -30px;"><a href = "http://evg.ulyssis.be/evg/"><b>Eline Van Geert</b></a></br><a href="mailto:eline.vangeert@kuleuven.be" style="display:inline-block; float:middle;"><span style="font-size:5;">eline.vangeert@kuleuven.be</span></a></div></br>
<div style = "float:middle; margin-top: -20px; ">
<a style = "font-size: 16px;" href="http://gestaltrevision.be/en/" target="_blank" rel="noopener"> <span>GestaltReVision research group</br>Laboratory of Experimental Psychology - KU Leuven</span> </a></br>
<a href="http://evg.ulyssis.be/evg/" style="display:inline-block; float:middle;padding:10px; padding-left:0px;">
<svg viewBox="0 0 496 512" style="position:relative;display:inline-block;top:.1em;fill:#119abb;height:2em;" xmlns="http://www.w3.org/2000/svg"> <path d="M336.5 160C322 70.7 287.8 8 248 8s-74 62.7-88.5 152h177zM152 256c0 22.2 1.2 43.5 3.3 64h185.3c2.1-20.5 3.3-41.8 3.3-64s-1.2-43.5-3.3-64H155.3c-2.1 20.5-3.3 41.8-3.3 64zm324.7-96c-28.6-67.9-86.5-120.4-158-141.6 24.4 33.8 41.2 84.7 50 141.6h108zM177.2 18.4C105.8 39.6 47.8 92.1 19.3 160h108c8.7-56.9 25.5-107.8 49.9-141.6zM487.4 192H372.7c2.1 21 3.3 42.5 3.3 64s-1.2 43-3.3 64h114.6c5.5-20.5 8.6-41.8 8.6-64s-3.1-43.5-8.5-64zM120 256c0-21.5 1.2-43 3.3-64H8.6C3.2 212.5 0 233.8 0 256s3.2 43.5 8.6 64h114.6c-2-21-3.2-42.5-3.2-64zm39.5 96c14.5 89.3 48.7 152 88.5 152s74-62.7 88.5-152h-177zm159.3 141.6c71.4-21.2 129.4-73.7 158-141.6h-108c-8.8 56.9-25.6 107.8-50 141.6zM19.3 352c28.6 67.9 86.5 120.4 158 141.6-24.4-33.8-41.2-84.7-50-141.6h-108z"></path></svg>
</a>
<a href="mailto:eline.vangeert@kuleuven.be" style="display:inline-block; float:middle;padding:10px;">
<svg viewBox="0 0 512 512" style="position:relative;display:inline-block;top:.1em;fill:#119abb;height:2em;" xmlns="http://www.w3.org/2000/svg"> <path d="M502.3 190.8c3.9-3.1 9.7-.2 9.7 4.7V400c0 26.5-21.5 48-48 48H48c-26.5
0-48-21.5-48-48V195.6c0-5 5.7-7.8 9.7-4.7 22.4 17.4 52.1 39.5 154.1 113.6 21.1 15.4 56.7 47.8 92.2 47.6 35.7.3 72-32.8 92.3-47.6 102-74.1 131.6-96.3 154-113.7zM256 320c23.2.4 56.6-29.2 73.4-41.4 132.7-96.3 142.8-104.7 173.4-128.7 5.8-4.5 9.2-11.5 9.2-18.9v-19c0-26.5-21.5-48-48-48H48C21.5 64 0 85.5 0 112v19c0 7.4 3.4 14.3 9.2 18.9 30.6 23.9 40.7 32.4 173.4 128.7 16.8 12.2 50.2 41.8 73.4 41.4z"></path></svg> </a> <a href="http://orcid.org/0000-0002-7848-5998" target="_blank" rel="noopener" style="display:inline-block; float:middle;padding:10px;"> <svg viewBox="0 0 512 512" style="position:relative;display:inline-block;top:.1em;fill:#119abb;height:2em;" xmlns="http://www.w3.org/2000/svg"> <g label="icon" id="layer6" groupmode="layer"> <path id="path2" d="m 336.6206,194.53756 c -7.12991,-3.32734 -13.8671,-5.55949 -20.25334,-6.61343 -6.36534,-1.09517 -16.57451,-1.61223 -30.71059,-1.61223 h -36.70409 v 152.74712 h 37.63425 c 14.6735,0 26.08126,-1.01267 34.22385,-3.01709 8.14259,-2.00442 14.92159,-4.52592 20.35674,-7.62608 5.43519,-3.07925 10.416,-6.8615 14.94192,-11.38742 14.4876,-14.71475 21.74129,-33.27334 21.74129,-55.7176 0,-22.05151 -7.44016,-40.05177 -22.34085,-53.98159 -5.49732,-5.16674 -11.82143,-9.44459 -18.88918,-12.79281 z M 255.99999,8.0000031 C 119.02153,8.0000031 8.0000034,119.04185 8.0000034,255.99998 8.0000034,392.95812 119.02153,504 255.99999,504 392.97849,504 504,392.95812 504,255.99998 504,119.04185 392.97849,8.0000031 255.99999,8.0000031 Z M 173.66372,365.51268 H 144.27546 V 160.1481 h 29.38826 z M 158.94954,138.69619 c -11.13935,0 -20.21208,-9.01056 -20.21208,-20.21208 0,-11.11841 9.05183,-20.191181 20.21208,-20.191181 11.18058,0 20.23244,9.051831 20.23244,20.191181 -0.0219,11.22184 -9.05186,20.21208 -20.23244,20.21208 z m 241.3866,163.59715 c -5.29051,12.54475 -12.83407,23.58066 -22.65053,33.08742 -9.98203,9.83734 -21.59659,17.19443 -34.84378,22.19616 -7.74983,3.01709 -14.83852,5.06335 -21.30725,6.11726 -6.4891,1.01267 -18.82759,1.50883 
-37.07593,1.50883 H 219.5033 V 160.1481 h 69.23318 c 27.96195,0 50.03378,4.1541 66.31951,12.54476 16.26485,8.36977 29.18144,20.72859 38.79164,36.97254 9.61013,16.26483 14.4254,34.01757 14.4254,53.19607 0.0227,13.76426 -2.66619,26.90802 -7.93576,39.43187 z" style="stroke-width:0.07717"></path> </g></svg> </a> <a href="https://twitter.com/eline__vg" target="_blank" rel="noopener" style="display:inline-block; float:middle;padding:10px;"> <svg viewBox="0 0 512 512" style="position:relative;display:inline-block;top:.1em;fill:#119abb;height:2em;" xmlns="http://www.w3.org/2000/svg"> <path d="M459.37 151.716c.325 4.548.325 9.097.325 13.645 0 138.72-105.583 298.558-298.558 298.558-59.452 0-114.68-17.219-161.137-47.106 8.447.974 16.568 1.299 25.34 1.299 49.055 0 94.213-16.568 130.274-44.832-46.132-.975-84.792-31.188-98.112-72.772 6.498.974 12.995 1.624 19.818 1.624 9.421 0 18.843-1.3 27.614-3.573-48.081-9.747-84.143-51.98-84.143-102.985v-1.299c13.969 7.797 30.214 12.67 47.431 13.319-28.264-18.843-46.781-51.005-46.781-87.391 0-19.492 5.197-37.36 14.294-52.954 51.655 63.675 129.3 105.258 216.365 109.807-1.624-7.797-2.599-15.918-2.599-24.04 0-57.828 46.782-104.934 104.934-104.934 30.213 0 57.502 12.67 76.67 33.137 23.715-4.548 46.456-13.32 66.599-25.34-7.798 24.366-24.366 44.833-46.132 57.827 21.117-2.273 41.584-8.122 60.426-16.243-14.292 20.791-32.161 39.308-52.628 54.253z"></path></svg> </a> <a href="https://github.com/ElineVG" target="_blank" rel="noopener" style="display:inline-block; float:middle;padding:10px;"> <svg viewBox="0 0 496 512" style="position:relative;display:inline-block;top:.1em;fill:#119abb;height:2em;" xmlns="http://www.w3.org/2000/svg"> <path d="M165.9 397.4c0 2-2.3 3.6-5.2 3.6-3.3.3-5.6-1.3-5.6-3.6 0-2 2.3-3.6 5.2-3.6 3-.3 5.6 1.3 5.6 3.6zm-31.1-4.5c-.7 2 1.3 4.3 4.3 4.9 2.6 1 5.6 0 6.2-2s-1.3-4.3-4.3-5.2c-2.6-.7-5.5.3-6.2 2.3zm44.2-1.7c-2.9.7-4.9 2.6-4.6 4.9.3 2 2.9 3.3 5.9 2.6 2.9-.7 4.9-2.6 4.6-4.6-.3-1.9-3-3.2-5.9-2.9zM244.8 8C106.1 8 0 113.3 0 252c0 
110.9 69.8 205.8 169.5 239.2 12.8 2.3 17.3-5.6 17.3-12.1 0-6.2-.3-40.4-.3-61.4 0 0-70 15-84.7-29.8 0 0-11.4-29.1-27.8-36.6 0 0-22.9-15.7 1.6-15.4 0 0 24.9 2 38.6 25.8 21.9 38.6 58.6 27.5 72.9 20.9 2.3-16 8.8-27.1 16-33.7-55.9-6.2-112.3-14.3-112.3-110.5 0-27.5 7.6-41.3 23.6-58.9-2.6-6.5-11.1-33.3 2.6-67.9 20.9-6.5 69 27 69 27 20-5.6 41.5-8.5 62.8-8.5s42.8 2.9 62.8 8.5c0 0 48.1-33.6 69-27 13.7 34.7 5.2 61.4 2.6 67.9 16 17.7 25.8 31.5 25.8 58.9 0 96.5-58.9 104.2-114.8 110.5 9.2 7.9 17 22.9 17 46.4 0 33.7-.3 75.4-.3 83.6 0 6.5 4.6 14.4 17.3 12.1C428.2 457.8 496 362.9 496 252 496 113.3 383.5 8 244.8 8zM97.2 352.9c-1.3 1-1 3.3.7 5.2 1.6 1.6 3.9 2.3 5.2 1 1.3-1 1-3.3-.7-5.2-1.6-1.6-3.9-2.3-5.2-1zm-10.8-8.1c-.7 1.3.3 2.9 2.3 3.9 1.6 1 3.6.7 4.3-.7.7-1.3-.3-2.9-2.3-3.9-2-.6-3.6-.3-4.3.7zm32.4 35.6c-1.6 1.3-1 4.3 1.3 6.2 2.3 2.3 5.2 2.6 6.5 1 1.3-1.3.7-4.3-1.3-6.2-2.2-2.3-5.2-2.6-6.5-1zm-11.4-14.7c-1.6 1-1.6 3.6 0 5.9 1.6 2.3 4.3 3.3 5.6 2.3 1.6-1.3 1.6-3.9 0-6.2-1.4-2.3-4-3.3-5.6-2z"></path></svg> </a> </div> </center> <div> <center><b>Thanks to</b></center> <center>my research group, institution, and funder</center></br> <center><img src = "logos/gestaltrevisionlogosmall.png"> <img src = "logos/KULeuven_small.png"> <img src = "logos/FWOlogosmall.jpg" ></center> </div> ??? I want to thank you all for your attention. If there are any questions, I am more than happy to answer them :-) If you want to know more, feel free to talk to me during the conference or to send me an email.