{"id":522,"date":"2009-11-29T12:21:02","date_gmt":"2009-11-29T16:21:02","guid":{"rendered":"http:\/\/nim.ir\/?p=522"},"modified":"2009-11-29T12:21:02","modified_gmt":"2009-11-29T16:21:02","slug":"definition-of-the-master-thesis-2","status":"publish","type":"post","link":"http:\/\/nim.ir\/?p=522","title":{"rendered":"Definition of the Master Thesis (2) &#8211; Draft"},"content":{"rendered":"<p><strong>Title:<\/strong> Modeling human auditory synchronization behavior based on EEG data<\/p>\n<p><strong>Student:<\/strong> &#8211;<br \/>\n<strong>Supervisor:<\/strong> &#8211;<br \/>\n<strong>Co-supervisors:<\/strong> <a href=\"http:\/\/nim.ir\/\" target=\"_blank\" rel=\"noopener noreferrer\">Nima Darabi<\/a>, PhD at Q2S \/ <a href=\"http:\/\/www.iet.ntnu.no\/%7Esvensson\/\" target=\"_blank\" rel=\"noopener noreferrer\">Peter Svensson<\/a>, Professor, IET and Q2S<br \/>\n<strong> Semester:<\/strong> Spring 2010<\/p>\n<p><strong>An overall view: <\/strong>While dancing, performing or listening to a rhythmic music, we are synchronizing our reactions based on the auditory stimuli. The characteristics of such an action depend on how we use our short-term auditory memory, as the ability to recall something heard very recently. Every suggested synchronization model should take the role of this memory into account. The traditional way of understanding this memory function is analysis of the recorded behavior of synchronous cooperative subjects such as processing of their produced sound signals. In addition to this, the measurement of the electrical brain activity might be a very useful source, assigning to the relative sound signals and provided by EEG&#8217;s good temporal resolution.<\/p>\n<p><strong>The Assignment: <\/strong>In this suggested project, We will do experiments in which subjects passively listen to hand claps, in order to find out how that translates to EEG activity. 
We will set up simple, well-defined subjective experiments with auditory stimuli and use quantitative EEG methods (mathematical measurement of aspects of the EEG signal) to analyze the resulting data. The experiments should be passive, given constraints such as the sensitivity of EEG to body movements, and their results are meant to address these questions:<\/p>\n<ul>\n<li>How is the brain\u2019s electrical activity influenced by passive rhythm perception?<\/li>\n<li>Which temporal structures do humans perceive as rhythm?<\/li>\n<li>How quickly can auditory stimuli be traced in the electrical brain activity?<\/li>\n<li>How does this change with different patterns and tempos?<\/li>\n<li>How much does this depend on the individual or on training?<\/li>\n<li>How can we explore memory aspects of human rhythmic and musical behavior using EEG?<\/li>\n<li>When experimenting with delays between hand-clapping performers who are asked to keep the tempo stable, the delay sounds bad to them but they can keep clapping together. On average, though, people do not compensate enough, so the tempo always decreases. Why don\u2019t they compensate completely?<\/li>\n<\/ul>\n<p>We also need to determine the extent to which asynchrony is perceived. We will develop a model that describes the behavior of a clapper. One aspect is the perception of the other clapper and the perception of one\u2019s self: EEG might be able to show when a situation is unusual, which might be related to asynchrony.<\/p>\n<p>In this case we simply have subjects listen to a set of pre-recorded hand claps. We will then run an experiment with active clapping and compare the EEG data to find any differences. Here E(t), B(t) and A(t) stand for Ear(t), Brain(t) and Arm(t). 
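<p>As a toy illustration of the tempo-decrease question raised above (a minimal sketch, not part of the proposal itself; the correction gain alpha, the 30 ms delay and all other parameter values are assumptions for illustration), a simple phase-correction model of two clappers who each nudge their next clap toward the delayed clap they hear predicts that the emergent period exceeds the nominal one by alpha times the delay:<\/p>

```python
# Toy phase-correction model: two clappers hear each other through a
# transmission delay d and each corrects a fraction alpha of the
# asynchrony it perceives. All parameter values are illustrative.

def simulate(T=0.5, d=0.03, alpha=0.3, n_claps=50):
    """Return the mean inter-onset interval that emerges after n_claps."""
    u, v = 0.0, 0.0                # current onset times of the two clappers
    start = u
    for _ in range(n_claps):
        a_u = u - (v + d)          # asynchrony perceived by clapper 1
        a_v = v - (u + d)          # asynchrony perceived by clapper 2
        u, v = u + T - alpha * a_u, v + T - alpha * a_v
    return (u - start) / n_claps

print(f"nominal period: 0.500 s, emergent period: {simulate():.3f} s")
# with d = 30 ms and alpha = 0.3 the period grows by alpha * d = 9 ms per
# cycle, i.e. the tempo drifts slower although both clappers keep correcting
```

<p>In such a model, incomplete compensation (alpha &lt; 1 applied to a delayed reference) is already enough to produce a steady tempo decrease, without either performer doing anything wrong.<\/p>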
B(t) comes from the EEG data recorded while the subject hears the stimuli and will help us understand more about the processing of the perceived stimuli (the synchrony check \/ prediction performed by the subject); E(t) and A(t) describe the auditory input and the movement of the arms, respectively.<\/p>\n<p>Finally, this study aims at defining some memory\/inertia-related parameters as a measure of the strategy taken by the performers in a musical collaboration\/synchronization process.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Title: Modeling human auditory synchronization behavior based on EEG data Student: &#8211; Supervisor: &#8211; Co-supervisors: Nima Darabi, PhD at Q2S \/ Peter Svensson, Professor, IET and Q2S Semester: Spring 2010 An overall view: While dancing, performing or listening to rhythmic music, we are synchronizing our reactions based on the auditory stimuli. The characteristics of &hellip; <a href=\"http:\/\/nim.ir\/?p=522\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Definition of the Master Thesis (2) &#8211; 
Draft&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[16],"tags":[],"_links":{"self":[{"href":"http:\/\/nim.ir\/index.php?rest_route=\/wp\/v2\/posts\/522"}],"collection":[{"href":"http:\/\/nim.ir\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/nim.ir\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/nim.ir\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/nim.ir\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=522"}],"version-history":[{"count":0,"href":"http:\/\/nim.ir\/index.php?rest_route=\/wp\/v2\/posts\/522\/revisions"}],"wp:attachment":[{"href":"http:\/\/nim.ir\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=522"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/nim.ir\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=522"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/nim.ir\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=522"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}