Greetings

I'm a 4th-year Ph.D. candidate in Computer Science at the University of Toronto, working with Prof. Khai Truong in the DGP lab.

My research lies at the intersection of human–AI interaction, accessibility, and creativity support, with a special focus on improving music accessibility for d/Deaf and hard-of-hearing (DHH) individuals. My projects include a study of song signing (CHI ’23), exploring how music is experienced and expressed within Deaf culture, and ELMI (CHI ’25), an LLM-supported system for translating English lyrics into ASL gloss.

I completed my B.Sc. in Computer Science and Engineering at Ewha Womans University, where I was advised by Prof. Uran Oh (Human-Computer Interaction Lab) and Prof. Hyokyung Bahn (Distributed Computing and Operating Systems Lab).

Additionally, I worked as a research intern at Samsung AI Center Toronto, where I was mentored by Iqbal Mohomed, and at NAVER AI LAB under the supervision of Young-Ho Kim. Most recently, I interned at Adobe Research in the STORIE Lab, supervised by Anh Truong and Justin Salamon.

πŸ’Ό I am currently exploring academic (Assistant Professor) and industry (Research Scientist) positions starting in Summer/Fall 2026.

Google Scholar | LinkedIn | suhyeon.yoo[at]mail.utoronto.ca

Human-Computer Interaction Notes

I wrote these notes when I took the <Human-Computer Interaction> class in the Spring 2020 semester.

The PDF files are shared through Google Drive, and there is some minor image blur since I scanned the pages with the Adobe Scan app on my phone.

Note: only those with an ewhain account can access <this folder>.

(Only 4 pages for the midterm)

Preview:

Hope these help you with your studying :)


+

You can check the class materials at the link below:

6.813/6.831: User Interface Design & Implementation

http://web.mit.edu/6.813/www/sp16/
