Annual Review 2011 > Division of Information Systems

Computer Arts Laboratory

Michael Cohen

Professor

Pang Wai Man

Assistant Professor

Most of the courses taken by engineers and computer science students emphasize scientific discipline and the accumulation of “truth.” The Computer Arts Lab. activities include such technically objective factors, but also encourage original expression, subjectively motivated by aesthetics rather than “correctness,” sometimes “putting the art before the course!” Unlike many other labs' activities that try to converge on a “right answer” sharable by everyone else, artistic disciplines encourage originality, in which the best answer is one that is like no one else's.

The Computer Arts Lab., through its resident Spatial Media Group,1 researches projects including practical and creative applications of virtual reality and mixed (augmented, enhanced, hybrid, mediated) reality and virtuality; panoramic interfaces and spatially-immersive displays (especially stereotelephonics, spatial sound, and stereography); wearable and mobile applications, computing, and interfaces; and networked multimedia, with related interests in CVEs (collaborative virtual environments), groupware, and CSCW (computer-supported collaborative work); hypermedia; digital typography and electronic publishing; force-feedback displays; telecommunication semiotics (models of teleconferencing selection functions); information furniture; way-finding and navigation (including using a Segway personal transporter); entertainment computing; and ubicomp (ubiquitous computing), calm (ambient), and pervasive technology. We are particularly interested in narrowcasting commands, conference selection functions for adjusting groupware situations in which users have multiple presence, virtually existing in more than one space simultaneously. We explore realtime interactive multimedia interfaces: auditory, visual, haptic, and multimodal.
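
Narrowcasting generalizes muting and soloing into selection functions over sources and sinks that may have multiple presence. As a minimal sketch, with a hypothetical API rather than the group's actual CVE implementation, the basic inclusion rule can be stated thus: a source is rendered unless it is explicitly muted, or implicitly excluded because some other source is selected (“soloed”).

    import java.util.List;

    /** Minimal sketch (hypothetical API) of a narrowcasting selection
     *  function: a source is audible unless explicitly muted, or
     *  implicitly excluded because another source is soloed ("selected"). */
    class Source {
        final String name;
        boolean mute;    // explicit exclusion
        boolean select;  // explicit inclusion ("solo")
        Source(String name) { this.name = name; }
    }

    class Narrowcasting {
        /** true iff x should be rendered:
         *  active(x) = not mute(x) and (no y is selected, or x is selected) */
        static boolean active(Source x, List<Source> all) {
            boolean anySelected = all.stream().anyMatch(s -> s.select);
            return !x.mute && (!anySelected || x.select);
        }

        public static void main(String[] args) {
            Source a = new Source("a"), b = new Source("b"), c = new Source("c");
            List<Source> room = List.of(a, b, c);
            b.select = true;  // soloing b implicitly excludes a and c
            for (Source s : room)
                System.out.println(s.name + " active: " + active(s, room));
        }
    }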

Other activities:

We host an annual symposium, the Int. Symposium on Spatial Media,35 inviting experts to share their knowledge and passion regarding such themes as “Spatial Sound and Spatial Telepresence” ('01), “Magic in Math and Music” ('02), “Advanced Multimedia and Virtual Reality” ('03), “Spatial Sound” ('04), “Hearing and Sound Installations” ('05), “Sound, Audio, and Music” ('06), “Interactive Media, Security, and Stereography” ('06), “MusicXML and the Structure of Swing, Understanding Color Media, Media Grid, and Visualization Tools” ('07), “Multimedia Computing” ('08), “Systems and Applications” ('09-'10), “Distributed, Mobile, and Ubiquitous Multimodal Interfaces” ('10-'11), and “Social Multimedia” ('11-'12).

Our lab sponsors several student performance circles, including the Yosakoi Dance Circle36 and the Disco Mix Club. We also sponsor a couple of other student circles, the Dual Boot (Ultimate Frisbee) Flying Disc Club37 and the Furiten Mah Jongg Circle.38

We are working with Aizu Yougo Gakko,39 the special education school next to the university, to develop multimedia interfaces that can encourage and entertain students with special needs. We have consulted on deployment of switch-adapted media players, have deployed some iPad accessibility applications, and have developed a song selection program using a “step-scan, select” affordance.
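
As an illustration of such a “step-scan, select” affordance, the following minimal Java Swing sketch (song titles and timing are placeholders, not the deployed program) steps a highlight through the choices on a timer; a single switch event, here any key press standing in for an adapted accessibility switch, selects the highlighted song. Slowing the scan interval trades speed for reliability, a standard accessibility tuning knob.

    import javax.swing.*;
    import java.awt.*;
    import java.awt.event.*;

    /** Minimal "step-scan, select" song chooser sketch. */
    public class StepScanChooser {
        public static void main(String[] args) {
            String[] songs = { "Song A", "Song B", "Song C", "Song D" };
            JFrame frame = new JFrame("Step-Scan Song Chooser");
            JPanel panel = new JPanel(new GridLayout(songs.length, 1));
            JButton[] buttons = new JButton[songs.length];
            for (int i = 0; i < songs.length; i++) {
                buttons[i] = new JButton(songs[i]);
                buttons[i].setFocusable(false);  // keep key events on the frame
                panel.add(buttons[i]);
            }
            final int[] cursor = { 0 };
            buttons[0].setBackground(Color.YELLOW);

            // Step: advance the highlight once a second.
            Timer scanner = new Timer(1000, e -> {
                buttons[cursor[0]].setBackground(null);
                cursor[0] = (cursor[0] + 1) % buttons.length;
                buttons[cursor[0]].setBackground(Color.YELLOW);
            });

            // Select: any key press chooses the currently highlighted song.
            frame.addKeyListener(new KeyAdapter() {
                @Override public void keyPressed(KeyEvent e) {
                    scanner.stop();
                    System.out.println("Playing: " + songs[cursor[0]]);
                }
            });

            frame.add(panel);
            frame.setSize(240, 200);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
            frame.requestFocusInWindow();
            scanner.start();
        }
    }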

Through the research & development and the deployment & integration of stereographic, spatial sound, haptic, and mobile applications, including virtual and mixed reality, we nurture scientific and artistic interest in advanced computer-human and human-human communication. Our ultimate domain is the exploration of interfaces and artifacts that are literally sensational.

Some relevant links:

Audio Courseware: http://sonic.u-aizu.ac.jp
Spatial Media: http://sonic.u-aizu.ac.jp/spatial-media/Videos/cohea.html
English: http://sonic.u-aizu.ac.jp/spatial-media/Videos/coheen.mpg
Japanese: http://sonic.u-aizu.ac.jp/spatial-media/Videos/cohejp.mpg
Multimedia and Virtual Reality Videos: http://sonic.u-aizu.ac.jp/spatial-media/Videos/
Mobile control of rotary motion platform: http://sonic.u-aizu.ac.jp/spatial-media/Videos/keitai+Schaire2.mov
Dual Driving Simulator: http://sonic.u-aizu.ac.jp/spatial-media/Videos/DualDrivingSimulator.mov
“VMP My Ride”: http://sonic.u-aizu.ac.jp/spatial-media/Videos/VMPMyRide.mp4
Mixed Reality Videos: http://sonic.u-aizu.ac.jp/spatial-media/mixedreality/VideoClips
Cluspi Control of Rotary Motion Platform: http://sonic.u-aizu.ac.jp/spatial-media/Videos/CLUSPI_demo-QT.mov
Sudden Motion Sensor Control of Collaborative Virtual Environment: http://sonic.u-aizu.ac.jp/spatial-media/Videos/SMS-CVE.mov
“Twin Spin” iOS and Android CVE Interface: http://sonic.u-aizu.ac.jp/spatial-media/Videos/Twin_Spin.m4v
“Whirled Worlds” iOS and Android CVE Interface: http://sonic.u-aizu.ac.jp/spatial-media/mixedreality/VideoClips/Whirled_Worlds.mov
QuickTime Virtual Reality: http://sonic.u-aizu.ac.jp/spatial-media/QTVR/
U. of Aizu Panorama: http://sonic.u-aizu.ac.jp/spatial-media/QTVR/Aizu_Daigaku.mov
Object Movie: http://sonic.u-aizu.ac.jp/spatial-media/QTVR/shoe.mov
Hideo Noguchi + Akabeko: http://sonic.u-aizu.ac.jp/spatial-media/QTVR/Noguchi+Akabeko.mov
Rotational Degrees of Freedom: http://sonic.u-aizu.ac.jp/spatial-media/QTVR/Rotational-DsoF.mov
Press and Mass Media Coverage:
Fukushima Minpo, June 10, 2010 http://www.u-aizu.ac.jp/~mcohen/scrapbook/FukushimaMinpo-10.6.10.jpg
“Nikkei”: Nihon Keizai Shimbun, Nov. 5, 2010 (p. 35) http://www.u-aizu.ac.jp/~mcohen/scrapbook/NihonKeizaiShimbun-2010-11-5p.35.png
“Switch”, Teleview Fukushima, Jan. 4, 2011 http://gallery.me.com/xilehence#101561/MAH04434-edited
Fukushima Minpo, February 18, 2011 (p. 9) http://www.minpo.jp/view.php?pageId=4107&mode=0&classId=0&blockId=9789060&newsMode=article
University Newspaper, Apr. 8, 2011 http://www.u-aizu.ac.jp/~mcohen/scrapbook/UniversityNewspaper-8.4.11.pdf
FutureGov Asia Pacific, 20 May 2011 http://www.futuregov.asia/articles/2011/may/20/japan-university-helps-special-education-school-ic/
AERA English, October 2011 http://www.u-aizu.ac.jp/~mcohen/scrapbook/AERAEnglish004.pdf

1 http://www.u-aizu.ac.jp/~mcohen/spatial-media/welcome.html

2 http://www.ubic-u-aizu.jp/shisetsu/kengaku.html

3 http://www.u-aizu.ac.jp/~mcohen/spatial-media/groupware/clients/HKB/welcome-J.html, http://www.ubic-u-aizu.jp/images/stories/pdf/coh-02-01.pdf

4 http://sonic.u-aizu.ac.jp

5 http://www.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/graduate/Sound+Audio/syllabus.html

6 http://web-int/~j-huang/Lecture/3DSound/3dsound.html

7 http://www.u-aizu.ac.jp/official/curriculum/syllabus/3_E_000.html

8 http://www.gimp.org

9 http://www.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/undergraduate/Computer_Music

10 http://www.apple.com/ilife/garageband, http://www.pgmusic.com/band.htm

11 http://sonic.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/graduate/Computer_Music/syllabus.html

12 http://www.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/Mma.html

13 http://www.u-aizu.ac.jp/~mcohen/spatial-media/stereograms.html

14 http://www.chromatek.com

15 http://mpro-aizu.blogspot.com

16 http://sonic.u-aizu.ac.jp/spatial-media/QTVR/

17 http://www.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/undergraduate/HI&VR/VirtualTour/

18 http://www.u-aizu.ac.jp/~mcohen/welcome/publications/SMS-CVE.mov

19 http://www.u-aizu.ac.jp/official/curriculum/syllabusCFS/curr04-cfs_e.html

20 http://www.u-aizu.ac.jp/~rentaro

21 http://www.u-aizu.ac.jp/official/curriculum/syllabusCFS/curr04-cfs-1_e.html#CFS1-1

22 http://www.sensable.com/products-freeform-systems.htm

23 http://www.zcorp.com/Products/3D-Printers/spage.aspx

24 http://sonic.u-aizu.ac.jp/spatial-media/mixedreality/VideoClips/KuruKuru-pitcher-long.mov

25 http://sonic.u-aizu.ac.jp/spatial-media/mixedreality/VideoClips/CITMixedReality_Demo.wmv

26 http://sonic.u-aizu.ac.jp/spatial-media/mixedreality/VideoClips/keitai+Schaire2.mov

27 http://openwonderland.org

28 http://www.u-aizu.ac.jp/~mcohen/spatial-media/VMPMyRide

29 http://www.thedesignium.com

30 http://www.aizu.com

31 http://www.gclue.com

32 http://www.segway.com

33 http://web-int.u-aizu.ac.jp/~mcohen/welcome/courses/AizuDai/undergraduate/HI&VR

34 http://www.alice.org

35 http://www.u-aizu.ac.jp/~mcohen/welcome/ISSM/10-12/

36 http://www.u-aizu.ac.jp/circles/yosakoi

37 http://www.u-aizu.ac.jp/circles/dualboot

38 http://www.u-aizu.ac.jp/circles/furiten

39 http://www.aizu-sh.fks.ed.jp

Refereed Proceedings Papers

[mcohen-01:2011]

Rasika Ranaweera, Michael Frishkopf, and Michael Cohen. Folkways in Wonderland: a laboratory for the ethnomusicology of cyberworlds. In Proc. Int. Conf. on Cyberworlds, pages 106-112, Banff, Alberta, Canada, Oct. 2011.

cw2011.cpsc.ucalgary.ca. In this paper, we outline a musical cyberworld (a collaborative, immersive virtual environment), together with an experimental design, for the purposes of initiating the ethnomusicology of controlled musical cyberspace. Ethnomusicology, the ethnographic study of music in its social environment, has typically been conducted through qualitative fieldwork in uncontrolled, real-world, socio-cultural environments. Recently, ethnomusicologists have begun to attend to the study of virtual environments, including pre-existing cyberworlds (such as videogames). But we adopt an unprecedented approach by designing our own custom musical cyberworld to serve as a virtual laboratory for the ethnographic study of music. By constructing an immersive social cyberworld suitable for ethnomusicological fieldwork, we aim for much greater control than has heretofore been possible in ethnomusicological research, leading to results that may suggest better ways of designing musical cyberworlds for research, discovery, learning, entertainment, and e-commerce, as well as pointing towards broader principles underlying the role of music in human interaction and community-formation. Such controlled research can usefully supplement, though never replace, traditional real-world fieldwork.

[mcohen-02:2011]

Senaka Amarakeerthi, Tin Lay Nwe, Liyanage C De Silva, and Michael Cohen. Emotion Classification Using Two-Layered Cascaded Subband Filters. In Proc. Interspeech, page (none), Florence, Aug. 2011.

http://www.interspeech2011.org. Speech is one of the most important signals that can be used to detect human emotions. Speech is modulated by different emotions by varying frequency- and energy-related acoustic parameters such as pitch, energy, and formants. In this paper, we describe research on analyzing inter- and intra-subband energy variations to differentiate five emotions. The emotions considered are anger, fear, dislike, sadness, and neutral. We employ a Two-Layered Cascaded Subband (TLCS) filter to study the energy variations for extraction of acoustic features. Experiments were conducted on the Berlin Emotional Data Corpus (BEDC). We achieve average accuracies of 76.4% and 69.3% for speaker-dependent and -independent emotion classification, respectively.
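
The TLCS filter itself is not specified in this summary; as a much-simplified stand-in, the following sketch computes per-frame subband energies with the Goertzel algorithm, illustrating the kind of frequency- and energy-related features on which such classifiers operate (band edges and frame size are arbitrary choices for the example).

    /** Per-frame subband energies via the Goertzel algorithm; a
     *  much-simplified stand-in for the TLCS filter, not the authors' design. */
    public class SubbandEnergy {
        /** Power at one frequency (Goertzel recurrence). */
        static double goertzel(double[] frame, double freqHz, double fs) {
            double w = 2 * Math.PI * freqHz / fs;
            double coeff = 2 * Math.cos(w), s0, s1 = 0, s2 = 0;
            for (double x : frame) { s0 = x + coeff * s1 - s2; s2 = s1; s1 = s0; }
            return s1 * s1 + s2 * s2 - coeff * s1 * s2;  // squared magnitude
        }

        /** Energy in [lo, hi) Hz, summed over 50 Hz steps. */
        static double bandEnergy(double[] frame, double lo, double hi, double fs) {
            double e = 0;
            for (double f = lo; f < hi; f += 50) e += goertzel(frame, f, fs);
            return e;
        }

        public static void main(String[] args) {
            double fs = 16000;                       // 16 kHz speech
            double[] frame = new double[400];        // one 25 ms frame
            for (int n = 0; n < frame.length; n++)   // toy input: 300 Hz tone
                frame[n] = Math.sin(2 * Math.PI * 300 * n / fs);
            double[][] bands = { {0, 500}, {500, 1000}, {1000, 2000}, {2000, 4000} };
            for (double[] b : bands)
                System.out.printf("%4.0f-%4.0f Hz: %.1f%n",
                                  b[0], b[1], bandEnergy(frame, b[0], b[1], fs));
        }
    }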

[mcohen-03:2011]

Senaka Amarakeerthi, Kithsiri Liyanage, and Michael Cohen. Delay Pools-Based Uninterrupted and Dynamic Bandwidth Distribution of Squid Proxy Cache. In Vitaly Klyuev and Alexander Vazhenin, editors, Proc. HCCE: Int. Conf. on Human-Centered Computer Environments, pages 244-246, Mar. 2012.

http://sparth.u-aizu.ac.jp/hcce2012, isbn 978-1-4503-1191-5, doi 10.1145/2160749.2160801, http://dl.acm.org/citation.cfm?id=2160801. The Squid proxy server is a popular web-content caching service trusted by many network administrators. In this paper, we describe a method of managing the bandwidth of a Squid proxy cache through the World Wide Web, allowing Squid administrators and authorized users to allocate a percentage of bandwidth to a particular computer or group of computers without disturbing connected clients. This approach is useful for Squid administrators with low-bandwidth Internet connections in dynamically prioritizing existing bandwidth for bandwidth-hungry applications such as videoconferencing and bulk file downloading and uploading. Delay pools are a widely used technique for controlling users' bandwidth utilization, but in a Squid proxy cache the delay-pool settings must ordinarily be made by manually editing a configuration file and restarting the service. The proposed framework instead allows delay-pool parameters to be set through a web interface. PHP and C++ programs were used in implementing the system. With the proposed approach, existing bandwidth can be allocated among hosts or subnets dynamically without disturbing existing connections. Analysis of realtime bandwidth-distribution graphs confirms that the proposed framework can distribute bandwidth as required without affecting connected clients.
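
For reference, delay pools are configured in squid.conf with directives like the following (addresses and rates are illustrative only); the framework described above adjusts such parameters through a web interface instead of hand-editing this file and restarting the service.

    # Illustrative squid.conf delay-pool snippet (hypothetical values).
    acl labnet src 192.168.10.0/24
    delay_pools 1
    delay_class 1 2                              # class 2: aggregate + per-host buckets
    delay_parameters 1 64000/64000 16000/16000   # bytes/sec, restore/max pairs
    delay_access 1 allow labnet
    delay_access 1 deny all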

[mcohen-04:2011]

Rasika Ranaweera, Michael Cohen, and Shun Endo. Smartphone as Articulated Conductor's Baton for Virtual Concerts. In Naotoshi Osaka, editor, ACMP: Proc. of the Asia Computer Music Project, page (none), Tokyo, Dec. 2011. JSSA: Japanese Society for Sonic Arts.

www.acmp.asia/acmp2011sche.html. We describe how smartphones can be used as a simplified ensemble conductor's baton in virtual concerts. Alice, a 3D programming environment, can be used to develop interactive virtual spaces: one can simply drag-and-drop 3D graphics and transform/arrange them in a WYSIWYG editor, and Alice also provides a rich API to control and display objects, including some audio-related functions. We have created a virtual concert application in which instruments are arranged around a conductor (in this case the user) located at their center. A workstation running Alice can render such scenes, including displaying auditory cues via stereo speakers. A user-conductor with a smartphone can simply point at a preferred instrument and tap to select or start playing; by tilting the device or sliding a control on the screen, the volume of a selection can be adjusted. We used our own collaborative virtual environment (CVE) session server as the communication link between the smartphone or tablet and Alice. A middleware program reads the orientation of the smartphone, sending such events to the CVE server; another program, the Alice-CVE Bridge, retrieves these events through the server and selects/plays/adjusts the instruments in the Alice virtual environment. Such explorations suggest the power of emerging mobile devices as generalized remote controls for interactive multimedia and ubiquitous computing.

[mcohen-05:2011]

Kensuke Nishimura and Michael Cohen. Multitouch Media Player for Accessibility. In Vitaly Klyuev and Alexander Vazhenin, editors, Proc. HCCE: Int. Conf. on Human-Centered Computer Environments, pages 184-189, Aizu-Wakamatsu, Mar. 2012.

http://sparth.u-aizu.ac.jp/hcce2012, isbn 978-1-4503-1191-5, doi 10.1145/2160749.2160787, http://dl.acm.org/citation.cfm?id=2160787. We have developed two versions of a media player designed for differently abled users. The media player, suitable for selecting and playing songs or videos, runs on either computer-hosted “Alice” or an iOS smartphone or tablet. Even though special-needs users may have trouble with an ordinary mouse and keyboard, such accessibility allows them to enjoy selected media.

[mcohen-06:2011]

Rasika Ranaweera, Michael Cohen, and Shun Endo. iBaton: Conducting Virtual Concerts Using Smartphones. In Vitaly Klyuev and Alexander Vazhenin, editors, Proc. HCCE: Int. Conf. on Human-Centered Computer Environments, pages 178-183, Aizu-Wakamatsu, Mar. 2012.

http://sparth.u-aizu.ac.jp/hcce2012, isbn 978-1-4503-1191-5, doi 10.1145/2160749.2160786, http://dl.acm.org/citation.cfm?id=2160786. With the emergence of virtual environments, even general computer users can drag-and-drop 3D objects (cities, buildings, furniture, instruments, controls, animals, and avatars) from built-in galleries and arrange them to create attractive cyberworlds. We have created a virtual concert application using Alice, a 3D programming environment, in which instruments are arranged around a virtual conductor (in this case the user) located at their center. A user-conductor with a smartphone can use it as a simplified baton, pointing at a preferred instrument and tapping to select or start playing. When selected, an instrument is jiggled or its components dilated and contracted, and a spotlight appears until the instrument is muted, providing the conductor and audience with visual cues about the ensemble. One can also adjust the volume and panning of a selected instrument by sliding corresponding controls on the smartphone screen. Alice 3 (Beta) provides a rich API to control and display objects, but its native audio capabilities are limited to playing an audio file and adjusting volume before playing. Using a plugin for NetBeans, Alice programs (scenarios) can be edited as Java code outside Alice, so with the Java Sound API, which provides more complete control of sampled audio and MIDI, we implemented audio-related functions for the virtual concert. Smartphones have magnetometers that can be used to detect yaw; programmed through a compatible language, a smartphone's spatial information can be accessed using the platform's API and transmitted to a collaborative virtual environment (CVE) session server via Wi-Fi. A middleware program, the Alice-CVE Bridge, retrieves these events through the server and selects/plays/adjusts instruments in the Alice virtual environment. Such explorations suggest the power of emerging mobile devices as generalized remote controls for interactive multimedia and ubiquitous computing.
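
As a minimal sketch of the kind of per-instrument control the Java Sound API affords (the sound-file name is hypothetical, and availability of the pan control depends on the clip format and mixer):

    import javax.sound.sampled.*;
    import java.io.File;

    /** Load a sampled clip, then adjust its volume and pan while it
     *  plays, as a smartphone slider might drive it. */
    public class InstrumentChannel {
        public static void main(String[] args) throws Exception {
            AudioInputStream in =
                AudioSystem.getAudioInputStream(new File("violin.wav"));
            Clip clip = AudioSystem.getClip();
            clip.open(in);

            // Gain in dB; pan in [-1 (left) .. +1 (right)].
            FloatControl gain =
                (FloatControl) clip.getControl(FloatControl.Type.MASTER_GAIN);
            FloatControl pan =
                (FloatControl) clip.getControl(FloatControl.Type.PAN);

            clip.loop(Clip.LOOP_CONTINUOUSLY);
            gain.setValue(-6.0f);  // e.g., mapped from a slider on the smartphone
            pan.setValue(0.5f);    // nudge the instrument to the right
            Thread.sleep(5000);    // let it play briefly
            clip.close();
        }
    }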

[mcohen-07:2011]

Prabath Weerasinghe and Michael Cohen. Beat Detection Animating Virtual Environment Models. In Vitaly Klyuev and Alexander Vazhenin, editors, Proc. HCCE: Int. Conf. on Human-Centered Computer Environments, pages 200-202, Aizu-Wakamatsu, Mar. 2012.

http://sparth.u-aizu.ac.jp/hcce2012, isbn 978-1-4503-1191-5, doi 10.1145/2160749.2160790, http://dl.acm.org/citation.cfm?id=2160790. Alice is an innovative 3D programming environment that makes it easy to create an animation. Various virtual environment (VE) models are available in the Alice 3D environment, and we created VE models using the Alice 3D IDE (integrated development environment). We deploy a beat detector, based on PD (Pure Data, a free dataflow programming environment similar to Max/MSP), that can extract the beat of a song while it is playing. Using our Alice-CVE (Collaborative Virtual Environment) Bridge and CVE-PD Bridge, we can create a communication link between the beat detector and the Alice 3D environment. The CVE is a Java client-server protocol developed by the Spatial Media Group of the University of Aizu: clients connect to a session-server host via channels, and clients that need to communicate with each other subscribe to the same channel. The Alice-CVE Bridge allows any device that can connect to the CVE server to communicate with Alice without regard for architectural differences. When a song is played, its beat can be detected and the data sent to the CVE server to animate objects and avatars. An avatar can dance (admittedly poorly) in the Alice 3D environment while receiving a realtime rhythm data stream from a CVE session server. Stage color can also be rhythmically changed according to the beat using the same communication link.
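
The PD patch is not reproduced here; as a simplified stand-in for its beat detector, the following sketch flags a beat when a frame's energy exceeds a multiple of its recent average energy, the classic energy-comparison heuristic. A deployment would forward each beat event to the CVE session server rather than print it.

    /** Simplified energy-comparison beat detector (a stand-in for the
     *  PD-based detector, not its actual implementation). */
    public class BeatDetector {
        static final int HISTORY = 43;  // ~1 s of 1024-sample frames at 44.1 kHz
        final double[] history = new double[HISTORY];
        int filled, next;

        /** Returns true if this frame of samples looks like a beat. */
        boolean onFrame(double[] frame) {
            double e = 0;
            for (double x : frame) e += x * x;  // instantaneous energy
            double avg = 0;
            for (int i = 0; i < filled; i++) avg += history[i];
            boolean beat = filled == HISTORY && e > 1.4 * (avg / filled);
            history[next] = e;                  // update the ring buffer
            next = (next + 1) % HISTORY;
            if (filled < HISTORY) filled++;
            return beat;
        }

        public static void main(String[] args) {
            BeatDetector d = new BeatDetector();
            java.util.Random rng = new java.util.Random(0);
            for (int t = 0; t < 200; t++) {  // toy signal: noise with periodic pulses
                double[] frame = new double[1024];
                double amp = (t % 20 == 0) ? 1.0 : 0.1;
                for (int n = 0; n < frame.length; n++)
                    frame[n] = amp * rng.nextGaussian();
                if (d.onFrame(frame))
                    System.out.println("beat at frame " + t);  // would go to a CVE channel
            }
        }
    }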

[mcohen-08:2011]

Michael Cohen, Rasika Ranaweera, Hayato Ito, Shun Endo, Sascha Holesch, and Julián Villegas. Whirled Worlds: Pointing and Spinning Smartphones and Tablets to Control Multimodal Augmented Reality Displays. In HotMobile: Proc. Int. Wkshp. on Mobile Computing Systems and Applications, page (online archive), San Diego, Feb. 2012.

http://www.hotmobile.org/2012/papers/dandp/abstract.pdf. Modern smartphones and tablets have magnetometers that can be used to detect yaw, and this data can be distributed to modulate ambient media. We have implemented such functionality for both a Google Android smartphone and Apple iOS iPhone & iPad. A client-server architecture synchronizes distributed displays across shared channels, including image-based renderings of panoramic photos and object movies, spatial sound (periphonic) speaker arrays, rotary motion platforms, and the position of avatars or other objects in virtual environments such as Alice and Open Wonderland.
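
For concreteness, magnetometer-derived yaw is typically obtained on Android as below (a sketch only: listener registration is omitted, and publish() is a stub for the Wi-Fi send to the display server, not the actual client code).

    import android.hardware.*;

    /** Android sketch: derive yaw (azimuth) from accelerometer +
     *  magnetometer readings; distribution to displays is stubbed. */
    public class YawTracker implements SensorEventListener {
        private final float[] gravity = new float[3], geomagnetic = new float[3];

        @Override public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER)
                System.arraycopy(event.values, 0, gravity, 0, 3);
            else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD)
                System.arraycopy(event.values, 0, geomagnetic, 0, 3);

            float[] r = new float[9], orientation = new float[3];
            if (SensorManager.getRotationMatrix(r, null, gravity, geomagnetic)) {
                SensorManager.getOrientation(r, orientation);
                publish(Math.toDegrees(orientation[0]));  // orientation[0] = azimuth
            }
        }

        /** Stub: a real client would send the yaw over Wi-Fi to the server. */
        private void publish(double yawDegrees) { /* ... */ }

        @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }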

[mcohen-09:2011]

Prabath Weerasinghe and Michael Cohen. Beat Detection Animating Virtual Environment Models. In Naotoshi Osaka, editor, ACMP: Proc. of the Asia Computer Music Project, in conjunction with the Joint Meeting, page (none), Tokyo, Dec. 2011. JSSA: Japanese Society for Sonic Arts.

www.acmp.asia/acmp2011sche.html. Alice is an innovative 3D programming environment that makes it easy to create an animation. Various virtual environment (VE) models are available in the Alice 3D environment, and we created VE models using the Alice 3D IDE (integrated development environment). We deploy a beat detector, based on PD (Pure Data, a free dataflow programming environment similar to Max/MSP), that can extract the beat of a song while it is playing. Using our Alice-CVE (Collaborative Virtual Environment) Bridge and CVE-PD Bridge, we can create a communication link between the beat detector and the Alice 3D environment. The CVE is a Java client-server protocol developed by the Spatial Media Group of the University of Aizu: clients connect to a session-server host via channels, and clients that need to communicate with each other subscribe to the same channel. The Alice-CVE Bridge allows any device that can connect to the CVE server to communicate with Alice without regard for architectural differences. When a song is played, its beat can be detected and the data sent to the CVE server to animate objects and avatars. An avatar can dance (admittedly poorly) in the Alice 3D environment while receiving a realtime rhythm data stream from a CVE session server. Stage color can also be rhythmically changed according to the beat using the same communication link.

[mcohen-10:2011]

Michael Cohen, Rasika Ranaweera, Hayato Ito, Shun Endo, Sascha Holesch, and Julián Villegas. Whirling Interfaces: Smartphones & Tablets as Spinnable Affordances. In ICAT: Proc. Int. Conf. on Artificial Reality and Telexistence, page 155, Osaka, Nov. 2011.

www.ic-at.org/2011, issn 1345-1278. Interfaces featuring smartphones and tablets that use magnetometer-derived orientation sensing can be used to modulate virtual displays. Embedding such devices into a spinnable affordance allows a “spinning plate”-style interface, a novel interaction technique. Either static (pointing) or dynamic (whirled) mode can be used to control multimodal display, including panoramic and turnoramic images, the positions of avatars in virtual environments, and spatial sound.
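
One plausible way to distinguish the two modes is by angular speed estimated from successive yaw samples, as in this sketch (the threshold is hypothetical, not a published value).

    /** Classify pointing vs. whirling by angular speed. */
    public class SpinClassifier {
        static final double WHIRL_THRESHOLD = 90.0;  // degrees per second (hypothetical)

        /** Signed smallest difference between two headings, in degrees. */
        static double angleDiff(double a, double b) {
            double d = (b - a) % 360;
            if (d > 180) d -= 360;
            if (d < -180) d += 360;
            return d;
        }

        static String classify(double prevYaw, double yaw, double dtSeconds) {
            double speed = Math.abs(angleDiff(prevYaw, yaw)) / dtSeconds;
            return speed > WHIRL_THRESHOLD ? "whirled" : "pointing";
        }

        public static void main(String[] args) {
            System.out.println(classify(10, 12, 0.1));   // 20 deg/s  -> pointing
            System.out.println(classify(350, 40, 0.1));  // 500 deg/s -> whirled
        }
    }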

Academic Activities

[mcohen-11:2011]

Michael Cohen, October 2011.

Program Committee, NIME 2012 (Int. Conf. on New Interfaces for Musical Expression), http://www.eecs.umich.edu/nime2012/

[mcohen-12:2011]

Michael Cohen, October 2011.

Program Committee, IEEE ICEC 2011 (Int. Conf. on Entertainment Computing), http://www.icec2011.org

[mcohen-13:2011]

Michael Cohen, 2011.

ICAT (Int. Conf. on Artificial Reality and Telexistence) Best Paper Award Committee Co-Chair, http://www.ic-at.org/2011/

[mcohen-14:2011]

Michael Cohen, March 2011-12. Executive Committee, IEEE Computer Society Technical Committee on Computer-Generated Music

[mcohen-15:2011]

Michael Cohen, March 2012.

Program Committee, HC: Fourteenth Int. Conf. on Human and Computer (Hamamatsu and Aizu-Wakamatsu and Duesseldorf), http://sparth.u-aizu.ac.jp/hcce2012, http://ktm11.eng.shizuoka.ac.jp/HC2011/

[mcohen-16:2011]

Michael Cohen, 2011.

Reviewer, Haptics Symposium, http://2012.hapticssymposium.org

[mcohen-17:2011]

Michael Cohen, 2011-12. Reviewer, Entertainment Computing, http://www.journals.elsevier.com/entertainment-computing/#description

[mcohen-18:2011]

Michael Cohen, 2011-12. Voting Member, IEEE MMTC (Multimedia Communications Technical Committee), http://community.comsoc.org/groups/ieee-mmtc

[mcohen-19:2011]

Michael Cohen, 2011-12. Editorial Review Board, ACM Computers in Entertainment (CiE), http://www.acm.org/pubs/cie/

[mcohen-20:2011]

Michael Cohen, 2011-12. Reviewer and Scientific Committee, JVRB, The Journal of Virtual Reality and Broadcasting, http://www.jvrb.org

Ph.D., Master and Graduation Theses

[mcohen-21:2011]

Shohei Abe (s1160005). USB Interface for Driving Simulator Controls. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-22:2011]

Koichirou Amitou (s1160008). Developing Driving Simulator with Alice 3.0 and Cockpit of a Real Vehicle. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-23:2011]

Masaki Okano (s1160049). Narrowcasting Interface for Alice. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-24:2011]

Kensuke Nishimura (s1160160). Multimedia for Accessibility: Media Players for Special Users. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-25:2011]

Shun Endo (s1160037). Whirling Interface: iPhone and iPad as Spinnable Affordances. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-26:2011]

Hayato Ito (s1160049). Whirling Interface: Spinnable Interface for Android Smartphone. Graduation thesis, School of Computer Science and Engineering, 2011-12.

Thesis Adviser: Michael Cohen

[mcohen-27:2011]

Prabath Weerasinghe (m5141110). Animating virtual environments with voice-based emotion and beat-tracked rhythm. Master's thesis, Graduate School of Computer Science and Engineering, 2010-12.

Thesis Adviser: Michael Cohen

Others

[mcohen-28:2011]

Kensuke Nishimura, Rasika Ranaweera, and Michael Cohen. “Yowme” Cybereye-exam. Health 2.0 Hackathon; Codethon, Koriyama, February 2012.

Combining modern desktop virtual reality with remote control and gestural interpretation of a smartphone, we reimagine the traditional eye exam in a digital format. http://health2con.jp/hackathon/