AY2026 Graduate School Syllabus

IT Education and Research Area (Applied Information Engineering)

As of 2026/03/01


開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
西舘 陽平
担当教員名
/Instructor
西舘 陽平
推奨トラック
/Recommended track
先修科目
/Essential courses
List of courses that students are expected to have studied in advance (the course will be conducted assuming that part or all of the content in the courses listed below is already known):
MA01 Linear Algebra I
MA02 Linear Algebra II
MA03 Calculus I
MA04 Calculus II
FU11 Numerical Analysis
IT02 Computer Graphics
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
This course is a practical introduction to the finite element method. It focuses on algorithms of the finite element method for solid mechanics modeling. Mesh generation and visualization issues are considered.
授業の目的と到達目標
/Objectives and attainment goals
[Corresponding Learning Outcomes]
(A) Graduates are aware of their professional and ethical responsibilities as an engineer, and are able to analyze societal requirements, and set, solve, and evaluate technical problems using information science technologies in society.
(C) Graduates are able to apply their professional knowledge of mathematics, natural science, and information technology, as well as the scientific thinking skills such as logical thinking and objective judgment developed through the acquisition of said knowledge, towards problem solving.

[Competency Codes]
C-AL-002-1, C-AR-008, C-AL-005-1, C-AL-008

Students who successfully complete this course will be able to:
1. Explain the basic workflow of finite element analysis (problem setting, discretization, element formulation, assembly, boundary conditions, solution, and visualization) and choose appropriate procedures for a given problem.
2. Derive and verify 1D/2D/3D shape functions and use interpolation u=[N]{q}; compute derivatives via mapping and the Jacobian, and evaluate the accuracy of FEM derivatives against exact derivatives in simple cases.
3. Formulate the variational/energy-based finite element equations for solid mechanics and derive the element stiffness matrix [k] and element load vector {f}; identify the roles of [B], material matrices, and numerical integration.
4. Implement connectivity-based assembly to construct the global stiffness matrix [K] and global load vector {F}; apply displacement boundary conditions using standard approaches and verify the assembled system (symmetry, sparsity, and consistency).
5. Prepare FEM input data (nodes, elements, materials, boundary conditions, loads), perform consistency checks, and generate regular structured meshes with correct numbering and connectivity.
6. Solve FE equation systems using appropriate solvers (direct and/or iterative) depending on problem size and sparsity, and validate results through basic verification (toy problems, convergence checks, and sanity checks).
7. Implement basic visualization of FEM models and results by constructing a continuous field from nodal data, subdividing higher-order element surfaces into triangles, and producing contour/colormap renderings; explain how curvature and result gradients affect required subdivision.
8. Communicate numerical results and assumptions clearly in written form, including derivations, algorithm descriptions, and verification/validation evidence.
授業スケジュール
/Class schedule
1. Intro; FE Equation Formulation
Lecture: scope; FEM flow (discretize→element→assemble→BCs→solve); shape functions u=[N]{q}; role of [k]{q}={f}, [K]{Q}={F}
Preparation: (1.0h) Linear algebra (matrices/vectors, transpose, Ku=f). (1.0h) Calculus (derivatives/integrals; basic variational idea).
Review: (1.0h) Summarize key terms (element, DOF, stiffness, connectivity). (1.0h) Restate u=[N]{q}, Ni(xj)=δij with meaning/dimensions. (0.5h) Make a “formula map” ([k],{f}→[K],{F}).
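The formula map above (u=[N]{q}, Ni(xj)=δij) can be checked numerically before Exercise 1. A minimal NumPy sketch; the element values and sample points are illustrative, not course data:

```python
import numpy as np

def shape_1d_linear(xi):
    """Linear 1D shape functions on the reference element, xi in [-1, 1]."""
    return np.array([(1 - xi) / 2, (1 + xi) / 2])

# Nodal property N_i(xi_j) = delta_ij at the element nodes xi = -1, +1
nodes = [-1.0, 1.0]
N_at_nodes = np.array([shape_1d_linear(xi) for xi in nodes])
assert np.allclose(N_at_nodes, np.eye(2))

# Partition of unity: sum_i N_i(xi) = 1 everywhere in the element
xis = np.linspace(-1, 1, 11)
assert np.allclose([shape_1d_linear(xi).sum() for xi in xis], 1.0)

# Interpolation u = [N]{q} reproduces a linear field exactly
q = np.array([2.0, 5.0])            # nodal values (illustrative)
u = np.array([shape_1d_linear(xi) @ q for xi in xis])
exact = 2.0 + 3.0 * (xis + 1) / 2   # the same linear field written in xi
assert np.allclose(u, exact)
print("nodal property, partition of unity, and linear completeness verified")
```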

2. Exercise 1
Exercise: 1D linear shape functions; interpolation evaluation; selected higher-order 1D shape functions
Preparation: (1.0h) Review Lecture 1 (u=[N]{q}, nodal property, partition of unity). (1.0h) Re-derive linear 1D shape functions in local coord.
Review: (1.0h) Check nodal/unity/consistency quickly. (1.0h) Write final solutions; substitution tests at sample points. (0.5h) Note typical mistakes in higher-order derivations.

3. Solid Mechanics FEM 1
Lecture: variational formulation (Π); discretization {u}=[N]{q}, {ε}=[B]{q}; derive element [k] and {f}
Preparation: (1.0h) Stationary condition; integration by parts. (1.0h) Quadratic forms; symmetry of [k].
Review: (1.0h) Re-derive {ε}=[B]{q}; interpret [B]. (1.0h) Identify [k],{f} from Π (body/surface terms). (0.5h) Write assembly-ready element equation.

4. Solid Mechanics FEM 2
Lecture: global system; local→global DOF mapping; direct (naive) assembly of [K],{F}; apply displacement BCs
Preparation: (1.0h) Practice local→global index tables. (1.0h) Review accumulation of element blocks into [K].
Review: (1.0h) Manual assembly for a tiny mesh; verify insertion locations. (1.0h) Check symmetry/sparsity from connectivity. (0.5h) Summarize workflow (element→assembly→BCs→solve).

5. Exercise 2
Exercise: assembly practice for {F},[K] on a small 2D triangular mesh; discuss efficient implementation idea
Preparation: (1.0h) Review [k]{q}={f}, DOF ordering. (1.0h) Review mapping/assembly rules; prepare index lists.
Review: (1.0h) Write assembly pseudocode; discuss cost qualitatively. (1.0h) Validate connectivity and accumulation consistency.
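The assembly pseudocode asked for in the review can be sketched concretely in one dimension, where the result is easy to verify by hand. The two-element bar with EA/L = 1 below is an illustrative choice:

```python
import numpy as np

# Element stiffness of a 1D bar with EA/L = 1 (illustrative)
k_e = np.array([[ 1.0, -1.0],
                [-1.0,  1.0]])

# Connectivity: element -> global node numbers (two elements, three nodes)
connectivity = [(0, 1), (1, 2)]

K = np.zeros((3, 3))
for conn in connectivity:
    for a, A in enumerate(conn):       # local row a -> global row A
        for b, B in enumerate(conn):   # local col b -> global col B
            K[A, B] += k_e[a, b]

assert np.allclose(K, K.T)             # symmetry is preserved by assembly
assert K[1, 1] == 2.0                  # the shared node accumulates both elements
assert K[0, 2] == 0.0                  # unconnected nodes stay zero (sparsity)
print(K)
```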

6. 2D Isoparametric Elements
Lecture: isoparametric idea; 4/8-node quads; mapping (ξ,η)→(x,y); Jacobian; chain rule for [B]; Gauss quadrature
Preparation: (1.5h) 2D Jacobian (det/inverse) + chain rule. (0.5h) 1D Gauss rule and tensor product idea.
Review: (1.0h) Compute [J],|J| for a simple 4-node element. (0.5h) Steps for ∂Ni/∂x,∂Ni/∂y. (1.0h) Evaluate |J| and [B] at 2×2 points (sanity checks).
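The |J| and 2×2-point sanity checks listed above can be automated. A sketch for a 4-node quad on a 2×1 rectangle, an illustrative geometry where det[J] is constant, so 2×2 Gauss must reproduce the element area exactly:

```python
import numpy as np

def quad4_dN_dlocal(xi, eta):
    """Local derivatives of the 4-node quad shape functions
    N_i = (1/4)(1 + xi*xi_i)(1 + eta*eta_i); returns a (2, 4) array."""
    xi_i  = np.array([-1,  1, 1, -1])
    eta_i = np.array([-1, -1, 1,  1])
    dN_dxi  = 0.25 * xi_i  * (1 + eta * eta_i)
    dN_deta = 0.25 * eta_i * (1 + xi  * xi_i)
    return np.vstack([dN_dxi, dN_deta])

# Node coordinates of a 2x1 rectangle, counter-clockwise (illustrative)
coords = np.array([[0, 0], [2, 0], [2, 1], [0, 1]], dtype=float)

g = 1 / np.sqrt(3)                          # 1D Gauss points, weights = 1
area = 0.0
for xi in (-g, g):
    for eta in (-g, g):
        J = quad4_dN_dlocal(xi, eta) @ coords   # 2x2 Jacobian
        detJ = np.linalg.det(J)
        assert detJ > 0                         # non-degenerate mapping
        area += detJ                            # weight 1 * 1 at each point

assert np.isclose(area, 2.0)                    # exact area of the rectangle
print("area =", area)
```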

7. 3D Isoparametric Elements
Lecture: 8/20-node hex; mapping (ξ,η,ζ)→(x,y,z); 3×3 [J], 3D [B]; 3D Gauss; surface load basics
Preparation: (1.5h) 3×3 Jacobian and coordinate transformation. (0.5h) 3D Gauss as tensor product.
Review: (0.5h) Organize 8-node formulas (Ni, local derivatives). (1.0h) Workflow for [J],[J]^{-1} and spatial derivatives. (1.0h) Evaluate |J|,[B] at 2×2×2 points on simple geometry.

8. Exercise 3
Exercise: 1D quadratic—FEM vs exact du/dx (nodes, ξ=±1/√3); 20-node hex—derive selected Ni (N5,N6,N11)
Preparation: (1.0h) 1D quadratic mapping; du/dx via chain rule. (1.0h) 20-node node types and Ni structure.
Review: (1.0h) Compare FEM vs exact at nodes; interpret differences. (1.0h) Compare at reduced points; what is being checked. (0.5h) Verify derived Ni by nodal property/consistency.

9. FE Input Data Format
Lecture: nodes/elements/materials/BCs/loads; connectivity and DOF numbering; consistency checks
Preparation: (1.0h) Indexing rules used in assembly. (1.0h) Data handling basics (arrays/lists; text I/O).
Review: (1.0h) Create a small input set for a simple 2D model. (1.0h) Implement checks (range/duplicate/connectivity/orientation). (0.5h) List typical data errors and detection rules.

10. Regular Mesh Generation
Lecture: structured mesh; numbering/connectivity; basic mesh-quality notes
Preparation: (1.0h) Nested loops; (i,j,k)↔node ID mapping. (1.0h) Coordinates, spacing, boundary labeling.
Review: (1.0h) Implement structured mesh generator (nodes+elements). (1.0h) Validate counts/connectivity/boundaries; test resolutions. (0.5h) Summarize refinement vs accuracy/cost.
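The structured mesh generator in the review reduces to two nested loops plus an (i, j) → node-ID map. A minimal sketch for quads; the mesh sizes are illustrative:

```python
import numpy as np

def structured_quad_mesh(nx, ny, lx=1.0, ly=1.0):
    """Regular structured mesh of nx*ny quads on an lx*ly rectangle.
    Returns node coordinates and counter-clockwise element connectivity."""
    xs = np.linspace(0.0, lx, nx + 1)
    ys = np.linspace(0.0, ly, ny + 1)
    # Node ID from (i, j): id = i + j * (nx + 1), row-major numbering
    nodes = np.array([[x, y] for y in ys for x in xs])
    elems = []
    for j in range(ny):
        for i in range(nx):
            n0 = i + j * (nx + 1)                       # lower-left corner
            elems.append([n0, n0 + 1, n0 + nx + 2, n0 + nx + 1])
    return nodes, np.array(elems)

nodes, elems = structured_quad_mesh(3, 2)
assert nodes.shape == (12, 2)      # (nx+1)*(ny+1) nodes
assert elems.shape == (6, 4)       # nx*ny quads
assert elems.max() == 11           # connectivity stays within node range
print(elems)
```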

11. Exercise 4
Exercise: sinusoidal traction→equivalent nodal forces; degenerate quad→det[J] & derivative feasibility; 1D quadratic→extrapolation matrix [L]
Preparation: (1.0h) Jacobian role in derivatives/integration. (1.0h) Reduced integration in 1D; point evaluation.
Review: (1.0h) Derive equivalent nodal forces; check symmetry/units. (1.0h) Evaluate det[J]; explain consequence for derivatives. (0.5h) Build/verify [L] (reproduce linear fields).

12. Assembly and Solution of FE Systems
Lecture: assemble {F},[K]; displacement BCs (elimination/large-number); solve (direct LDU vs iterative PCG)
Preparation: (1.0h) Sparsity and storage ideas for [K]. (1.0h) LU/LDU and iterative residual concept.
Review: (1.0h) Connectivity-based assembly pseudocode (multi-DOF). (1.0h) Compare two BC methods on a toy system; verify equivalence. (0.5h) Solver choice vs size/sparsity.

13. Exercise 5
Exercise: mixed-element assembly (tri+quad; 2 DOF/node); convert symmetric [K] to profile and sparse-row formats
Preparation: (1.0h) Multi-DOF assembly indexing and block placement. (1.0h) Profile vs sparse-row (CSR-like) formats.
Review: (1.0h) Redo assembly on paper; confirm insertion locations. (1.0h) Convert and reconstruct; verify equality. (0.5h) Note DOF ordering effects on sparsity/storage.
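The format conversion of Exercise 5 can be illustrated with a CSR-like round trip. A sketch; the 3×3 banded matrix stands in for an assembled [K] and is illustrative only:

```python
import numpy as np

def dense_to_csr(K):
    """Convert a dense matrix to sparse-row (CSR-like) arrays:
    values, col_idx, row_ptr."""
    values, col_idx, row_ptr = [], [], [0]
    for row in K:
        for j, v in enumerate(row):
            if v != 0.0:
                values.append(v)
                col_idx.append(j)
        row_ptr.append(len(values))
    return np.array(values), np.array(col_idx), np.array(row_ptr)

def csr_to_dense(values, col_idx, row_ptr, n):
    """Reconstruct the dense matrix to verify the conversion."""
    K = np.zeros((n, n))
    for i in range(n):
        for p in range(row_ptr[i], row_ptr[i + 1]):
            K[i, col_idx[p]] = values[p]
    return K

K = np.array([[ 2., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  2.]])
vals, cols, ptr = dense_to_csr(K)
assert len(vals) == 7                                     # only nonzeros stored
assert np.allclose(csr_to_dense(vals, cols, ptr, 3), K)   # round-trip equality
print(vals, cols, ptr)
```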

14. Visualization of FE Models and Results
Lecture: visualization pipeline (geometry+fields); surface subdivision for higher-order elements; contour/colormap on triangles
Preparation: (1.0h) Field interpolation/evaluation on elements. (1.0h) Surface geometry (tangents, normals, cross product).
Review: (1.0h) Summarize steps: evaluate→subdivide→color map. (1.0h) Implement minimal triangular subdivision (uniform ok). (0.5h) Map values to vertices and visualize by color interpolation.
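The evaluate→subdivide→color-map steps above can be sketched without a graphics library by working in barycentric coordinates; the blue-to-red ramp below is a hypothetical colormap, not a library one:

```python
import numpy as np

def subdivide_triangle(n):
    """Barycentric coordinates of a uniform n-level triangle subdivision."""
    pts = []
    for i in range(n + 1):
        for j in range(n + 1 - i):
            pts.append((i / n, j / n, (n - i - j) / n))
    return np.array(pts)

def value_to_rgb(v, vmin, vmax):
    """Map a scalar to a simple blue->red ramp (illustrative colormap)."""
    t = np.clip((v - vmin) / (vmax - vmin), 0.0, 1.0)
    return np.array([t, 0.0, 1.0 - t])

# Nodal values at the three triangle corners (illustrative)
q = np.array([0.0, 1.0, 2.0])
bary = subdivide_triangle(4)
vals = bary @ q                    # linear interpolation from nodal data
colors = np.array([value_to_rgb(v, 0.0, 2.0) for v in vals])

assert len(bary) == 15                                   # (n+1)(n+2)/2 sub-vertices
assert np.isclose(vals.min(), 0.0) and np.isclose(vals.max(), 2.0)
assert colors.min() >= 0.0 and colors.max() <= 1.0       # valid RGB range
print(len(bary), "sub-vertices colored")
```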
教科書
/Textbook(s)
Lecture handouts/materials.
成績評価の方法・基準
/Grading method/criteria
Exercise (50%)
  - Assessment is based on submitted exercise solutions (Exercise 1–5).
  - The exercises evaluate practical skills required for FEM: derivation/verification of shape functions, element formulation, assembly procedures, numerical integration, and basic implementation/validation.
  - Each exercise is graded on correctness, clarity of derivation/explanation, and consistency checks (e.g., nodal property, symmetry, units, and reproducibility of results).

Project (50%)
  - Assessment is based on an individual project that integrates the course topics into a small FEM workflow.
  - The project evaluates the ability to (i) prepare input data, (ii) generate a mesh, (iii) assemble and solve a FE equation system, and (iv) visualize models and results.
  - The project is graded on technical correctness, implementation quality (robustness and organization), verification/validation of results, and clear documentation of assumptions and limitations.
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
Gennadiy Nikishkov, Programming Finite Elements in Java. Springer, 2010, 402 pp.



開講学期
/Semester
2026年度/Academic Year  3学期 /Third Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
矢口 勇一
担当教員名
/Instructor
矢口 勇一
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/05
授業の概要
/Course outline
This course provides a structured overview of computer vision and image processing for graduate students who plan to define and develop their research topics. Beyond basic undergraduate image processing, we cover the physical formation of images, feature representations, segmentation/clustering, and recognition/understanding models, including classical statistical methods and modern deep learning approaches. Through weekly hands-on exercises in Google Colab, students will implement representative algorithms, compare outputs, and report findings in a research-oriented style to build the foundation for reading and writing academic papers.
授業の目的と到達目標
/Objectives and attainment goals
By the end of the course, students will be able to:
- Explain key concepts in image formation, representation, and optics, and relate them to algorithm design.
- Implement and compare core feature extraction and segmentation techniques using Python-based tools.
- Select appropriate learning/recognition approaches (classical ML and deep learning) for a given visual task and justify design choices.
- Read, summarize, and critique academic papers in computer vision/image processing with correct technical terminology.
- Produce concise technical reports that include experimental design, results, and discussion.
授業スケジュール
/Class schedule
[Class Schedule]
Meeting 1
Lecture: Image physical properties & representation
(Image acquisition, optics, projection models, EM spectrum basics, color models)
Exercise: Colab warm-up: image I/O, color space conversions, basic photometric transformations
[Preparation & Review] See “Study time expectation” below.
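The Meeting 1 color-space warm-up can be previewed with plain NumPy; the grayscale weights below are the standard ITU-R BT.601 luma coefficients, and the 2×2 image is synthetic:

```python
import numpy as np

def rgb_to_gray(img):
    """Luma grayscale conversion with ITU-R BT.601 weights (values in [0, 1])."""
    w = np.array([0.299, 0.587, 0.114])
    return img @ w

# Synthetic 2x2 RGB image: white, red, green, blue pixels
img = np.array([[[1, 1, 1], [1, 0, 0]],
                [[0, 1, 0], [0, 0, 1]]], dtype=float)
gray = rgb_to_gray(img)

assert gray.shape == (2, 2)              # one intensity per pixel
assert np.isclose(gray[0, 0], 1.0)       # white keeps full intensity
assert np.isclose(gray[0, 1], 0.299)     # pure red maps to its luma weight
print(gray)
```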

Meeting 2
Lecture: Image features and descriptors
(Point/line/region; morphology; corners/blobs; histograms; SIFT/SURF/ORB and practical notes)
Exercise: Implement and compare detectors/descriptors; matching and visualization
Report 1 assigned (due in ~2 weeks): Feature extraction comparison

Meeting 3
Lecture: Image segmentation
(Gray-level/texture; region growing; watershed; graph cut/energy view; segmentation as classification)
Exercise: Compare multiple segmentation pipelines and parameter sensitivity
Report 2 assigned: Segmentation study

Meeting 4
Lecture: Image clustering & representation learning (mid-level)
(Model-based clustering; classifiers; dimensionality reduction; MDS; NN overview; BoW/BoVW)
Exercise: Clustering vs classification baseline + feature spaces (hand-crafted vs embedded)
Report 3 assigned: Clustering / representation analysis

Meeting 5
Lecture: Perception/recognition models
(Bayesian networks; belief update; CNN as recognition model; bridging probabilistic and deep models)
Exercise: Simple Bayesian reasoning / inference + a small CNN experiment (conceptual)
Report 4 assigned: Bayesian update / recognition model task
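The belief update in Meeting 5 is, at its core, the discrete Bayes rule: posterior ∝ prior × likelihood. A minimal sketch with hypothetical detector numbers, not course data:

```python
def bayes_update(prior, likelihood):
    """Posterior over hypotheses: normalize prior * likelihood (discrete Bayes rule)."""
    unnorm = [p * l for p, l in zip(prior, likelihood)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

# Hypothetical numbers: P(object present) = 0.01, detector fires with
# P(fire | object) = 0.9 and P(fire | no object) = 0.05
posterior = bayes_update([0.01, 0.99], [0.9, 0.05])

assert abs(sum(posterior) - 1.0) < 1e-12
assert 0.15 < posterior[0] < 0.16    # ~0.154: one detection is still weak evidence
print(posterior)
```

The point of the example: even a fairly reliable detector leaves the posterior low when the prior is small, which is exactly what repeated belief updates (or a CNN's calibrated scores) must account for.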

Meeting 6
Lecture: Image–video–space
(Optical flow; dynamic disparity; motion models; spatial models; stereo overview)
Exercise: Optical flow experiments; motion feature visualization and discussion
Report 5 assigned: Motion/flow analysis
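For a single window, the Meeting 6 optical-flow exercise reduces to a least-squares fit of the brightness-constancy equation Ix·u + Iy·v = −It. A sketch on a synthetic Gaussian blob with a known sub-pixel shift; all sizes and the shift are illustrative:

```python
import numpy as np

def gaussian_image(shift_x=0.0, shift_y=0.0, size=64, sigma=8.0):
    """Smooth synthetic image: a Gaussian blob, optionally translated."""
    y, x = np.mgrid[0:size, 0:size].astype(float)
    cx = cy = size / 2
    return np.exp(-((x - cx - shift_x) ** 2 + (y - cy - shift_y) ** 2)
                  / (2 * sigma ** 2))

I1 = gaussian_image()
I2 = gaussian_image(shift_x=0.6, shift_y=0.3)   # known sub-pixel motion

Iy, Ix = np.gradient(I1)                        # spatial gradients (axis 0 = y)
It = I2 - I1                                    # temporal difference

# Single-window Lucas-Kanade: solve [Ix Iy] [u v]^T = -It in least squares
A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
b = -It.ravel()
(u, v), *_ = np.linalg.lstsq(A, b, rcond=None)

assert abs(u - 0.6) < 0.15 and abs(v - 0.3) < 0.15   # recovers the known shift
print(f"estimated flow: ({u:.2f}, {v:.2f})")
```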

Meeting 7
Lecture: Vision and modern ML
(YOLO; ViT; “Mamba”-style sequence models as an emerging direction)
Exercise: Run/inspect a pre-trained detector (YOLO or equivalent) and analyze failure cases
Wrap-up: how to connect course topics to thesis planning and paper reading

[Study time expectation (2 credits/learning time consistency)]
To satisfy the standard learning-time expectation for a 2-credit graduate course, students are expected to spend approximately 570 minutes per meeting on out-of-class study on average (in addition to the 200 minutes in class). A typical breakdown is:
- Preparation (≈50 min): slide reading + code/notebook pre-reading
- Weekly Colab exercise and review (≈180 min): complete and extend the in-class exercise, rerun experiments, adjust parameters, summarize results, organize notes, and prepare for reports
- Note: Five report assignments require ≈480 minutes each and are assigned after Meetings 2–6.
Therefore, the workload is intentionally “peaked” around report weeks, while the average remains the target.
教科書
/Textbook(s)
- Main coursebook: Richard Szeliski, Computer Vision: Algorithms and Applications (recommended; not required to purchase)
- Course website: ELMS page
- Prerequisites: undergraduate-level image processing and basic programming in Python

成績評価の方法・基準
/Grading method/criteria
- Report assignments (5 reports): 80%
  > Each report evaluates: correctness of implementation, experimental design, comparison/analysis, clarity of writing.
- Weekly exercises (7 mini tasks): 20%
  > Completion + correctness + short interpretation/comments.
- No final exam is conducted.
履修上の留意点
/Note for course registration
- When faculty members are away on business trips for international conferences, etc., they will conduct remote classes via Zoom or similar platforms. Even in such cases, students must attend the designated classroom and have their attendance recorded by substitute instructors or teaching assistants.



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
愼 重弼
担当教員名
/Instructor
愼 重弼
推奨トラック
/Recommended track
先修科目
/Essential courses
None
更新日/Last updated on 2026/01/08
授業の概要
/Course outline
This course deals with the design, analysis, and development of methods for the classification or description of patterns, objects, signals, and processes. The main goal of this area is to develop advanced technology and paradigms for human activity pattern processing, and to foster the ability to create new ideas related to the topics covered. Many pattern recognition applications exist today, including online/offline pattern recognition, the use of pen-tablets, pattern processing, touch panels, RGB-D cameras, iOS/Android smart devices, and virtual reality. We focus on related issues in human activity pattern processing from three perspectives: Recognition, Authentication, and Synthesis.
授業の目的と到達目標
/Objectives and attainment goals
At the end of this course, students will be able to:
- Gain an overview of the field of pattern processing related to human activity.
- Learn how various techniques of human activity pattern processing can be applied in software.
授業スケジュール
/Class schedule
Introduction to human activity pattern processing
Fundamentals of online/offline pattern recognition
Pattern recognition involves human activity (HA)
Current problems and solving methods associated with the following topics:
- Non-touch Interface for Character Input
- Pen-based interactive systems
- Handwritten font generation
- Signature verification and writer identification system
- Brush painting systems
- HCI using calligraphy systems
- Gesture recognition using RGB-D, Leap motion, Myo controller, and web camera
- Disease diagnosis using pen-tablet
- Daily activity recognition using smartwatch and camera sensor
- Multichannel EEG signal analysis for brain computer interface (BCI)
- Design of experiments associated with human activity pattern processing
- HCI for smart and mobile devices
- Applications of image recognition and computer vision
The presentation of some application programs
Students' work:
- Investigation, presentation, research report, and discussion of current techniques and producing new ideas.
- Programming related to pattern processing.
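Several topics above (gesture recognition, signature verification, online handwriting) compare variable-length sequences, for which dynamic time warping (DTW) is a classic baseline. A minimal sketch, not any specific system from the course:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1D sequences, O(len(a)*len(b))."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # Allow match, insertion, or deletion along the warping path
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

ref = [0.0, 1.0, 2.0, 1.0, 0.0]                      # reference "gesture" (illustrative)
slower = [0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 1.0, 0.0]    # same shape, stretched in time

assert dtw_distance(ref, ref) == 0.0
assert dtw_distance(ref, slower) == 0.0   # warping absorbs the time stretch
print("DTW handles temporal stretching")
```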
教科書
/Textbook(s)
Many textbooks are available online. The instructor will provide selected topics from books and from various journal and conference papers; our goal in this course is to give you a broad perspective on the field.
成績評価の方法・基準
/Grading method/criteria
Investigation, presentation, and research report (40%)
Positive class participation (20%)
Programming project (40%)
履修上の留意点
/Note for course registration
Permission of the instructor.
Interest in the area of pattern processing.
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
Useful Links:
Course Web Site: http://web-int.u-aizu.ac.jp/~jpshin/GS/HAPP.html
References:
[1] Scott MacKenzie, Human-Computer Interaction: An Empirical Research Perspective (2013) ISBN-10: 0124058655
[2] Jonathan Lazar, Jinjuan Heidi Feng, Harry Hochheiser, Research Methods in Human-computer Interaction, Wiley; ISBN-10: 0470723378 (2010)
[3] Alan Dix, Janet E. Finlay, Gregory D. Abowd, Russell Beale, Human-Computer Interaction (2003)  ISBN-10: 0130461091



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ヴィジェガス オロズコ ジュリアン アルベルト
担当教員名
/Instructor
ヴィジェガス オロズコ ジュリアン アルベルト, 黄 捷
推奨トラック
/Recommended track
先修科目
/Essential courses
IT09 Sound and Audio Processing
ITA01 Digital Audio Effects
更新日/Last updated on 2026/01/29
授業の概要
/Course outline
The purpose of this course is to study the fundamentals of spatial hearing and its application to virtual environments. By using two ears, humans, among other species, are able to determine the direction from which a sound is emitted in a real environment. For virtual environments (e.g., movies, games, recorded or live concerts), it is desirable to provide the spatial cues found in nature to increase the realism of a scene. Besides reviewing the underlying theories of spatial hearing, this course focuses on practical implementations of binaural hearing techniques, so it is intensive in hands-on exercises, assignments, and projects, mainly based on the Pure Data programming language.
授業の目的と到達目標
/Objectives and attainment goals
• Be able to understand the basic mechanisms of spatial hearing, as well as the terminology of this topic.
• Be able to decide which of the presented techniques is best for creating the 3D aural illusion.
• Be able to implement virtual 3D sound environments based on headphones and multi-loudspeaker systems.
授業スケジュール
/Class schedule
1 Introductions
2 Spatial hearing and psychoacoustics
3 Lateralization
4 Lateralization II
5 Elevation cues
6 Distance cues
7 Room cues
8 Motion cues
9 Transfer functions
10 Head-Related Transfer Functions (HRTFs)
11 Loudspeaker techniques
12 Loudspeaker techniques II
13 Ambisonics
14 Recent developments
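The lateralization topics (3–4) revolve around interaural differences; a common closed-form approximation for the interaural time difference is the Woodworth spherical-head model, ITD(θ) = (a/c)(θ + sin θ). A sketch; the head radius below is a typical illustrative value:

```python
import numpy as np

def woodworth_itd(azimuth_deg, head_radius=0.0875, c=343.0):
    """Woodworth spherical-head ITD approximation for a far-field source.
    azimuth_deg: source azimuth from the median plane, 0..90 degrees."""
    theta = np.radians(azimuth_deg)
    return (head_radius / c) * (theta + np.sin(theta))

assert woodworth_itd(0.0) == 0.0              # median-plane source: no ITD
itd_90 = woodworth_itd(90.0)
assert 0.0006 < itd_90 < 0.0007               # ~0.66 ms at the side, a typical maximum
print(f"ITD at 90 deg: {itd_90 * 1e3:.2f} ms")
```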
教科書
/Textbook(s)
• Durand R. Begault, 3-D Sound for Virtual Reality and Multimedia, Academic Press, 2000.
• Jens Blauert, The Technology of Binaural Listening (Modern Acoustics and Signal Processing)
• Various materials prepared by the instructors
成績評価の方法・基準
/Grading method/criteria
Quizzes 40%
Assignments 60%
履修上の留意点
/Note for course registration
* This course uses Matlab and Pure-data for practical demonstrations. Some assignments must be completed in either of these languages as well.
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
Prof. Villegas has practical working experience. He worked as an Ikerbasque researcher for about three years at the phonetics laboratory of the University of the Basque Country.

• Bregman, Albert S., Auditory Scene Analysis: The Perceptual Organization of sound. Cambridge, Massachusetts: The MIT Press, 1990 (hardcover)/1994 (paperback).



開講学期
/Semester
2026年度/Academic Year  4学期 /Fourth Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ウィルソン イアン
担当教員名
/Instructor
ウィルソン イアン
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
This course introduces the mechanisms of speech articulation and how to measure them. It also investigates the mapping between articulation and acoustics. Articulation is investigated using tools such as ultrasound and video. Speech acoustics is investigated using Praat – open-source acoustic analysis software.
授業の目的と到達目標
/Objectives and attainment goals
After completing this course, students will be able to:
(1) describe how human speech is produced and how changes in articulation affect the acoustics of speech
(2) use an ultrasound machine to collect speech data
(3) analyze speech acoustics and write short scripts to automatically analyze acoustic data
(4) understand acoustic concepts such as speech waveforms, formants, and sine wave speech synthesis
授業スケジュール
/Class schedule
Each class meeting will consist of a lecture by the professor, as well as discussions about speech production, and some desk work by students using Praat software on computers.

Class 1: How speech is produced
Class 2: How articulation is measured
Class 3: Acoustic properties of speech sound classes
Class 4: Praat script writing
Class 5: Using Praat to synthesize vowels
Class 6: Using Praat to manipulate speech
Class 7: Ultrasound speech data collection and analysis (part 1)
Class 8: Ultrasound speech data collection and analysis (part 2)
Class 9: Voice Onset Time (VOT)
Class 10: Mapping of articulation to acoustics
Class 11: Spectrogram reading
Class 12: Visemes versus phonemes; face reading
Class 13: Phonetic variability - within and across speakers/languages
Class 14: Final Project

No Final Exam will be held. The Final Project serves that purpose instead.

[Preparation/Review] After each class, students are expected to spend 4-5 hours studying class material, doing readings, using Praat to record and analyze their speech, and writing scripts to automate speech analysis.
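Class 5's vowel synthesis is done in Praat, but the related idea of sine-wave speech (one sinusoid per formant, as touched on in the acoustic concepts) can be sketched in Python; the formant values below are typical textbook figures for the vowel /a/, used only for illustration:

```python
import numpy as np

def sine_wave_vowel(formants_hz, duration=0.5, fs=16000):
    """Crude sine-wave-speech-style vowel: one unit sinusoid per formant."""
    t = np.arange(int(duration * fs)) / fs
    signal = sum(np.sin(2 * np.pi * f * t) for f in formants_hz)
    return signal / len(formants_hz)        # keep amplitude within [-1, 1]

# Typical formant frequencies for /a/: F1, F2, F3 (illustrative values)
formants = (730.0, 1090.0, 2440.0)
fs = 16000
x = sine_wave_vowel(formants, fs=fs)

assert np.max(np.abs(x)) <= 1.0             # safe level for writing to a WAV file
spec = np.abs(np.fft.rfft(x))
for f in formants:
    k = int(round(f * len(x) / fs))         # FFT bin closest to each formant
    assert spec[k] > 0.3 * spec.max()       # each formant shows a clear spectral peak
print("synthesized", len(x), "samples")
```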
教科書
/Textbook(s)
Handouts and other materials will be made available on the course website in Moodle. Praat software will be used and is available on classroom computers. It can also be downloaded for free on your own computer. For class recordings, please use the classroom computer or make sure you have a good microphone for your own laptop computer.
成績評価の方法・基準
/Grading method/criteria
Grades will be awarded based on the following:

• Active participation in class: 40%
• Assignments (Praat script writing, etc.): 20%
• Final Project: 40%
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
Praat official website: Praat

CLR Phonetics Lab website: CLR Phonetics Lab

Office Hours: By appointment; please email the professor to set up an appointment.



開講学期
/Semester
2026年度/Academic Year  3学期 /Third Quarter
対象学年
/Course for
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
白 寅天
担当教員名
/Instructor
白 寅天, 矢口 勇一
推奨トラック
/Recommended track
先修科目
/Essential courses
1. NLP-IR
2. Linear Algebra
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
Natural Language Processing (NLP) is a rapidly developing field with broad applicability in computer science and various other applications. From linguistic and textual data we can extract very useful information, and the data can be used to create artificial intelligence (AI) applications such as language translators, several kinds of text generation, and chat systems. In this course, you will study a basic theoretical and methodological introduction to NLP and its application to information retrieval, text mining, and several AI-based language processors. We will also focus on strategies and toolkits for NLP and Deep Learning (DL), and on the principles of LLMs and their applications. Throughout this course, the sources, architectures, and tools we focus on will be introduced for students' own term projects.
授業の目的と到達目標
/Objectives and attainment goals
Students will obtain foundational knowledge of NLP methods and strategies. They will also learn the principles of neural language processing and its architectures for several AI applications, together with LLMs and their applications. In addition, they will learn how to evaluate the characteristics of NLP technologies and frameworks as they carry out practical exercises and a term project using available NLP and DL toolkits.

[Corresponding Learning Outcomes]
• Foundational NLP and Lexical Analysis: Students will be able to perform essential text preprocessing and lexical analysis by mastering core NLP techniques, including NLTK-based tokenization, tagging, and statistical language models like TF-IDF and N-Gram.
• Advanced Deep Learning Architectures: Students will demonstrate a comprehensive understanding of Transformer-based architectures, including BERT and GPT, and be able to implement model alignment using SFT and RLHF for specialized language tasks.
• LLM Optimization and Agentic Application: Students will be able to develop efficient and scalable AI systems by applying advanced optimization techniques—such as RAG, Quantization, and Distillation—and designing agentic applications through prompt engineering.

[Preparation/Review]
• Before each class session, read and understand the lecture materials (slides) and the corresponding sections in the exercise notebook. After class, review the lecture content, run the provided code, and think of ways to apply it.
• The standard out-of-class learning time for this course is 240 minutes per session, broken down as follows: 120 minutes for preparation (reading slides + running example code), and 120 minutes for review and practice (reviewing slides, applying example code, completing exercises and examples, and organizing notes).
• Report assignments will be given as needed and should be completed during the review and practice time.
授業スケジュール
/Class schedule
Session 1.
• Lecture: Introduction to NLP and DL - Python, Google Colaboratory, NLTK Library, Word Tokenizing

Session 2.
• Exercise: NLTK Pipeline Exercise - Text manipulation, Sentence segmentation, Tokenization, POS Tagging, Entity analysis, IR Application construction

Session 3.
• Lecture: Statistical Modeling for Language - Parsing (PSG, CFG, Dependency), Language Model (N-Gram, HMM), Document Model (BoW, TF-IDF, Word2Vec)
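The TF-IDF document model in Session 3 can be sketched in a few lines; the raw-count TF and log(N/df) IDF below are one common textbook variant, and the corpus is illustrative:

```python
import math

def tf_idf(corpus):
    """TF-IDF with raw-count TF and IDF = log(N / df); one textbook variant."""
    n_docs = len(corpus)
    df = {}
    for doc in corpus:
        for term in set(doc):               # document frequency counts each doc once
            df[term] = df.get(term, 0) + 1
    weights = []
    for doc in corpus:
        w = {term: doc.count(term) * math.log(n_docs / df[term]) for term in doc}
        weights.append(w)
    return weights

# Tiny illustrative corpus, already tokenized
corpus = [["nlp", "is", "fun"],
          ["nlp", "is", "useful"],
          ["math", "is", "fun"]]
w = tf_idf(corpus)

assert w[0]["is"] == 0.0            # a term in every document gets zero weight
assert w[0]["fun"] > 0.0            # rarer terms get positive weight
assert w[1]["useful"] > w[1]["nlp"] # the rarest term weighs the most
print(w[0])
```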

Session 4.
• Lecture: Recurrent Neural Networks and LSTM - NN, RNN, LSTM

Session 5.
• Lecture: Deep Models and Classical NLP - Bidirectional RNN, Seq2Seq, Attention, PyTorch Implementation, BiLSTM

Session 6.
• Exercise: RNN/LSTM for QnA system, BiLSTM implementation, IR Application implementation

Session 7.
• Lecture: Deep Learning Architectures for Language Model(I) - Language Preprocessing for LLM, Attention and Transformer
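The attention mechanism in Session 7 is usually presented as scaled dot-product attention, Attention(Q, K, V) = softmax(QK^T/√d_k)V. A NumPy sketch with illustrative tensor sizes:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V, as in the Transformer."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))   # 3 query tokens, d_k = 4 (illustrative sizes)
K = rng.standard_normal((5, 4))   # 5 key tokens
V = rng.standard_normal((5, 2))   # values with d_v = 2

out, w = scaled_dot_product_attention(Q, K, V)
assert out.shape == (3, 2)                  # one d_v vector per query token
assert np.allclose(w.sum(axis=-1), 1.0)     # each query's weights form a distribution
print(out)
```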

Session 8.
• Lecture: Deep Learning Architectures for Language Model(II) - BERT, GPT, and LLM

Session 9.
• Lecture: Deep Learning Application for Neural Language - Translation, Code Generation, Sentence Generation

Session 10.
• Lecture: Alignment of Large Language Model with SFT & RLHF

Session 11.
• Lecture: Application of Aligned Large Language Model and Prompt Engineering

Session 12.
• Lecture: Principle of LLM and Its Agentic Application

Session 13.
• Lecture: Efficient Algorithms for LLM - Distillation, Quantization, RAG, Mixture of Experts, Reinforcement Learning

Session 14.
• Lecture: Term Project Presentation
教科書
/Textbook(s)
- A lecturer will provide necessary materials.
成績評価の方法・基準
/Grading method/criteria
- Term Project: 100%
履修上の留意点
/Note for course registration
- There can be homework such as pre-reading or material preparation during lectures.
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
- It will be introduced on the Moodle lecture Web page.



開講学期
/Semester
2026年度/Academic Year  3学期 /Third Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
出村 裕英
担当教員名
/Instructor
小川 佳子, 本田 親寿, 山田 竜平, 出村 裕英, 山本 圭香, 岡山大学講師
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/17
授業の概要
/Course outline
This course focuses on the development of hardware instruments, including a rover and its control system, for lunar and planetary exploration. The envisioned main target is the Moon. The course follows an omnibus format and consists of classroom lectures and practices. The practices consist of maneuvering a rover and obtaining/processing data from instruments onboard the rover.
授業の目的と到達目標
/Objectives and attainment
goals
To learn the development of hardware instruments and control systems for landing missions.
To learn basic knowledge of space development as a topic of computer science and engineering.
To practice maneuvering a rover and obtaining/processing instrument data.
授業スケジュール
/Class schedule
This is a lecture-based course with group exercise, providing 30 hours of class time (2 credits) per quarter, along with approximately 60 hours of pre- and post-study sessions, which may vary depending on individual progress and achievement.

Q3: Wed 1-2 & 7-8 periods (partly cancelled for RTF experiments, etc.) + one day experiment in RTF, Minami-Soma on Nov. 26 (Extra Day)

Tentative schedule in AY2025.
<Time Table>
Lecture/Exercise@UoA
#1-4 Prof. Ohtake "Introduction of General Space Probe"
#5-6 Prof. Ogawa "Data in Exploration Programs"
#7-10 Prof. Yamada "Preparations for RTF Practice"
#11-14 Prof. Honda "Path-finding"

Practice@Fukushima Robot Test Field
#15-24 TBD (one day)

Final Presentation and Wrap-up
#25-28
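The path-finding block (#11-14) can be previewed with the simplest complete algorithm: breadth-first search on an occupancy grid, which returns a shortest 4-connected path. A sketch on a hypothetical map, not the RTF course material:

```python
from collections import deque

def bfs_path(grid, start, goal):
    """Shortest 4-connected path on a 0/1 occupancy grid (1 = obstacle), or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:          # reconstruct the path by walking predecessors
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None                   # goal unreachable

# Hypothetical 4x4 map with a wall; 0 = free, 1 = obstacle
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 0]]
path = bfs_path(grid, (0, 0), (3, 0))
assert path is not None and path[0] == (0, 0) and path[-1] == (3, 0)
assert len(path) == 10            # BFS guarantees a shortest path
print(path)
```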
教科書
/Textbook(s)
N/A
成績評価の方法・基準
/Grading method/criteria
Presentation and report on the practice.
履修上の留意点
/Note for course registration
Prerequisite: N/A
Related courses:
ITC09 Fundamental Data Analysis with Lunar and Planetary Database
ITC10 Practical Data Analysis with Lunar and Planetary Databases
SEA11 Software Engineering for Space Programs
参考(授業ホームページ、図書など)
/Reference (course website, literature, etc.)
This course is supported by the "FY2022-24 Coordination Funds for Promoting AeroSpace Utilization," MEXT, Japan.



開講学期
/Semester
2026年度/Academic Year  3学期 /Third Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
陳 文西
担当教員名
/Instructor
陳 文西
推奨トラック
/Recommended track
先修科目
/Essential courses
Some basic knowledge of biosignals, probability and statistics, discrete mathematics, linear algebra, and digital signal processing is required.
更新日/Last updated on 2026/01/21
授業の概要
/Course outline
Biosignal enhancement, feature extraction, and physiological interpretation are important aspects of the biomedical engineering field. Various biosignals can be manipulated through proper decomposition, transformation, representation, classification, optimization, and visualization.
This course introduces fundamental concepts and approaches, such as filtering, detection, estimation, and data mining, for various biosignals in the temporal, frequency, and nonlinear domains. It provides students with a broad picture of biosignals from detection to analysis, and from physiological significance to clinical application, following the course "Introduction to Biosignal Detection".
授業の目的と到達目標
/Objectives and attainment
goals
1. To understand how to apply statistical mathematics and digital signal processing methods to deal with various biosignals.
2. To understand how to utilize fundamental approaches of signal processing and data mining in the biomedical information engineering field.
授業スケジュール
/Class schedule
1. Introduction
2. Decomposition and Reconstruction of Biosignals
3. Detection of Biosignatures
4. Processing of Biosignals and Biosignatures
5. Analysis of HRV in Time Domain
6. Analysis of HRV in Frequency Domain
7. Analysis of HRV in Nonlinear Domain
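
As a taste of the time-domain HRV analysis in session 5, the two standard metrics SDNN and RMSSD can be computed from a list of RR intervals in a few lines. The sketch below is illustrative only; the RR values are synthetic, not course data:

```python
import numpy as np

def hrv_time_domain(rr_ms):
    """Two standard time-domain HRV metrics from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    sdnn = rr.std(ddof=1)                 # SDNN: sample std of all RR intervals
    diffs = np.diff(rr)
    rmssd = np.sqrt(np.mean(diffs ** 2))  # RMSSD: RMS of successive differences
    return sdnn, rmssd

rr = [812, 790, 805, 830, 798, 815, 801]  # synthetic RR intervals (ms)
sdnn, rmssd = hrv_time_domain(rr)
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```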
教科書
/Textbook(s)
 Biomedical Signal Processing and Signal Modeling, Eugene N. Bruce, ISBN: 978-0-471-34540-4, December 2000, Wiley
https://www.wiley.com/en-jp/Biomedical+Signal+Processing+and+Signal+Modeling-p-9780471345404

 Practical Biomedical Signal Analysis Using MATLAB (Series in Medical Physics and Biomedical Engineering), Katarzyna J. Blinowska and Jaroslaw Zygierewicz, CRC Press; 1st edition (September 12, 2011), ISBN-10: 1439812020, ISBN-13: 978-1439812020
https://www.crcpress.com/Practical-Biomedical-Signal-Analysis-Using-MATLAB/Blinowska-Zygierewicz/p/book/9781439812020

 Seamless Healthcare Monitoring - Advancements in Wearable, Attachable, and Invisible Devices, Editors: Tamura, Toshiyo, Chen, Wenxi, Springer International Publishing, 2018, DOI 10.1007/978-3-319-69362-0, eBook ISBN 978-3-319-69362-0, Hardcover ISBN 978-3-319-69361-3
https://www.springer.com/us/book/9783319693613
成績評価の方法・基準
/Grading method/criteria
A summary report compiling a series of assignments: 100%
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
The course instructor has practical working experience: he worked for five years at Nihon Kohden Industrial Corp., a manufacturer of world-famous medical equipment, where he was engaged in R&D for bioinstrumentation, signal processing, and data analysis. Based on this experience, he will teach fundamental knowledge and the latest advancements in "Biosignal Processing and Data Mining".

Moodle for course handouts and other related information
https://elms.u-aizu.ac.jp/login/index.php



開講学期
/Semester
2026年度/Academic Year  4学期 /Fourth Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
白 寅天
担当教員名
/Instructor
白 寅天
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
The Semantic Web is the second wave of Web technology, and its environment evolves from human-readable to machine-readable. The key technology of the Semantic Web is the knowledge representation technique of ontology, and its management.
The main focus of this course is Semantic Web service technology: ontology, its learning and engineering, and its application to Web services. The background of Web evolution, ontology for knowledge representation, Web services, and application to service composition will be covered.
If you are interested in the areas of Semantic Web service (SWS) technology, please e-mail me (paikic@u-aizu.ac.jp) or visit my office (307-C).
授業の目的と到達目標
/Objectives and attainment
goals
[Corresponding Learning Outcomes]
• Knowledge Representation and Modeling: Students will be able to design and implement structured knowledge bases by mastering RDF and OWL, utilizing professional tools like Protégé to model complex domains.
• Reasoning and Rule-based Intelligence: Students will demonstrate the ability to enhance web intelligence by applying Semantic Web Rule Language (SWRL) and exploring ontology learning and matching techniques for data integration.
• Semantic Service Frameworks: Students will be able to architect advanced web services by understanding semantic service frameworks (OWL-S, WSMO) and applying ontology engineering principles to real-world projects.

[Preparation/Review]
• Before each class session, read and understand the lecture materials (slides) and the corresponding sections in the exercise notebook. After class, review the lecture content, run the provided code, and think of ways to apply it.
• The standard out-of-class learning time for this course is 240 minutes per session, broken down as follows: 120 minutes for preparation (reading slides + running example code), and 120 minutes for review and practice (reviewing slides, applying example code, completing exercises and examples, and organizing notes).
• Report assignments will be given as needed and should be completed during the review and practice time.
授業スケジュール
/Class schedule
Session 1.
• Lecture: Introduction to Web Technologies and Semantic Web

Session 2.
• Lecture: Resource Description Framework (RDF) and DAML+OIL

Session 3.
• Lecture: Ontology Language - OWL (I. Basic Concept)

Session 4.
• Lecture: Ontology Language (OWL) (II. Details of OWL)

Session 5.
• Lecture: Semantic Web Rule Language

Session 6.
• Exercise: Ontology Design Exercise in OWL (Using Protégé)

Session 7.
• Lecture: Rule Design in SWRL

Session 8.
• Exercise: Rule Design Exercise in SWRL (Using Protégé)

Session 9.
• Lecture: Ontology Learning by Text Mining

Session 10.
• Lecture: Ontology Matching and Merging

Session 11.
• Lecture: Ontology Engineering

Session 12.
• Lecture: Semantic Web Service Frameworks (OWL-S and BPEL)

Session 13.
• Lecture: Semantic Web Service Frameworks (WSMO)

Session 14.
• Lecture: Paper and Term Project Presentation
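
To make the flavor of the ontology and rule sessions concrete, here is a toy illustration of triple-based knowledge and a SWRL-style rule, using plain Python tuples instead of a real RDF store and OWL reasoner; all names in it are made up for the example:

```python
# RDF-style facts as (subject, predicate, object) triples.
triples = {
    ("alice", "hasParent", "bob"),
    ("bob",   "hasParent", "carol"),
}

# SWRL-flavoured rule: hasParent(x, y) ^ hasParent(y, z) -> hasGrandparent(x, z)
def infer_grandparents(kb):
    inferred = set()
    for (x, p1, y) in kb:
        for (y2, p2, z) in kb:
            if p1 == p2 == "hasParent" and y == y2:
                inferred.add((x, "hasGrandparent", z))
    return inferred

print(infer_grandparents(triples))
```

A production system would express the facts in RDF/OWL and run the rule in a reasoner via Protégé's SWRL support; the loop above just shows the underlying pattern matching.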
教科書
/Textbook(s)
Lecture Slides will be provided on lecture Web site.
成績評価の方法・基準
/Grading method/criteria
1. Examination    --- 50%
2. Paper Presentation & Term Project --- 50%

履修上の留意点
/Note for course registration
* Prerequisites:
- JAVA Programming I & II
- Web Programming
- Artificial Intelligence
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
* Reference
1) J. Davies, R. Studer, P. Warren, Semantic Web Technologies, Wiley, 2007.
2) A. Gomez-Perez, M. Fernandez-Lopez, O. Corcho, Ontological Engineering, Springer, 2004.
3) J. Davies, D. Fensel, F.V. Harmelen, Towards The Semantic Web: Ontology-Driven Knowledge Management, Wiley, 2003.
4) M.C. Daconta, L.J. Obrst, K.T. Smith, The Semantic Web, Wiley, 2003.




開講学期
/Semester
2026年度/Academic Year  2学期 /Second Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
マルコフ コンスタンティン
担当教員名
/Instructor
マルコフ コンスタンティン
推奨トラック
/Recommended track
先修科目
/Essential courses
This course is given in English.
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
Machine learning is one of the fastest-growing and most exciting fields of AI, and deep learning represents its true bleeding edge. Deep learning is one of the most highly sought-after skills in the IT industry. In this course, students will learn the foundations of deep learning, understand how to build neural networks, and learn how to carry out successful machine learning projects. The course teaches students how to train and optimize basic neural networks (NN), convolutional neural networks (CNN), recurrent neural networks (RNN, LSTM), autoencoders (AE), etc. Complete learning systems will be introduced via projects and assignments.
授業の目的と到達目標
/Objectives and attainment
goals
Students will learn to solve new classes of problems that were once thought prohibitively challenging, and come to better appreciate the complex nature of human intelligence as they solve these same problems effortlessly using deep learning methods. Students will master not only the theory, but also see how it is applied in practical case studies from various fields such as image recognition, music generation, natural language processing, etc.
授業スケジュール
/Class schedule
Session 1
Lecture: Introduction and Background.
              - Course introduction.
              - Basic probability theory and statistics.
Exercise: Python programming basics
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 2
Lecture: Machine Learning and Neural Networks
             - Machine Learning fundamentals.
             - Neural Networks fundamentals.
Exercise: Vanilla Neural Network programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 3
Lecture: Deep Neural Networks basics I.
              - Training – Back Propagation.
              - Regularization and Normalization.
Exercise: Feed Forward NN training programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 4
Lecture: Deep Neural Networks basics II.
              - Loss functions, Optimizations.
Exercise: Feed Forward NN training programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 5
Lecture: Feed-Forward DNN Applications.
              - DNN classification and regression.
Exercise: Feed Forward NN classification system programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 6
Lecture: Convolutional Neural Networks (CNN).
             - Translation invariance.
             - Templates and filters.
Exercise: Convolutional layer programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 7
Lecture: CNN Applications.
             - CNN for vision  – VGG, Inception.
             - CNN for signal and text processing.
Exercise: CNN system programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)


Session 8
Lecture: Recurrent Neural Networks (RNN).
             - LSTM, GRU variants.
             - Sequence and time series data modeling with RNN.
Exercise: RNN layer programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 9
Lecture: RNN Applications.
            - RNN in Natural Language Processing.
            - RNN for sequence generation.
Exercise: RNN application programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 10
Lecture: Sequence-to-Sequence models (Seq2Seq)
             - Attention mechanism.
             - Word embeddings.
             - Seq2Seq for Language Translation.
Exercise: Text translation system programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)


Session 11
Lecture: Autoencoders (AE)
             - Denoising AE.
             - Variational AE.
             - AE for Dimensionality Reduction.
Exercise: Autoencoder system programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 12
Lecture: Advanced DNN models.
             - Transformer, BERT, GPT-2, LLMs
Exercise: Transformer programming
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)


Session 13
Lecture: Prompt Engineering.
             - RAG, AI Agents.
Exercise: Course project presentations
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)

Session 14
Lecture: DNN training strategies.
              - Tips and tricks.
Exercise: Course project presentations
Preparation/Review: Study lecture material, preparation (2h), review (1h), exercise assignment (1h)
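
As a preview of Sessions 2-4 (forward pass, loss, backpropagation, optimization), the sketch below trains a one-hidden-layer network on XOR by plain gradient descent in NumPy. It is a minimal illustration under assumed hyperparameters, not the course's assignment code:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR targets

W1 = rng.normal(0.0, 1.0, (2, 16)); b1 = np.zeros(16)
W2 = rng.normal(0.0, 1.0, (16, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)            # forward pass: hidden layer
    p = sigmoid(h @ W2 + b2)            # forward pass: output probability
    dz2 = (p - y) / len(X)              # grad of mean cross-entropy w.r.t. output pre-activation
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = (dz2 @ W2.T) * (1 - h ** 2)    # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1      # gradient-descent update
    W2 -= lr * dW2; b2 -= lr * db2

preds = (sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int)
print(preds.ravel())  # should recover the XOR labels
```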
教科書
/Textbook(s)
I. Goodfellow,Y. Bengio and A. Courville, Deep Learning, MIT Press. Online version: http://www.deeplearningbook.org

T. Hope, Y. Resheff and I. Lieder, Learning Tensorflow: A Guide to Building Deep Learning Systems, Oreilly.

F. Chollet, Deep Learning With Python, Manning Pubs.
成績評価の方法・基準
/Grading method/criteria
Laboratory exercises: 60 points
Project: 40 points
履修上の留意点
/Note for course registration
As this is an intermediate- to advanced-level course, the following experience and skills are desirable:
- Programming experience (preferably in Python)
- Basic machine learning knowledge (especially supervised learning)
- Basic statistics knowledge (mean, variance, etc.)
- Linear algebra (vectors, matrices, etc.)
- Calculus (differentiation, integration, partial derivatives, etc.)

Prior to enrolling in this course, it is recommended (but not required) to take the following related courses:
- ITC12F Machine Learning
- CSA01 Neural Networks I: Fundamental Theory and Applications
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
https://elms.u-aizu.ac.jp/



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
イリチュ ピーター
担当教員名
/Instructor
イリチュ ピーター
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/05
授業の概要
/Course outline
Today a comprehensive understanding of the intersection between learning theory and Information and Communication Technology (ICT) is both critical and timely. This course focuses on the convergence of learning theory and ICT in education, particularly examining their interplay. It will delve into the challenges researchers and educators face as emerging technologies reshape the educational landscape. Emphasizing a blend of theoretical insights and practical applications, the course provides invaluable knowledge for navigating education in a technologically advanced era. Throughout the course, students will engage in rigorous readings and dynamic discussions, reflecting on learning theories, teaching practices, and pedagogical approaches in the context of ICT’s opportunities. The curriculum is particularly relevant for students aspiring to contribute to educational software development or to integrate technology into teaching. This emphasis on readings and discussions will deepen students' understanding of the complex relationship between educational theories and technological advancements.
授業の目的と到達目標
/Objectives and attainment
goals
1. Develop knowledge of Online Educational Technologies.
2. Develop understanding of key Learning Theories influencing educational technology.
3. Develop a critical understanding of the limits of online technologies in education.
4. Further develop English discussion and argumentative skills.
授業スケジュール
/Class schedule
Section One:
1. Introduction to Learning Theory and ICT (Lecture, Seminar)
        i. Homework: Reading/Quiz 1
2. History of Learning Theory and ICT
        i. Homework: Reading/Quiz 2

Section Two:
1. Behaviorist Learning Theory I (Lecture, Seminar)
        i. Homework: Reading/Quiz 3
2. Behaviorist Learning Theory II
        i. Homework: Reading/Quiz 4

Section Three:
1. Cognitivist Learning Theory I (Lecture, Seminar)
        i. Homework: Reading/Quiz 5
2. Cognitivist Learning Theory II
        i. Homework: Reading/Quiz 6

Section Four:
1. Constructivist Learning Theory I (Lecture, Seminar)
        i. Homework: Reading/Quiz 7
2. Constructivist Learning Theory II
        i. Homework: Reading/Quiz 8

Section Five:
1. Connectivism/Others  (Lecture, Seminar)
        i. Homework: Reading/Quiz 9
2. Connectivism II/Collaborativist II
        i. Homework: Reading/Quiz 10
教科書
/Textbook(s)
No textbook will be used. Course material will be made available on Moodle.
成績評価の方法・基準
/Grading method/criteria
20% Active Participation
20% Online Quizzes:
20% 1st Response Paper:
20% 2nd Response Paper:
20% 3rd Response Paper:

Late assignments will lose 10% per day.
After 5 days, a late assignment will receive a mark of 0%.
履修上の留意点
/Note for course registration
In-class discussion participation is included in grading.
English language proficiency is required (TOEIC 600+, or with permission from instructor).
Attendance will be recorded.




開講学期
/Semester
2026年度/Academic Year  4学期 /Fourth Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ナッサーニ アラディン
担当教員名
/Instructor
ナッサーニ アラディン, ヴィジェガス オロズコ ジュリアン アルベルト
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/04
授業の概要
/Course outline
The aim of this course is to equip graduate students with advanced knowledge and practical skills in extended reality (XR), which encompasses augmented reality (AR), virtual reality (VR), and mixed reality (MR). Through a comprehensive curriculum covering advanced 3D modeling, sophisticated VR and AR development, and cutting-edge topics such as machine learning integration and brain-computer interfaces, students gain hands-on experience in developing and optimizing XR applications. The course also emphasizes research proficiency, including literature reviews, user-study evaluation, and an understanding of human-computer interaction (HCI) and UX design in XR. By the end of the course, students will be able to contribute innovatively to the evolving field of XR, both academically and professionally.
授業の目的と到達目標
/Objectives and attainment
goals
[Corresponding Learning Outcomes]
(A) Graduates are aware of their professional and ethical responsibilities as an engineer, and are able to analyze societal requirements, and set, solve, and evaluate technical problems using information science technologies in society.
(C) Graduates are able to apply their professional knowledge of mathematics, natural science, and information technology, as well as the scientific thinking skills such as logical thinking and objective judgment developed through the acquisition of said knowledge, towards problem solving.

[Competency Codes] C-HI-001, C-HI-002, C-HI-004, C-GV-004, C-GV-005, C-GV-007

By the end of this course, students will be able to:

- Apply advanced techniques in 3D modeling and procedural generation for XR applications.
- Develop and optimize complex VR and AR experiences.
- Design user-centered XR interfaces informed by HCI and UX principles.
- Conduct scientific literature reviews and user studies to evaluate XR technologies.
- Integrate haptics, multisensory feedback, networking, and cloud computing into XR projects.
- Explore biosensing and understand its potential and ethical implications.
- Critically evaluate XR research and identify opportunities for innovation.
授業スケジュール
/Class schedule
[Course Content and Methods]

Lectures/Exercises: Classes combine advanced theoretical discussion, critical analysis of research papers (reading seminars), and technical workshops on XR development.

Project Work: Students work individually and in groups to develop prototypes, conduct user studies, and analyze data.

[Class Schedule]

01. Welcome and Introduction to XR
02. Paper Reading and Literature Review for XR
03. Advanced VR Development
04. Advanced AR Development
05. Project 1 deadline: Literature Review
06. Human-Computer Interaction and UX Design in XR
07. Research Methods and User-Study Evaluation
08. Project 2 deadline: Prototype Development
09. Presence, Haptics, and Multisensory Feedback
10. Telepresence, Live Streaming, 360 Video, and 3D Point Clouds
11. Co-presence, Networking, and Cloud Computing in XR
12. Project 3 deadline: User Study
13. Empathy and Biosensing in XR
14. Machine Learning and AI Integration in XR
15. Final Exam - Project 4 Presentations

[Preparation/Review]

Preparation: Read the assigned academic papers and research the technical documentation for upcoming development topics in advance.

Review/Assignments: Work on the four main projects (literature review, prototype, user study, and final report); these require a substantial amount of independent development and data analysis.

Out-of-class study time: 5-6 hours per session (calculated on the basis of 2 credits = 90 total study hours).
教科書
/Textbook(s)
Lecture notes prepared by the instructor, TAs, and SAs.

The VR Book: Human-Centered Design for Virtual Reality
Jason Jerald
978-1970001129

Research Methods in Human-Computer Interaction
Jonathan Lazar
978-0470723371
成績評価の方法・基準
/Grading method/criteria
20% Project 1 (Meeting 5): Literature review on a chosen XR topic

20% Project 2 (Meeting 8): Development of an advanced AR/VR technology prototype

20% Project 3 (Meeting 12): User study, data analysis, and discussion

40% Project 4 - Final Exam (Meeting 15): Implementation of findings from the user study, final report, and final presentation
履修上の留意点
/Note for course registration
This course builds on knowledge gained in the undergraduate course IT06: Human Interface & Virtual Reality; however, that course is not a required prerequisite.
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
VR Development Pathway - Unity Learn
https://learn.unity.com/pathway/vr-development

Mobile AR Development Pathway - Unity Learn
https://learn.unity.com/pathway/mobile-ar-development

Unity Development for Magic Leap 2
https://learn.unity.com/course/magic-leap-2-development

ML-Agents: Hummingbirds - Unity Learn
https://learn.unity.com/course/ml-agents-hummingbirds

Artificial Intelligence for Beginners - Unity Learn
https://learn.unity.com/course/artificial-intelligence-for-beginners

Buried Memories: High-Fidelity Game Visuals - Unity Learn
https://learn.unity.com/course/buried-memories-high-fidelity-game-visuals

The instructors have practical experience with VR and AR, including development, research, and education.



開講学期
/Semester
2026年度/Academic Year  4学期 /Fourth Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ファヨール ピエール アラン
担当教員名
/Instructor
ファヨール ピエール アラン, 西舘 陽平
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/01/09
授業の概要
/Course outline
This course provides an introduction to numerical geometry processing using Java as a programming language. We give a presentation of the Java 2D API (which is part of any Java SDK) for 2D processing and rendering, then give a presentation of the OpenGL API via its Java bindings for 2D and 3D rendering.
Armed with this knowledge, we look at curve and surface modeling, first from a continuous point of view, then from a discrete point of view, with computer implementations in mind.
Finally, we provide an overview of the 3D modeling pipeline (acquisition, alignment, reconstruction) and look at several methods for processing digital shapes.
授業の目的と到達目標
/Objectives and attainment
goals
The main objectives of the course are:
• The study of common techniques in graphics programming and their connections with the Java 2D API and the OpenGL API (via its Java bindings)
• The study of common techniques in numerical geometry processing (curves and surfaces modeling, the 3D modeling pipeline, digital shape processing techniques)
• The development of graphics and geometry processing programs in Java (using the Java 2D and OpenGL API)
授業スケジュール
/Class schedule
1. Course introduction; Smooth curves
2. Discrete curves
3. Java 2D introduction and geometry
4. Java 2D (rendering)
5. OpenGL bindings (core mode; shaders)
6. OpenGL bindings continued
7. Project 1 presentations; 3D pipeline  
8. 3D pipeline, alignment  
9. Surface reconstruction
10. Triangle mesh representations, simplification
11. Simplification, subdivision
12. Shape parameterization
13. Spectral methods
14. Project 2 presentations; Spectral methods (continued)

[Preparation/Review] Before each class, students should prepare by studying the lecture materials and corresponding readings for the content indicated in the course plan. The course projects need to be completed outside of the classes. The typical preparation/review time per session is around 4 hours.
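
As a small taste of the discrete-curve material in sessions 1-2: the curvature of a polyline can be estimated from the turning angle at each vertex. The course work itself is in Java; the sketch below uses Python/NumPy only to keep the illustration short, and the circle test case is an assumption of this example:

```python
import numpy as np

def discrete_curvature(pts):
    """Signed discrete curvature at each vertex of a closed polyline."""
    pts = np.asarray(pts, dtype=float)
    prev = np.roll(pts, 1, axis=0)
    nxt = np.roll(pts, -1, axis=0)
    e1 = pts - prev                                     # incoming edge at each vertex
    e2 = nxt - pts                                      # outgoing edge
    cross = e1[:, 0] * e2[:, 1] - e1[:, 1] * e2[:, 0]   # z-component of cross product
    angle = np.arctan2(cross, (e1 * e2).sum(axis=1))    # signed turning angle
    avg_len = 0.5 * (np.linalg.norm(e1, axis=1) + np.linalg.norm(e2, axis=1))
    return angle / avg_len

# Sanity check: points sampled on a circle of radius 2 should give curvature ~1/2.
t = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
k = discrete_curvature(np.column_stack([2.0 * np.cos(t), 2.0 * np.sin(t)]))
print(k.mean())
```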
教科書
/Textbook(s)
Slides, reading materials and code will be provided by the instructors.
成績評価の方法・基準
/Grading method/criteria
Two projects: each of them has a weight of 50%.
履修上の留意点
/Note for course registration
Knowledge of Java programming, as well as some basic knowledge of graphics programming are expected.
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
Course website (internal)
https://web-int.u-aizu.ac.jp/~fayolle/teaching/java_2d_3d/index.html



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ヴィジェガス オロズコ ジュリアン アルベルト
担当教員名
/Instructor
ヴィジェガス オロズコ ジュリアン アルベルト, ナッサーニ アラディン
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/01/29
授業の概要
/Course outline
This course provides a comprehensive introduction to sound, audio, and digital signal processing. It focuses on equipping graduate students with the skills needed to process digital audio, either as input for other computations such as machine learning and data mining, or for the analysis and synthesis of sonic phenomena such as music and speech. We review basic concepts of sound and audio, and time-domain, frequency-domain, and time-frequency representations of audio. These representations are used for feature extraction, signal filtering, and signal enhancement. The course also covers useful basic audio effects such as dynamic range compression (DRC), noise reduction, time-scale modification, and pitch shifting.
授業の目的と到達目標
/Objectives and attainment
goals
Students who complete this course will gain basic knowledge of sound and audio and the confidence to apply those principles to commonly encountered situations in the development of audio software.
Concrete Objectives:
• Students will be able to understand the basic techniques employed in digital audio processing, as well as the literature and terminology on this topic.
• Students will be able to apply digital audio processing techniques to extract features from music and speech.
• Students will design and implement Python-based algorithms for real-world audio applications.
• By the end of the course, students will be able to implement audio effects, music classification, and speech enhancement applications in Python.
• Students will evaluate and compare different feature extraction techniques for specific tasks like speech recognition or musical instrument identification.
授業スケジュール
/Class schedule
Lecture 1 Basic Concepts of Sound and Audio
Lecture 2 Time-Domain Analysis
Lecture 3 Audio Transformations
Lecture 4 Frequency-Domain Analysis
Lecture 5 Short-Time Fourier Transform (STFT) and Spectrograms
Lecture 6 Psychoacoustics
Lecture 7 Pitch-related methods
Lecture 8 Hilbert Transform
Lecture 9 Digital Filtering Basics
Lecture 10 Linear effects
Lecture 11 Non-linear effects
Lecture 12 Time-scale modifications
Lecture 13 Linear filtering
Lecture 14 Sonification
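
The STFT of Lecture 5 is short enough to hand-roll. The sketch below (a minimal illustration, not the course's reference code; the frame sizes and the 440 Hz test tone are assumptions of this example) windows the signal, takes an FFT of each frame, and locates the spectral peak:

```python
import numpy as np

def stft(x, n_fft=512, hop=128):
    """Magnitude-ready STFT: Hann-windowed frames -> one-sided FFT per frame."""
    win = np.hanning(n_fft)
    n_frames = 1 + (len(x) - n_fft) // hop
    frames = np.stack([x[i * hop : i * hop + n_fft] * win for i in range(n_frames)])
    return np.fft.rfft(frames, axis=1)      # shape: (n_frames, n_fft // 2 + 1)

fs = 8000
t = np.arange(fs) / fs                      # 1 second of audio
x = np.sin(2 * np.pi * 440.0 * t)           # a 440 Hz tone
S = np.abs(stft(x))                         # spectrogram magnitudes
peak_bin = S.mean(axis=0).argmax()
print(peak_bin * fs / 512)                  # frequency of the spectral peak (Hz)
```

The peak lands on the FFT bin nearest 440 Hz, within the bin resolution of fs / n_fft = 15.625 Hz.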
教科書
/Textbook(s)
• Various materials prepared by the instructors
• W. M. Hartmann, Signals, Sound, and Sensation. Modern acoustics and signal processing, Woodbury, NY; USA: American Institute of Physics, 1997.
• U. Zölzer, ed., DAFX – Digital Audio Effects. New York, NY, USA: John Wiley & Sons, 2nd ed., 2011.
• M. Puckette, “Theory and techniques of electronic music,” Online] http://msp.ucsd.edu/techniques.htm, 2006.
• Python Libraries: Librosa, SciPy, NumPy, Matplotlib
成績評価の方法・基準
/Grading method/criteria
• Quizzes 40%
• Exercises 30%
• Final Exam 30%
履修上の留意点
/Note for course registration
This course is taught exclusively in English. Although no particular experience with sound and audio is required, graduate students are expected to have taken elementary physics courses (specifically, on mechanics, wave propagation, etc.) in their undergraduate studies, such as the course IT09 (Sound and Audio Processing) offered at the University of Aizu.



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
成瀬 継太郎
担当教員名
/Instructor
成瀬 継太郎, 渡部 有隆
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
This course approaches robotics from the perspective of information technology, with robot planning as the central theme. Using mobile robots and arm-type robots as examples, students learn path planning and control; specific topics include coordinate transformations, kinematics, and the Jacobian matrix. Understanding is deepened through exercises using MATLAB.
授業の目的と到達目標
/Objectives and attainment
goals
Students who complete this course will be able to carry out:
(A) Kinematics and planning for mobile robots and arm-type robots
(B) Dynamics and simulation of mobile robots and arm-type robots
(C) Environment recognition and learning for robots
授業スケジュール
/Class schedule
Each class is conducted in lecture format.

#1 Introduction and overview
#2 Mobile robots, coordinate transformations, kinematics
#3 Exercises
#4 Forward kinematics of arm-type robots, DH convention, exercises
#5 Exercises
#6 Inverse kinematics of arm-type robots, Jacobian matrix, exercises
#7 Exercises
#8 Mobile robots, dynamics, robot simulation
#9 Exercises
#10 Arm-type robots, dynamics, robot simulation
#11 Exercises
#12 Learning and environment recognition
#13 Exercises
#14 Exercises and wrap-up

[Preparation/Review]
For preparation, study the relevant pages of the lecture materials for the content shown in the course plan, and implement and verify the sample code included in the materials.
For review, complete any exercises not finished during class time before the next class, and carry out any advanced tasks or analyses assigned during class.
The typical preparation/review time per session is 4-5 hours.
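
As a concrete example of the material in #4 and #6, here are the forward kinematics and the Jacobian of a planar two-link arm. The course exercises use MATLAB; the same computation is sketched in Python below, with unit link lengths assumed for illustration:

```python
import numpy as np

def fk(theta1, theta2, l1=1.0, l2=1.0):
    """Forward kinematics: joint angles -> end-effector position (x, y)."""
    x = l1 * np.cos(theta1) + l2 * np.cos(theta1 + theta2)
    y = l1 * np.sin(theta1) + l2 * np.sin(theta1 + theta2)
    return np.array([x, y])

def jacobian(theta1, theta2, l1=1.0, l2=1.0):
    """Jacobian mapping joint velocities to end-effector velocity."""
    s1, c1 = np.sin(theta1), np.cos(theta1)
    s12, c12 = np.sin(theta1 + theta2), np.cos(theta1 + theta2)
    return np.array([[-l1 * s1 - l2 * s12, -l2 * s12],
                     [ l1 * c1 + l2 * c12,  l2 * c12]])

p = fk(np.pi / 4, np.pi / 4)
J = jacobian(np.pi / 4, np.pi / 4)
print(p, np.linalg.det(J))  # det(J) = l1*l2*sin(theta2); zero at singularities
```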
教科書
/Textbook(s)
None; materials are distributed in class.
成績評価の方法・基準
/Grading method/criteria
Reports (100%)
履修上の留意点
/Note for course registration
It is desirable to have taken the undergraduate course Robotics and Automatic Control.
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
LMS



開講学期
/Semester
2026年度/Academic Year  4学期 /Fourth Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
成瀬 継太郎
担当教員名
/Instructor
成瀬 継太郎, 矢口 勇一
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
This course covers modern control theory. Specific topics include state-space models, stability, controllability, observability, regulators, observers, and optimal control with the linear quadratic regulator. Exercises deepen understanding.
授業の目的と到達目標
/Objectives and attainment
goals
Students who complete this course will be able to carry out:
(A) State-space representation of systems
(B) Assessment of system stability, controllability, and observability; Lyapunov functions
(C) Design of controllers (regulators)
(D) Design of observers, including the Kalman filter and the particle filter
(E) Design of optimal control systems
(F) Design and simulation in MATLAB
授業スケジュール
/Class schedule
Each class is conducted in lecture format.

#1 Introduction and overview
#2 Differential equations and state-space models
#3 Exercises
#4 Stability, controllability, regulator design
#5 Exercises
#6 Observability, observer design
#7 Exercises
#8 Design of observer-regulator systems, optimal control
#9 Exercises
#10 Discrete-time Kalman filter
#11 Exercises
#12 Discrete-time Monte Carlo (particle) filter
#13 Exercises
#14 Exercises and wrap-up

[Preparation/Review]
For preparation, study the relevant pages of the lecture materials for the content shown in the course plan, and implement and verify the sample code included in the materials.
For review, complete any exercises not finished during class time before the next class, and carry out any advanced tasks or analyses assigned during class.
The typical preparation/review time per session is 4-5 hours.
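
As a preview of #10, a one-dimensional discrete-time Kalman filter fits in a few lines. The sketch below estimates a constant state from noisy measurements; the noise variances q and r are illustrative values assumed for this example, not ones prescribed by the course:

```python
import numpy as np

def kalman_1d(zs, q=1e-5, r=0.1 ** 2):
    """Scalar Kalman filter with A = 1, C = 1 (constant-state model)."""
    x, p = 0.0, 1.0                  # initial state estimate and covariance
    est = []
    for z in zs:
        p = p + q                    # predict: covariance grows by process noise
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with measurement z
        p = (1.0 - k) * p            # updated covariance
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(1)
true_x = 0.37
zs = true_x + 0.1 * rng.standard_normal(200)  # noisy measurements
est = kalman_1d(zs)
print(est[-1])                        # should settle near 0.37
```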
教科書
/Textbook(s)
None; the necessary materials are distributed in class.
成績評価の方法・基準
/Grading method/criteria
Reports (100%)
履修上の留意点
/Note for course registration
Related courses (not required):
Undergraduate: Robotics and Automatic Control
Graduate: Advanced Robotics
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
LMS



開講学期
/Semester
2026年度/Academic Year  2学期 /Second Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
劉 勇
担当教員名
/Instructor
劉 勇, 矢口 勇一, 趙 強福
推奨トラック
/Recommended track
先修科目
/Essential courses
- Probability and statistics (undergraduate course)
- Algorithms and data structures (undergraduate course)
- Artificial intelligence (undergraduate course)
更新日/Last updated on 2026/02/05
授業の概要
/Course outline
Learning ability is one of the most fundamental abilities for realizing
“intelligence”. A system with the learning ability can become more and more
efficient and/or effective for solving given problems. Briefly speaking,
machine learning is a research field for studying theories, methodologies,
and algorithms that enable computing machines to learn and to become
intelligent. So far, many approaches have been proposed in the literature
for machine learning; and multilayer perceptron, convolutional neural
network, Bayesian network, and decision tree are just a few examples. In
this course, we categorize many existing approaches into a few groups,
namely, learning based on distance, learning based on probability, learning
based on layered structures, and learning based on tree structures. We do
not intend to cover all aspects of machine learning in this single course.
Instead, we will focus on several most well-known and well-applied
approaches. We suppose that, before taking this course, the students have
already studied some fundamental courses related to machine learning, say,
“Artificial intelligence” for undergraduate school, “Introduction to neural
networks” for graduate school, and so on. To know more about machine
learning or AI in general, we recommend the students to take other related
courses. For example, in the graduate school, the students may also take
courses related to big-data analysis; ontology and semantic web;
information retrieval; meta-heuristics; and so on.
授業の目的と到達目標
/Objectives and attainment
goals
The main goal of this course is to study and understand the basic
concepts and mechanisms of several well-known and well-applied machine
learning approaches, including, for example, k-means and self-organization;
Naïve Bayes classification; convolutional neural networks; deep auto-encoders; deep Boltzmann machines; Bayesian networks; and decision trees and decision ensembles. To reinforce the learned knowledge, students will do some projects. Through these projects, students will solve some real-life or synthesized problems using some of the learned methods. Students are encouraged to work in a team, to solve the problems together, and to learn how to communicate and collaborate with others.
授業スケジュール
/Class schedule
Some of the contents below may be changed/improved year by year based on the newest trends in this field.

1. History of machine learning and artificial intelligence
- Case studies
  - Learn how to classify patterns
  - Learn how to make a decision
  - Learn how to estimate/predict the future
  - Learn how to solve a problem efficiently/effectively

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

2. Pattern recognition: a brief review
- Feature space representation of patterns
- Feature extraction and feature selection
- Distance-based classification
  - NNC and k-NNC; Voronoi diagram
  - Various distance measures
- Cluster analysis
  - k-means, self-organization, and vector quantization

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).
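
As a small example of the distance-based classification listed under topic 2, a k-nearest-neighbour classifier can be written from scratch in a few lines. The two synthetic clusters below are made up for illustration:

```python
import numpy as np

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training samples."""
    d = np.linalg.norm(X_train - x, axis=1)   # Euclidean distances to all samples
    nearest = y_train[np.argsort(d)[:k]]      # labels of the k closest samples
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[counts.argmax()]              # majority vote

X = np.array([[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]], dtype=float)
y = np.array([0, 0, 0, 1, 1, 1])
print(knn_predict(X, y, np.array([0.5, 0.5])),   # near the first cluster
      knn_predict(X, y, np.array([5.5, 5.5])))   # near the second cluster
```

With k = 1 this reduces to the NNC; swapping the distance measure (e.g., Manhattan or Mahalanobis) changes the induced decision boundary, which is one of the points topic 2 explores.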

3. Fundamentals of machine learning
- Formulation of machine learning
- Ill-posed problem and regularization
- Classification and regression
- Taxonomy of learning algorithms
  - Supervised, semi-supervised, and unsupervised learning
  - Parametric and non-parametric learning
  - Deterministic and statistical learning
  - Online and offline learning
  - Evolutionary learning
  - Reinforcement learning

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

4. Statistical learning methods-1
- Naïve Bayes classification
- Parzen window
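
For illustration only (a sketch, not taken from the lecture notes), a minimal Gaussian naïve Bayes classifier, which assumes conditionally independent features within each class, can be written as:

```python
import numpy as np

def fit_gnb(X, y):
    """Gaussian naive Bayes: estimate a per-class prior and per-feature
    mean/variance, assuming features are independent given the class."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (len(Xc) / len(X),        # prior P(c)
                     Xc.mean(axis=0),         # per-feature means
                     Xc.var(axis=0) + 1e-9)   # per-feature variances (smoothed)
    return params

def predict_gnb(params, x):
    """Pick the class maximizing log P(c) + sum_i log N(x_i | mu_i, var_i)."""
    def log_post(prior, mu, var):
        return np.log(prior) - 0.5 * np.sum(
            np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return max(params, key=lambda c: log_post(*params[c]))

# Toy training set: two classes clustered around (0, 0) and (5, 5).
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])
y = np.array([0, 0, 1, 1])
params = fit_gnb(X, y)
```

A Parzen window estimator would instead place a kernel on every training point, trading the parametric Gaussian assumption for a bandwidth parameter.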

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

5. Statistical learning methods-2
- Bayesian network

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

6. Learning based on tree structures
- Decision trees
- Multi-variate decision trees
- Decision tree ensembles (forests)

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

7. Presentation and report for projects of the 1st half

[Preparation/Review] Write presentation slides: preparation (2 hours), review (4 hours).

8. Learning based on layered structures-1
- Multilayer neural networks
- Deep auto-encoder

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

9. Learning based on layered structures-2
- Convolutional neural network
- Transfer learning

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

10. Learning based on layered structures-3
- Methods for improving the performance of deep neural networks

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

11. Generative Neural Network - 1
- Restricted Boltzmann machine

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

12. Generative Neural Network -2
- Generative Adversarial Networks
- Applications of GAN

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

13. Attentional machine learning models
- Transformer
- BERT
- Vision Transformer

[Preparation/Review] Study lecture notes and reference papers: preparation (2 hours), review (2 hours).

14. Presentation and report for projects of the 2nd half

[Preparation/Review] Write presentation slides: preparation (2 hours), review (4 hours).
教科書
/Textbook(s)
There is no textbook. We will distribute reading materials in the classes.
成績評価の方法・基準
/Grading method/criteria
- Quiz/Short Test*: 20 points
- Project reports: 80 points

* Simple quizzes or short tests (not very often) may be conducted in some classes to confirm fundamental knowledge covered in the course.
履修上の留意点
/Note for course registration
This is a fundamental course on machine learning. We focus on basic theories and methodologies so that, after taking this course, students can better understand the basic ideas behind existing learning models and algorithms, and have a better chance of proposing their own models or algorithms. Students who are more interested in programming, or who want to learn how to use open-source programs, may take a course such as "ITA34 Practical Deep Learning".
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
1. Machine Learning, Tom M. Mitchell, McGraw-Hill, 1997.

2. Machine Learning: A Probabilistic Perspective, Kevin P. Murphy, The MIT Press, 2012.

3. Machine Learning and Deep Learning, Tomohiro Odaka, Ohmsha, 2016. (in Japanese)

4. Introduction to Bayesian Network, Kazuo Shigemasu, Maomi Ueno, and Yoichi Motomura, Baifukan, 2007.



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
陳 文西
担当教員名
/Instructor
陳 文西, 久田 泰広
推奨トラック
/Recommended track
先修科目
/Essential courses
Some basic knowledge of physics, chemistry, electricity, and electronics is necessary.
更新日/Last updated on 2026/01/21
授業の概要
/Course outline
Biosignals are the various physical, chemical, mechanical, thermal, electrical, and magnetic quantities that carry information about health conditions in physiology and psychophysiology. They take different forms, such as physical quantities, chemical reactions, and electrical activities, and cover a wide spectrum of physiological information in the temporal and frequency domains. Biosignal detection is the procedure by which we determine or measure a quantity that characterizes the property or state of the human biological condition. Various modalities, using diverse engineering principles based on physics, chemistry, electricity, and electronics, are applied in biosignal detection.
This course provides introductory knowledge on methodologies for detecting various biosignals, discusses aspects of biomedical instrumentation that differ from industrial measurement, and briefly introduces applications of IoT, AI, big data analytics, and the latest advancements in seamless healthcare monitoring.
授業の目的と到達目標
/Objectives and attainment
goals
The objectives of this course are to briefly introduce fundamental concepts and approaches for detecting biosignals, and to provide introductory knowledge for students to pursue further study in other advanced courses in the biomedical engineering field.
The goals to be achieved are:
1. To understand the fundamental features and behaviors of various biosignals.
2. To understand the application of fundamental physical and chemical principles in detecting various biosignals.
3. To understand the requirements of biosignal detection that differ in some aspects from industrial measurement.
4. To understand applications of IoT, AI, big data analytics, and the latest advancements in seamless healthcare monitoring.
授業スケジュール
/Class schedule
1. Introduction
2. Motion & Force
3. Direct Pressure
4. Indirect Pressure
5. Direct Flow
6. Indirect Flow
7. Respiration
8. Body Temperature
9. Bioelectricity
10. Biomagnetism
11. Biochemistry-1
12. Biochemistry-2
13. Biochemistry-3
14. Seamless Monitoring
教科書
/Textbook(s)
 Biomedical Sensors and Instruments, 2nd edition, Tatsuo Togawa et al., CRC Press, ISBN: 9781420090789, Publication Date: March 22, 2011
https://www.crcpress.com/Biomedical-Sensors-and-Instruments/Tagawa-Tamura-Oberg/p/book/9781420090789

 Seamless Healthcare Monitoring - Advancements in Wearable, Attachable, and Invisible Devices, Editors: Tamura, Toshiyo, Chen, Wenxi, Springer International Publishing, 2018, DOI 10.1007/978-3-319-69362-0, eBook ISBN 978-3-319-69362-0, Hardcover ISBN 978-3-319-69361-3
https://www.springer.com/us/book/9783319693613
成績評価の方法・基準
/Grading method/criteria
Paper survey and study report, 100%
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
The course instructor has practical working experience: he worked for five years at Nihon Kohden Industrial Corp., a world-famous manufacturer of medical equipment, engaged in R&D on bioinstrumentation, signal processing, and data analysis. Based on this experience, he teaches the basic knowledge and latest technology in "Introduction to Biosignal Detection".

Moodle for course handouts and other related information
https://elms.u-aizu.ac.jp/login/index.php



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
平田 成
担当教員名
/Instructor
平田 成, 出村 裕英
推奨トラック
/Recommended track
先修科目
/Essential courses
N/A
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
In the broad sense, remote sensing refers to techniques for measuring the state of an object from a distance. In most cases, electromagnetic waves, including light, are used as the means of measurement. In the narrow sense, it refers to observing the Earth and other celestial bodies using spacecraft, such as artificial satellites, or aircraft as platforms carrying sensors.
This course first gives an overview of the diverse aspects of remote sensing technology. It then takes spacecraft-based remote sensing as its subject and explains, step by step, the process from data acquisition through analysis to interpretation. Because scientifically useful measurement requires an understanding of the underlying mathematics and physical phenomena, these are also covered in this course.
授業の目的と到達目標
/Objectives and attainment
goals
[Corresponding Learning Outcomes]
(A) Graduates are aware of their professional and ethical responsibilities as engineers, and are able to analyze societal requirements and set, solve, and evaluate technical problems using information science and technology.
(C) Graduates are able to apply professional knowledge of mathematics, natural science, and information technology, as well as the scientific thinking skills, such as logical thinking and objective judgment, developed through acquiring that knowledge, to problem solving.

Students will understand the concepts, characteristics, and usefulness of remote sensing.
They will acquire the computer science and engineering knowledge and skills involved in acquiring, analyzing, and interpreting remote sensing data.
They will also gain related knowledge of mathematics and physics.
授業スケジュール
/Class schedule
Each class is basically conducted in lecture form using the course materials. Assignments are given to test basic knowledge of remote sensing and the related fields of physics and information science. For some topics, hands-on exercises are conducted in the exercise room or on students' own computers.

1. Guidance
2. Overview of remote sensing
3-4. Optical and electromagnetic background of remote sensing
5-6. Remote sensing platforms and sensors
7-8. Characteristics of remote sensing data
9. Radiometric correction of remote sensing data
10. Geometric correction of remote sensing data
11. Multiband image analysis
12. Remote sensing data analysis in practice
13. Synthetic aperture radar
14. Positioning systems (GPS)

[Preparation/Review]
Before each class, prepare using the course materials. After each class, review by collecting and consulting reference information in addition to the materials. Complete and submit assignments by the set deadlines. The guideline for preparation, review, and assignments is 4-5 hours per class.
教科書
/Textbook(s)
N/A
成績評価の方法・基準
/Grading method/criteria
Grades are evaluated based on assignments given during classes.
履修上の留意点
/Note for course registration
Students are expected to understand and be proficient in the following:
physics, calculus, linear algebra, image processing, and computer graphics.
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
基礎からわかるリモートセンシング (Remote Sensing from the Basics), edited by the Remote Sensing Society of Japan, 2011
https://www.amazon.co.jp/dp/4844607790
Image Processing and GIS for Remote Sensing: Techniques and Applications, Liu and Mason, 2016
https://www.amazon.co.jp/dp/1118724208/

Practical experience (Naru Hirata, course coordinator): as a researcher at NASDA (now JAXA) and other institutions, and as a faculty member of the University of Aizu, he has worked for more than 20 years in remote sensing, particularly of solar system bodies. This course is taught based on that experience.



開講学期
/Semester
2026年度/Academic Year  2学期 /Second Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
平田 成
担当教員名
/Instructor
平田 成, 出村 裕英
推奨トラック
/Recommended track
先修科目
/Essential courses
N/A
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
Students learn fundamental knowledge for analyzing data acquired by lunar and planetary exploration missions. Handling scientific observation data from spacecraft also requires an understanding of ancillary data, including spacecraft position and attitude information. The course therefore begins with lectures and exercises on how to use ancillary data.
授業の目的と到達目標
/Objectives and attainment
goals
Through this course, students learn methods for analyzing data from lunar and planetary exploration missions and acquire the fundamentals of software development needed to implement them. They also learn how to handle ancillary data using the SPICE toolkit developed by NASA.
授業スケジュール
/Class schedule
- Week 1
  - Introduction
- Week 2
  - Overview of ancillary data and the SPICE toolkit
  - Time information
- Week 3
  - Coordinate systems
  - Orbit and position information
- Week 4
  - Transformation between coordinate systems
- Week 5
  - Spacecraft attitude information
- Week 6
  - Shape models of celestial bodies
- Week 7
  - The SPICE toolkit in the browser and in Python
教科書
/Textbook(s)
N/A
成績評価の方法・基準
/Grading method/criteria
Grades are evaluated based on assignments given during classes.
履修上の留意点
/Note for course registration
Students are expected to understand the fundamentals of remote sensing (covered in ITC08A).
ITC10A Practical Data Analysis with Lunar and Planetary Database is strongly related to this course. Because ITC10A covers practical topics in exploration data analysis, it is recommended to take ITC09A first and ITC10A afterward.
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
SPICE toolkit: http://naif.jpl.nasa.gov/naif/
Planetary Data System: http://pds.jpl.nasa.gov/
Hayabusa project science data archive: http://darts.isas.jaxa.jp/planet/project/hayabusa/



開講学期
/Semester
2026年度/Academic Year  3学期 /Third Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
出村 裕英
担当教員名
/Instructor
出村 裕英, 平田 成, 小川 佳子, 本田 親寿, 北里 宏平, ラゲ ウダイ キラン, JAXA/NAOJ講師, 山本 圭香
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/17
授業の概要
/Course outline
This course combines advanced lectures and exercises on practical data analysis and tool development in lunar and planetary exploration, building on the preceding course "Fundamental Data Analysis in Lunar and Planetary Explorations". It takes an omnibus form, given by ARC-Space professors and invited lecturers (via teleclasses) from JAXA, NAOJ, and other institutions.
授業の目的と到達目標
/Objectives and attainment
goals
To learn data analysis, and the development of tools for such analysis, from the viewpoint of remote sensing in lunar and planetary exploration.
To learn basic knowledge of space development as a topic in computer science and engineering.
授業スケジュール
/Class schedule
This is a lecture-based course with exercises, providing 60 hours of class time (2 credits) per quarter, along with approximately 30 hours of preparation and review, which may vary depending on individual progress and achievement.

Guidance by DEMURA (UoA) and following Omnibus Lectures
   RAGE (UoA) RasterMiner as a GIS
   HONDA (UoA) Performance test of imaging sensors
   OGAWA (UoA) Spectroscopic Analysis for lunar and planets
   MATSUMOTO (NAOJ) Gravity field of the Moon
   MOROTA (Tokyo Univ.) Crater Chronology
   YAMAMOTO (UoA) Gravity field determination for celestial bodies using orbital data
教科書
/Textbook(s)
N/A
成績評価の方法・基準
/Grading method/criteria
Comprehensive evaluation based on class activities (presentations, Q&A) and reports for each professor
履修上の留意点
/Note for course registration
Related courses:
ITC08A "Remote Sensing"
ITC09A "Fundamental Data Analysis in Lunar and Planetary Explorations"
ITA19 "Reliable System for Lunar and Planetary Explorations"
SEA11 "Software Engineering for Space Programs"
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
The course instructors have practical working experience: they are familiar with JAXA space development projects.



開講学期
/Semester
2026年度/Academic Year  2学期 /Second Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
西村 憲
担当教員名
/Instructor
西村 憲, 高橋 成雄
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/05
授業の概要
/Course outline
This course provides fundamentals of 3D Computer Graphics (CG) and its hardware implementation, which is followed by the recent advancement of CG rendering techniques with GPUs.
授業の目的と到達目標
/Objectives and attainment
goals
Through this course, students are expected to acquire fundamental knowledge about rendering algorithms and their parallelization techniques.  Students will also be able to obtain basic skills of GPU programming with the OpenGL Shading Language.
授業スケジュール
/Class schedule
1. Lecture: Introduction, Shape Modeling
2. Lecture: Geometry Calculation, Rasterization
3. Lecture: Lighting and Shading
4. Lecture: Texture Mapping and Shadowing
5. Lecture: Advanced Rendering Techniques
6. Lecture: Volume Rendering
7. Exercise: Fundamentals of Shader Programming
8. Exercise: Transformations and Colors in OpenGL Shading Language
9. Exercise: GPU-based Texture Mapping
10. Exercise: GPU-based Lighting and Shading
11. Exercise: GPU-based Normal Mapping
12. Exercise: GPU-based Shadowing
13. Assignment Presentation I
14. Assignment Presentation II

[Preparation/Review]
Lectures:
Before each class, prepare by reviewing the content of the lecture slides
on the course Web page (1 hour). After each class, read the relevant textbook
pages to gain a deeper understanding of the lecture content (2-3 hours).

Exercises:
Before each class, review the program code and explanations
provided in advance on the course Web page to understand what functionality you
will be implementing (1 hour). After each class, review and understand
the program code implemented during class, and re-implement the quiz problems
you worked on, along with their extensions, by yourself (2-3 hours).

Assignment Presentations:
Before your presentation, read the paper you selected and create
presentation slides introducing its content (8 hours).
Additionally, before each class, prepare by reviewing papers selected by other
students (2 hours).
After each class, review those papers again and try to understand its contents (2-3 hours).
教科書
/Textbook(s)
* J. Hughes, et al., Computer Graphics: Principles and Practice, 3rd edition, 2012.
* T. Sagishima, T. Nishizawa, and S. Asahara, Parallel Processing for Computer Graphics (in Japanese), Corona Publishing, 1991.
* OpenGL Tutorial (http://www.opengl-tutorial.org/)
* Handouts
* Selected journal/conference papers
成績評価の方法・基準
/Grading method/criteria
Presentation (75%), Reports (25%)
履修上の留意点
/Note for course registration
Prerequisites in the case when undergraduate students take this course:
   IT02: Computer Graphics  
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
http://web-int.u-aizu.ac.jp/~nisim/cg_gpu/



開講学期
/Semester
2026年度/Academic Year  1学期 /First Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
白 寅天
担当教員名
/Instructor
白 寅天, 大藤 建太, ラゲ ウダイ キラン
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/06
授業の概要
/Course outline
Very large and complex data sets are increasingly generated from nature, sensors, social networks, and enterprises, supported by high-speed computers and networks.
Big data is the term for collections of data sets so large and complex that they are difficult to process with on-hand database management tools or traditional data processing applications.
Data science is a novel term often used interchangeably with competitive intelligence or business analytics; it seeks to use all available and relevant data to tell a story that can be easily understood by non-practitioners.
Data science based on big data is expected to provide very potent prediction and analysis, extracting information and knowledge for various fields of research and business from these new data sets.
The main objectives of this course are to build business viewpoints and targets for using big data, and to learn the technologies and skills needed to accomplish those business targets.
Business targeting and modeling, decision making, the data science process, databases for big data, issues related to deep learning, statistical analysis, data mining, and how to use these technologies to achieve business goals will be studied in detail.
授業の目的と到達目標
/Objectives and attainment
goals
This course covers introductory knowledge and skills for the big data analysis process and its technologies.
In detail, CRISP-DM as a data analysis process, the Hadoop and Spark platforms as big data infrastructure, statistical analysis and several machine learning techniques, and deep learning for data analysis will be studied through lectures and exercises.
Students will acquire the broad knowledge and techniques necessary for data analysis on big data infrastructure.

[Corresponding Learning Outcomes]
• Data Lifecycle and Infrastructure Management: Students will be able to manage the entire data science process by understanding big data infrastructures like Hadoop and Apache Spark to handle large-scale datasets efficiently.
• Statistical and Analytical Proficiency: Students will demonstrate the ability to apply advanced statistical methods, including multivariate analysis (PCA, FA) and hypothesis testing, to derive meaningful business insights from complex data.
• Algorithmic Data Mining and AI Integration: Students will be able to implement various data mining techniques (Classification, Clustering, Association) and integrate modern Deep Learning and LLM approaches to solve real-world analytical problems.

[Preparation/Review]
• Before each class session, read and understand the lecture materials (slides) and the corresponding sections in the exercise notebook. After class, review the lecture content, run the provided code, and think of ways to apply it.
• The standard out-of-class learning time for this course is 240 minutes per session, broken down as follows: 120 minutes for preparation (reading slides + running example code), and 120 minutes for review and practice (reviewing slides, applying example code, completing exercises and examples, and organizing notes).
• Report assignments will be given as needed and should be completed during the review and practice time.
授業スケジュール
/Class schedule
Session 1.
• Lecture: Data (Analysis/Science/Engineering) Process

Session 2.
• Lecture: A Scenario of Business Analysis with Data Science Process

Session 3.
• Lecture: Big Data Infrastructure (Hadoop & Apache Spark)

Session 4.
• Lecture: Big Data Analysis with Deep Learning and LLM

Session 5.
• Lecture: Statistical Analysis 1 (Linear Regression)
• Exercise: Hands-on with Google Colab (Regression)
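
As a sketch of what the regression hands-on involves (an illustration only; the actual Colab notebook may differ), ordinary least squares for a line fit can be computed directly with NumPy:

```python
import numpy as np

# Fit y = w0 + w1*x by ordinary least squares on noisy synthetic data.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 3.0 + 2.0 * x + rng.normal(scale=0.1, size=x.size)

A = np.column_stack([np.ones_like(x), x])   # design matrix: columns [1, x]
w, *_ = np.linalg.lstsq(A, y, rcond=None)   # minimizes ||A w - y||^2
w0, w1 = w                                  # estimates of intercept and slope
```

With low noise, the recovered `w0` and `w1` should be close to the true values 3 and 2.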

Session 6.
• Lecture: Statistical Analysis 2 (Multivariate Analysis - PCA, FA)
• Exercise: Hands-on with Tensorflow Playground
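
As a minimal illustration of the PCA topic (a sketch, not taken from the course materials), the principal directions can be computed from the SVD of the centered data matrix:

```python
import numpy as np

def pca(X, n_components):
    """PCA via SVD of the centered data: rows of Vt are principal directions."""
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:n_components]           # principal directions (unit vectors)
    explained = (S ** 2) / (len(X) - 1)      # variance along each direction
    return Xc @ components.T, components, explained[:n_components]

# Data stretched along the line y = x: the first principal component
# should point (up to sign) along (1, 1)/sqrt(2).
rng = np.random.default_rng(1)
t = rng.normal(size=200)
X = np.column_stack([t, t + 0.05 * rng.normal(size=200)])
scores, comps, var = pca(X, 1)
```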

Session 7.
• Lecture: Statistical Analysis 3 (Statistical Tests)
• Exercise: Hands-on with Google Colab (Pipeline and model structure)

Session 8.
• Lecture: Statistical Analysis 4 (Wrap up)
• Exercise: Hands-on with Google Colab (Parameter tuning)

Session 9.
• Lecture: Data Mining 1 - Classification

Session 10.
• Lecture: Data Mining 2 - Clustering

Session 11.
• Lecture: Data Mining 3 - Association Rule Mining

Session 12.
• Lecture: Data Mining 4 - Pattern Mining in Big Data

Session 13.
• Exercise: Exercise(Spark & Data Mining) #1

Session 14.
• Exercise: Exercise(Spark & Data Mining) #2
教科書
/Textbook(s)
Lecture slides will be provided on the course Web site.

成績評価の方法・基準
/Grading method/criteria
Examination: 60%
Exercise lab (including term project): 40%

履修上の留意点
/Note for course registration
* Prerequisites:
For exercise, students should have skill and basic knowledge for the below:
- JAVA & Python Programming
- Machine Learning and Data Mining Basics
参考(授業ホームページ、図書など)
/Reference (course
website, literature, etc.)
Reference:
1. Tom White, Hadoop, O'Reilly, 2011
2. Srinath Perera, Thilina Gunarathne, Hadoop Map-Reduce Programming, Packt Publishing, 2013
3. J.H. Jeong, Beginning Hadoop Programming: Development and Operations, Wiki Books, 2012
4. Tan, Steinbach & Kumar, Introduction to Data Mining, Pearson International Edition, 2006
5. TensorFlow, https://www.tensorflow.org/






開講学期
/Semester
2026年度/Academic Year  2学期 /Second Quarter
対象学年
/Course for;
1年 , 2年
単位数
/Credits
2.0
責任者
/Coordinator
ラゲ ウダイ キラン
担当教員名
/Instructor
ラゲ ウダイ キラン, サクセナ ディーピカー
推奨トラック
/Recommended track
先修科目
/Essential courses
更新日/Last updated on 2026/02/10
授業の概要
/Course outline
This course provides an advanced and integrated study of Data Science and Cloud Computing, focusing on scalable data analytics, distributed processing, and cloud-based data management. As data volumes continue to grow rapidly, modern data science solutions increasingly rely on cloud infrastructures to store, process, and analyze large-scale datasets efficiently.

The course is organized into two tightly connected parts: advanced data science techniques and cloud computing technologies. In the first part, students study essential and advanced data science concepts, including data preprocessing, data warehousing, extract–transform–load (ETL) pipelines, knowledge discovery in data, dimensionality reduction, and supervised and unsupervised learning techniques. Emphasis is placed on understanding the strengths, limitations, and appropriate application of different analytical methods.

The second part focuses on cloud computing foundations and architectures, including cluster, grid, and utility computing, cloud service and deployment models, cloud data management, MapReduce-based analytics, resource management, security mechanisms, and data lake architectures. Students learn how large-scale analytics systems are designed and optimized in cloud environments.

Through lectures, hands-on exercises, and project-based learning, students gain practical experience in designing and evaluating end-to-end data science pipelines on cloud platforms. The course emphasizes both conceptual understanding and practical skills required for research and real-world applications.
授業の目的と到達目標
/Objectives and attainment
goals
Objectives

The objective of this course is to equip students with advanced knowledge and practical skills required to analyze large-scale data using modern data science techniques and cloud computing technologies. The course aims to enhance students’ ability to design scalable, efficient, and secure analytics solutions.

Attainment Goals

By completing this course, students will be able to:

Understand and apply advanced data science techniques for large-scale data analysis

Design ETL pipelines and data warehousing solutions

Select appropriate machine learning techniques based on data characteristics and problem requirements

Explain and differentiate cluster, grid, utility, and cloud computing paradigms

Apply cloud-based data management and analytics frameworks such as MapReduce

Design scalable, secure, and cost-effective data analytics solutions in cloud environments

Integrate data science workflows with cloud infrastructure and data lakes
授業スケジュール
/Class schedule
Course Content and Methods

Each class consists of lectures introducing core concepts and exercise sessions involving hands-on practice, case studies, or discussions. Exercises include SQL practice, data preprocessing tasks, distributed analytics experiments, and cloud-based system design.
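
As a toy illustration of the MapReduce model that appears later in the schedule (plain Python, not the actual Spark or Hadoop API), word counting can be expressed as map, shuffle, and reduce phases:

```python
from collections import defaultdict
from itertools import chain

def map_phase(lines):
    """Map: emit a (word, 1) pair for every word occurrence."""
    return chain.from_iterable(((w, 1) for w in line.split()) for line in lines)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts collected for each word."""
    return {k: sum(vs) for k, vs in groups.items()}

lines = ["big data on the cloud", "data science in the cloud"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

In a real cluster, the map and reduce phases run in parallel across machines, and the shuffle moves intermediate pairs over the network; the program structure, however, is the same.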

Schedule (14 Sessions)

1. Introduction to Data Science and Cloud Computing
2. Database Fundamentals and Data Modeling
3. Structured Query Language (SQL) for Analytics
4. Data Warehousing and ETL Pipelines
5. Data Preprocessing and Data Quality Management
6. Knowledge Discovery in Data
7. Dimensionality Reduction and Feature Engineering
8. Supervised and Unsupervised Learning Techniques
9. Cluster, Grid, and Utility Computing
10. Cloud Computing Fundamentals and Architectures
11. Cloud Data Management and MapReduce
12. Cloud Resource Management and Optimization
13. Cloud Security, Privacy, and Data Governance
14. Data Lakes and Integrated Cloud Data Science Systems

Pre-class and Post-class Learning

Students are expected to review assigned materials before each class and complete exercises, experiments, or reports after class.
Estimated out-of-class study time per session: 5–7 hours.
教科書
/Textbook(s)
Han, J., Kamber, M., & Pei, J., Data Mining: Concepts and Techniques, Springer

Kleppmann, M., Designing Data-Intensive Applications, O’Reilly

Erl, T., Cloud Computing: Concepts, Technology & Architecture, Pearson
成績評価の方法・基準
/Grading method/criteria
Student performance is evaluated using the following criteria:

Course Project: 50%

Classroom Exercises and Participation: 10%

Assignments / Practical Exercises: 25%

Final Examination: 15%

Attendance is not included in the grading criteria.
履修上の留意点
/Note for course registration
Attendance is mandatory.

Students must maintain at least 75% attendance to pass the course.

Failure to meet the attendance requirement will result in course failure, regardless of academic performance.

Prior knowledge of databases, data mining, and basic cloud concepts is strongly recommended.


For inquiries about the content of this page, please contact the Academic Affairs Section of the Student Affairs Division.

Contact email: sad-aas@u-aizu.ac.jp