Project

Theoretical Stagnation of Foundation Models
  • Question: How to build a commercial-scale architecture that resolves current foundation models' limits in training dynamics?
  • Keywords: Representation Learning
  • Funding: (confidential, international), 2024~2025
Representation Invariance Analysis on Various Levels of Relation Structures
  • Question: How to build representations invariant to many factors so that knowledge is expressed consistently?
  • Keywords: Representation Learning, Relational Learning
  • Funding: National Research Foundation, South Korea, 2022~2027
Knowledge Extraction from Visual Data for Linguistic Understanding
  • Question: How to extract latent objects and their structures from images in an unsupervised manner for use in multimodal environments?
  • Keywords: Representation Learning, Relational Learning
  • Funding: Electronics and Telecommunications Research Institute, South Korea, 2022~2023
Schema Loading Networks for Learning Common Knowledge Representation
  • Question: What is a learnable architecture to save and load modular neural networks?
  • Keywords: Representation Learning, Optimization, Relational Learning
  • Funding: National Research Foundation, South Korea, 2019~2022
Statistical and Neural Machine Translation
  • Question: How to enhance machine translation quality through probabilistic and representational analysis?
  • Keywords: Natural Language Processing
  • Funding: Electronics and Telecommunications Research Institute, South Korea, 2016~2020
Expression Power Enhancement of Infinite Probabilistic Context-Free Grammars
  • Question: How to remove bias in the optimization of probabilistic context-free grammars for better expressive power?
  • Keywords: Optimization, Probability, Relational Learning
  • Funding: National Research Foundation, South Korea, 2016

Until 2019

  • Link to Google Scholar