Projects

Practical Action Recognition and Prediction Technology

Action recognition is the most important component of video analysis. Once we can reliably recognize the actions of people appearing in a video, a number of new applications become possible, including video search and tagging, intelligent surveillance, situation understanding, gesture recognition, and human-robot interaction. However, action recognition remains a challenging problem due to occlusion, clutter, changes in viewpoint, changes in lighting conditions, and differences in scale, to name a few. We believe the fundamental difficulty is rooted in the attempt to recognize an action from 2D data. If we can reliably recover 3D information from 2D data, we can robustly recognize actions in videos. In this project, we develop robust non-rigid structure from motion (NRSfM) methods based on the recently proposed Procrustean normal distribution (PND), and we apply them to develop robust action recognition and prediction algorithms.
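
As a rough illustration of why 3D structure is recoverable from 2D tracks, the sketch below implements the classic low-rank shape-basis factorization view of NRSfM (measurement matrix approximately equals a motion factor times a shape basis, up to a corrective transform) in Python with NumPy. It is only a simplified stand-in, not the PND-based method developed in this project, and the track data and basis count are made-up toy values.

import numpy as np

def factorize_tracks(W, K):
    # W: (2F x P) matrix of stacked 2D point tracks (F frames, P points).
    # K: assumed number of shape bases.
    # Returns rank-3K motion and shape factors, defined only up to an
    # invertible 3K x 3K corrective transform.
    W0 = W - W.mean(axis=1, keepdims=True)      # remove per-frame translation
    U, s, Vt = np.linalg.svd(W0, full_matrices=False)
    r = 3 * K
    M = U[:, :r] * np.sqrt(s[:r])               # (2F x 3K) motion factor
    B = np.sqrt(s[:r])[:, None] * Vt[:r]        # (3K x P) shape-basis factor
    return M, B

# Toy usage with random "tracks", just to show the factor shapes.
W = np.random.randn(2 * 30, 40)                 # 30 frames, 40 tracked points
M, B = factorize_tracks(W, K=2)
print(M.shape, B.shape)                         # (60, 6) (6, 40)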

Funded by the National Research Foundation (NRF).


Human-Level Lifelong Machine Learning

In order to act intelligently in an open environment, an agent must be able to learn and predict physical phenomena under different contexts and surroundings. Such dynamic phenomena can be mathematically modeled as stochastic processes. In this project, we develop novel algorithms and methods for real-time nonparametric learning and prediction of time-varying stochastic processes.
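
As a minimal sketch of what nonparametric prediction of a stochastic process can look like, the example below implements standard Gaussian process regression in Python with NumPy. The kernel, hyperparameters, and data are illustrative assumptions, not the models developed in this project.

import numpy as np

def rbf_kernel(a, b, length=1.0, var=1.0):
    # Squared-exponential kernel between 1-D input arrays a and b.
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(t_train, y_train, t_test, noise=0.1):
    # Standard GP regression equations: predictive mean and standard deviation.
    K = rbf_kernel(t_train, t_train) + noise ** 2 * np.eye(len(t_train))
    Ks = rbf_kernel(t_test, t_train)
    Kss = rbf_kernel(t_test, t_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    std = np.sqrt(np.clip(np.diag(cov), 0.0, None))
    return mean, std

# Toy usage: predict a noisy sinusoid at unseen times.
t = np.linspace(0.0, 6.0, 30)
y = np.sin(t) + 0.1 * np.random.randn(30)
mu, sd = gp_predict(t, y, np.linspace(0.0, 6.0, 100))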


Funded by the Ministry of Science, ICT, and Future Planning (MSIP).

Biomimetic Recognition Technology

This project develops low-complexity object and situation recognition methods optimized for small, resource-limited biomimetic robots.

Funded by the Defense Acquisition Program Administration (DAPA). 

Past Projects   


Human-Centric Networked Robotics Technology

There is a growing interest in service robots for applications such as housekeeping, military operations, personal care and support, transportation, rehabilitation, and entertainment. Service robots differ from industrial robots in many respects; most importantly, they have to operate in environments shared with humans. Hence, they must be able to adapt to changing and uncertain operating domains. This project develops new methods and prototype applications for human-centric networked robots, enabling the seamless operation of service robots in our daily lives.

Funded by the National Research Foundation (NRF). 


Resilient Cyber-Physical Systems

A cyber-physical system is highly networked and deeply embedded in the physical world, offering a portal to an unprecedented quantity of information about our environment and bringing about a revolution in the amount of control individuals have over their surroundings. It is envisioned that the cyber-physical systems of tomorrow will dramatically improve the adaptability, autonomy, efficiency, functionality, reliability, safety, and usability of engineered systems. This project develops algorithms and methods for resilient cyber-physical systems.

Funded by the Ministry of Science, ICT, and Future Planning (MSIP). 


Wireless Camera Sensor Networks Technology

Wireless camera sensor networks are an emerging technology built on recent advances in embedded systems, low-power wireless communication, image sensing devices, and computer vision algorithms.

This project develops core technology and prototype systems for the commercialization of wireless camera sensor networks. It focuses on convergent technology that integrates innovative approaches to communication and information processing, and the development of prototype applications will demonstrate new possibilities for wireless camera sensor network technology.

Funded by the National Research Foundation (NRF).

Mobile Sensor Networks: Algorithms and Applications

In order to address societal issues of growing importance, such as environmental protection, efficient resource management, and security, it is necessary to collect and process sensing data in real time, often in dynamic and unstructured environments. While wireless sensor networks can provide some solutions, they are not adaptive enough to handle dynamic changes in the environment. Mobility in a sensor network can increase both its sensing coverage in space and time and its robustness against dynamic changes in the environment, and it has received extensive attention recently. The goals of this project are to develop core technology and prototype applications for mobile sensor networks and to present new possibilities enabled by mobile sensor network technology.
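
The coverage claim can be illustrated with a toy computation: under an assumed unit-disk sensing model, a single patrolling sensor covers a much larger fraction of the workspace over a time window than a single static sensor. The sketch below is only an illustration of this effect, not a project algorithm; the sensing radius, trajectory, and Monte Carlo sampling are made-up assumptions.

import numpy as np

def coverage_fraction(positions, radius=0.15, samples=20000, seed=0):
    # Fraction of the unit square that lies within `radius` of the sensor
    # at some time step (union of sensing disks over the time window),
    # estimated by Monte Carlo sampling.
    rng = np.random.default_rng(seed)
    pts = rng.random((samples, 2))
    d = np.linalg.norm(pts[:, None, :] - positions[None, :, :], axis=2)
    return float(np.mean(d.min(axis=1) <= radius))

T = 50                                                       # time steps
static = np.tile([[0.5, 0.5]], (T, 1))                       # fixed sensor
mobile = np.column_stack([np.linspace(0.1, 0.9, T),          # patrolling sensor
                          0.5 + 0.3 * np.sin(np.linspace(0.0, 4 * np.pi, T))])
print(coverage_fraction(static), coverage_fraction(mobile))  # mobile covers more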

Funded by the National Research Foundation (NRF). 


Situation Understanding for Smart Devices

We are witnessing the emergence of smart devices, such as smartphones, smart pads, and smart TVs. For example, using a smartphone, we can organize our schedule, browse the web, and check email. Beyond these simple tasks, we can find our current location and get directions to our destination using the GPS unit inside a smartphone. Just as GPS-enabled smartphones have made location-based services possible, it is envisioned that many new, previously unavailable services will become possible with the next generation of smart devices. We can provide better services and enable new applications if these devices are aware of the current situation (or context) of users and their surroundings. This project develops algorithms and new smart devices for understanding the situations of users and their surroundings.

Funded by the Korea Creative Content Agency (KOCCA).

Micro Autonomous Systems and Technology (MAST)

MAST is a research project to develop autonomous, multifunctional, collaborative ensembles of agile, mobile micro-systems that enhance tactical situational awareness in urban and complex terrain for small-unit operations. MAST is a partnership of nine member institutions, with many more participating institutions. To achieve this objective, the program plans to advance fundamental science and technology in several key areas, including microsystem mechanics, processing for autonomous operation, microelectronics, and platform integration.


CITRIS: Mobile Sensor Networks for Independent Living and Safety at Home

Guardian Angel is a mobile sensor network system that monitors the activities of occupants in an indoor environment, detects abnormal behaviors or emergency situations, and alerts a third party in case of an emergency.

Heterogeneous Sensor Networks (HSN)

Traditional sensor network research has focused largely on low-bandwidth sensors, such as temperature, acoustic, and infrared sensors. As a result, the applications of sensor networks have been limited. In the HSN project, we consider a network of high-bandwidth, low-bandwidth, and mobile sensors; the high-bandwidth sensors include image sensors for in-depth situation awareness and recognition. The main research objectives are to (1) study the tradeoff between performance and cost, (2) develop design principles for heterogeneous sensor networks, (3) develop (near-)optimal multi-modal, multi-sensor fusion algorithms for heterogeneous sensor networks, and (4) develop new communication theory and algorithms for heterogeneous sensor networks.
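
As a minimal illustration of multi-sensor fusion (objective 3), the sketch below applies the textbook inverse-variance weighting rule to independent Gaussian measurements of the same quantity, e.g., a coarse acoustic range fused with a more precise camera-based range. It is a generic example under assumed noise variances, not the project's fusion algorithm.

import numpy as np

def fuse(measurements, variances):
    # Inverse-variance weighted fusion of independent Gaussian measurements
    # of the same scalar quantity; returns the fused estimate and variance.
    w = 1.0 / np.asarray(variances, dtype=float)
    estimate = np.sum(w * np.asarray(measurements, dtype=float)) / np.sum(w)
    return estimate, 1.0 / np.sum(w)

# Toy usage: a coarse acoustic range (variance 4.0) fused with a more
# precise camera-based range (variance 0.25).
est, var = fuse([10.8, 9.9], [4.0, 0.25])
print(est, var)   # the estimate is pulled toward the more precise sensor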


Network Embedded Systems Technology (NEST)

In the NEST project, our main research focus is to develop a real-time control system using wireless sensor networks. The main challenge is the inconsistency of sensor measurements caused by packet loss, communication delays, and false detections. These challenges are addressed by multiple robust layers of data fusion and a hierarchical architecture. The implemented system was successfully demonstrated on a large-scale outdoor sensor network.
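
One common way to tolerate lost or delayed packets in a real-time estimator is to run the prediction step every tick and apply the measurement update only when data actually arrives. The sketch below shows this idea with a generic one-dimensional Kalman filter; the noise parameters and dropout pattern are illustrative assumptions, not the NEST system's actual fusion layers.

def kalman_step(x, P, z=None, q=0.01, r=0.5):
    # One step of a 1-D constant-value Kalman filter.
    # x, P: current state estimate and variance; z: measurement or None
    # (lost packet); q, r: process and measurement noise variances.
    P = P + q                      # predict: state assumed constant
    if z is not None:              # update only if a packet arrived
        K = P / (P + r)            # Kalman gain
        x = x + K * (z - x)
        P = (1.0 - K) * P
    return x, P

# Toy usage: some packets are dropped (None) but the filter keeps running.
x, P = 0.0, 1.0
for z in [1.2, 0.9, None, 1.1, None, 1.0]:
    x, P = kalman_step(x, P, z)
print(x, P)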