Research

Design Your Future with Engineering

I have developed various digital control systems and invented novel algorithms that make vehicles and robots more interactive with humans. I am currently researching and developing innovative HMI technologies for vehicles. I have recently developed systems that recognize patterns and analyze various sensor signals on an embedded system. Previously, I researched technologies for detecting the environmental conditions around a vehicle and for autonomous driving.

I developed a 40” multi-touch LCD panel as a research engineer at Samsung Electronics. I have also implemented future personal mobility vehicles and led a study on the emotion recognition, generation, and expression of robots. In addition, I have diverse experience in embedded systems such as an articulatory speech synthesizer, a virtual musical instrument, and an ink-jet head controller. I can handle MCUs/DSPs with C/C++ and FPGAs/CPLDs with Verilog HDL. I have used Matlab and PSpice as simulation tools for over 20 years in data analysis and circuit design. My key strength lies in diverse hands-on experience with hardware systems: electrical circuit design, PCB layout, assembly, and debugging.

Interests

  • Embedded System
  • Personal Mobility
  • Social Robot
  • Human-Machine Interaction
  • Emotional Interaction

Members

Yaongi Hyung

Postdoctoral fellow

Kim, Sojung (김소정)

Ph. D. candidate

Neoburi

Ph. D. candidate

Lee, Sung Ho (이성호)

Master Student

Cho, Eunjun (조은준)

Master Student

Bonobono

Master Student

Moon, Ji Hwan (문지환)

Undergraduate Student

Lee, Sandeul (이산들)

Undergraduate Student

Kang, Donghun (강동훈)

Undergraduate Student

Byyung Hern Kim (김병헌)

Undergraduate Student

Porori

Undergraduate Student

Jeong Ho Lee (이정호)

TLO

You are the one!

We are always looking for great people to work with!

We invite motivated applicants for postdoc, master’s, and Ph. D. positions.

Representative Works

  • IoC (Internet of Cars) platform

    For making cars smarter! In cooperation with Hyundai Motor Company, 2015 ~ present

    Let’s have fun together with your car!

    Applied to a car-sharing company, J’car:
    http://m.blog.naver.com/PostList.nhn?blogId=jecar_1

    Hyundai Motors Page: https://www.facebook.com/hyundaimotorgroup/posts/1542930155718455

  • AUI - Acoustic User Interaction

    Interacting with cars and robots using acoustic signals or sound, Hyundai Motor Company, 2013 ~ present

    As in-vehicle IT applications increase, the information drivers must manage is becoming more diverse and complicated. Accordingly, people want a new form of human-machine interaction (HMI) and user interface (UI) that requires no training and is more intuitive than traditional HMI and UI. From the designers’ point of view, they do not want their products or works to be compromised or altered by buttons, switches, or other sensors.

    I proposed and have developed the acoustic user interaction (AUI) technology to solve the problems mentioned above. The AUI recognizes touch from the acoustic waves generated by tapping or sweeping on a material. An AUI device listens to the material’s acoustic signal through a small specialized microphone placed on the back side of the device. The AUI is one of the elements that go beyond multi-touch: through AUI technology we can approach rich-touch. A system equipped with the AUI can distinguish touch types such as a nail, fingertip, palm, or knuckle, and it also enables dynamic input based on touch force. With various pattern-recognition algorithms, an AUI system can detect touch position and sweeping direction as well as touch type.
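
    As a concrete illustration, here is a minimal C sketch of how such a touch-type classifier might be structured. The features (frame energy and zero-crossing rate), the centroid values, and the nearest-centroid rule are illustrative assumptions for this sketch, not the deployed AUI algorithm.

        /* Sketch: classify a touch from one acoustic frame using two simple
           features and a nearest-centroid rule. */
        #include <stdio.h>
        #include <math.h>

        #define FRAME_LEN   256
        #define NUM_CLASSES 4

        static const char *class_names[NUM_CLASSES] = {
            "nail", "fingertip", "palm", "knuckle"
        };

        /* Hypothetical per-class feature centroids {mean power, zero-crossing
           rate}; real values would be learned from recorded tap data. */
        static const double centroids[NUM_CLASSES][2] = {
            {0.10, 0.60}, {0.30, 0.30}, {0.90, 0.10}, {0.60, 0.20}
        };

        static void extract_features(const double *frame, int n, double *feat)
        {
            double energy = 0.0;
            int crossings = 0;
            for (int i = 0; i < n; i++) {
                energy += frame[i] * frame[i];
                if (i > 0 && frame[i - 1] * frame[i] < 0.0)
                    crossings++;               /* sign change = zero crossing */
            }
            feat[0] = energy / n;              /* mean power of the frame */
            feat[1] = (double)crossings / (n - 1);
        }

        static int classify(const double *feat)
        {
            int best = 0;
            double best_d = 1e30;
            for (int c = 0; c < NUM_CLASSES; c++) {
                double d0 = feat[0] - centroids[c][0];
                double d1 = feat[1] - centroids[c][1];
                double d = d0 * d0 + d1 * d1;  /* squared Euclidean distance */
                if (d < best_d) { best_d = d; best = c; }
            }
            return best;
        }

        int main(void)
        {
            double frame[FRAME_LEN], feat[2];
            /* Synthetic tap: a decaying oscillation standing in for a real
               microphone capture. */
            for (int i = 0; i < FRAME_LEN; i++)
                frame[i] = exp(-i / 40.0) * sin(0.5 * i);
            extract_features(frame, FRAME_LEN, feat);
            printf("classified as: %s\n", class_names[classify(feat)]);
            return 0;
        }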

    The AUI can reduce visual and cognitive distraction while driving because drivers can feel the texture and shape of the surface they touch. A flat, uniform surface is not necessary: since the acoustic wave travels through a homogeneous material with little loss, the touch system can be implemented on a large and complex surface. Because the AUI needs no buttons or switches, designers can keep and realize their own design concepts.

    I displayed and demonstrated the AUI steering wheel remote controller at CES 2015. Drivers can start the car by tapping the AUI pad on the steering wheel. The left and right AUI pads control the music volume and the driving mode, respectively; both are adjusted by rubbing the pads up and down. There is no button or switch on the steering wheel.

    Related video: https://youtu.be/w-cLX1W3yFo

    In June 2016, I brought an AUI application into mass production: the Knock-On Sliding Door, applied to the KIA Carnival Hi-Limousine. The Knock-On Sliding Door is the world’s first product in the automotive industry to use the AUI technology.

    Users can open the sliding door simply by knocking twice on its outer panel. The AUI system monitors the acoustic signal and analyzes the acoustic pattern; after recognizing the human input, the sliding door opens or closes automatically. With the help of the smart key, the car can verify that the user is the real owner and is near the car. Because the gesture is simple and easy, even children can operate the sliding door. A sketch of the timing logic follows below.
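
    The following C sketch shows one way the double-knock timing logic could be structured. The threshold, debounce time, and knock-interval window are invented values for illustration; the production system also matches the acoustic pattern itself, which is omitted here.

        /* Sketch: recognize a knock-knock pattern from an envelope signal. */
        #include <stdio.h>

        #define THRESHOLD   0.5   /* impulse amplitude threshold           */
        #define DEBOUNCE_MS 80    /* ignore ringing right after an impulse */
        #define MIN_GAP_MS  120   /* knocks closer than this are one knock */
        #define MAX_GAP_MS  600   /* knocks farther apart are unrelated    */

        /* Feed one envelope sample at a time; returns 1 when a valid
           knock-knock pattern is recognized. */
        int knock_detector(double envelope, int t_ms)
        {
            static int last_knock_ms = -100000;
            static int knock_count = 0;

            if (envelope > THRESHOLD && t_ms - last_knock_ms > DEBOUNCE_MS) {
                int gap = t_ms - last_knock_ms;
                last_knock_ms = t_ms;
                if (knock_count == 1 && gap >= MIN_GAP_MS && gap <= MAX_GAP_MS) {
                    knock_count = 0;
                    return 1;          /* second knock in time: trigger door */
                }
                knock_count = 1;       /* first knock (or restart pattern)   */
            } else if (knock_count == 1 && t_ms - last_knock_ms > MAX_GAP_MS) {
                knock_count = 0;       /* waited too long: reset             */
            }
            return 0;
        }

        int main(void)
        {
            /* Two synthetic impulses 300 ms apart on a quiet background. */
            for (int t = 0; t < 1000; t++) {
                double env = (t == 100 || t == 400) ? 1.0 : 0.0;
                if (knock_detector(env, t))
                    printf("knock-knock recognized at t = %d ms\n", t);
            }
            return 0;
        }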

    Related video: https://youtu.be/-ywRR23d9vA

  • E4U, Eco-Evolutional Electrical Egg for You

    A future personal mobility vehicle with a hemispherical structure, Hyundai Motor Company, 2012~2013

    Related videos: https://youtu.be/aMggQFvZIe4 ,
    https://youtu.be/-N-5xA9ruwA

    As the human population grows, cities and suburbs become more densely populated. Public transportation follows fixed routes that are not tailored to individual schedules, so society needs a better way to get around. Many companies and researchers have proposed various types of personal mobility vehicles based on electric motors and batteries, but most of them use two or more motors for steering and speed control.

    I envisioned a new form of mobility. The main idea was to control both speed and steering in an electric vehicle using only one motor and no transmission system; that is the most important factor in giving a personal mobility vehicle a simple structure and low weight. To accomplish this, I adopted the hemisphere structure proposed by Lame in 1938.

    To make it a reality, I organized a team of four engineers and two designers, including myself. We named the work E4U. The name embraces four Es: Egg, Evolution, Electricity, and Eco-friendliness. The body shape of the E4U came from the EggRan I had developed in 2011: the E4U is shaped like an egg in the EggRan’s style, but it has two training wheels sticking out of the back.

    The E4U can maneuver in any direction even though its motor turns in only one. It moves by means of a hemisphere that continuously spins horizontally; the spinning hemisphere is tilted to generate drive while two training wheels at the rear keep it moving straight. The direction of travel is controlled with the feet, which rest on the hemispherical part and tilt it. For example, with the hemisphere rotating counterclockwise, the driver moves forward by using the left foot to tilt the hemisphere so that its left side contacts the ground and kicks the ground backward; the reaction force propels the vehicle forward. The driver also controls the E4U’s speed by how much he or she tilts the hemisphere: the more it is tilted, the faster the E4U goes. To stop, the driver simply keeps the hemisphere horizontal. A kinematic sketch of this principle follows below.
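
    The drive principle can be summarized in a few lines of C. This is a simplified kinematic model under assumed parameters (spin rate, rim radius, a linear tilt-to-speed mapping), not the E4U’s actual control software.

        /* Sketch: one-motor drive of a spinning hemisphere. Tilting brings a
           point of the rim into ground contact; the rim's tangential velocity
           kicks the ground, and the reaction drives the vehicle. */
        #include <stdio.h>
        #include <math.h>

        #ifndef M_PI
        #define M_PI 3.14159265358979323846
        #endif

        #define SPIN_RATE  10.0   /* hemisphere spin, rad/s (counterclockwise) */
        #define RIM_RADIUS 0.25   /* hemisphere radius, m                      */

        /* tilt_dir: angle (rad) of the side tilted into the ground;
           tilt_mag: 0..1 (0 = horizontal = stop). Outputs vehicle velocity. */
        void e4u_velocity(double tilt_dir, double tilt_mag, double *vx, double *vy)
        {
            /* Ground-contact speed of the rim grows with tilt (assumed linear). */
            double contact_speed = SPIN_RATE * RIM_RADIUS * tilt_mag;
            /* For counterclockwise spin, the rim at the contact point moves 90
               degrees ahead of the tilt direction; the reaction force pushes
               the vehicle the opposite way. */
            double push_dir = tilt_dir + M_PI / 2.0 + M_PI;
            *vx = contact_speed * cos(push_dir);
            *vy = contact_speed * sin(push_dir);
        }

        int main(void)
        {
            double vx, vy;
            /* Tilt the left side (90 deg) down halfway: per the description,
               the left rim kicks the ground backward and the E4U moves forward. */
            e4u_velocity(M_PI / 2.0, 0.5, &vx, &vy);
            printf("vehicle velocity: vx = %.2f m/s, vy = %.2f m/s\n", vx, vy);
            return 0;
        }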

    After six months of development, we exhibited our work at an internal company competition in 2012 and won the excellence prize. After that, the E4U was invited to and displayed at many events and exhibitions, such as the Seoul Motor Show 2013, the Creative Economy Exhibition 2014, and the Electrical Car Conference 2015.

  • Object and Road Condition Detection for Autonomous Driving Cars

    Developing algorithms and hardware to recognize objects and road conditions around a car, Hyundai Motor Company, 2010 ~ 2012

    Related video: https://youtu.be/M0pm5pGBnhs

    I began researching autonomous vehicle technology when I moved to Hyundai Motors in April 2010. Hyundai Motor Company held its first unmanned autonomous vehicle competition in October 2010, and I belonged to the team that built and demonstrated an autonomous vehicle for it. I was in charge of the vehicle’s basic hardware system, and I also developed the logic for detecting obstacles around the car using Lidar (Light Detection and Ranging).

    After that, as the leader of the sensor fusion group, I developed various systems for detecting the situation around the vehicle, the road surface condition, and more, using a variety of sensors such as optical cameras, ToF cameras, Lidars, and ultrasonic sensors. With sensor fusion technology, we can detect bumps, potholes, and black ice on the road surface. We also developed a technology that can recognize the position and speed of moving objects around the car using only the Lidar, as sketched below.
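
    The core of a Lidar-only velocity estimate can be illustrated with a short C sketch: compute the centroid of an object’s point cluster in each scan and differentiate between scans. Real systems segment full point clouds and handle data association; those steps are assumed away here.

        /* Sketch: estimate one object's position and speed from two
           consecutive Lidar scans via its cluster centroid. */
        #include <stdio.h>
        #include <math.h>

        typedef struct { double x, y; } Point2D;

        /* Centroid of the Lidar returns attributed to one object. */
        static Point2D centroid(const Point2D *pts, int n)
        {
            Point2D c = {0.0, 0.0};
            for (int i = 0; i < n; i++) { c.x += pts[i].x; c.y += pts[i].y; }
            c.x /= n; c.y /= n;
            return c;
        }

        int main(void)
        {
            /* Two consecutive scans, 0.1 s apart (a typical 10 Hz Lidar). */
            const double dt = 0.1;
            Point2D scan1[] = { {10.0, 2.0}, {10.2, 2.1}, {10.1, 1.9} };
            Point2D scan2[] = { {10.5, 2.0}, {10.7, 2.1}, {10.6, 1.9} };

            Point2D c1 = centroid(scan1, 3);
            Point2D c2 = centroid(scan2, 3);

            double vx = (c2.x - c1.x) / dt;   /* m/s */
            double vy = (c2.y - c1.y) / dt;
            double speed = sqrt(vx * vx + vy * vy);

            printf("object at (%.2f, %.2f) m, speed %.2f m/s\n", c2.x, c2.y, speed);
            return 0;
        }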

  • EggRan

    A personal mobility vehicle for kids, Hyundai Motor Company, 2011~2012

    Related video: https://youtu.be/Vj5TRu4xdfo

    Children instinctively prefer to move around while embracing and riding on something. Electric toy cars for children have recently become prevalent, but vehicle technology designed for children has not yet been introduced. Since toy cars directly reflect the structures of conventional automobiles, children must learn to operate a steering wheel, pedals, and a gearshift to control the car’s speed or direction.

    What vehicle structure would let young children drive easily and have fun? How can we make kids perceive a car as a plump, soft creature rather than a solid, hard object? These questions were the starting point for designing a new concept in personal mobility.

    Based on this motivation, we proposed an egg-shaped personal mobility vehicle named “EggRan.” We focused on children’s instinctive behavior of hugging whatever they ride on, and designed a sensing mechanism that can detect the movements of a child’s body. We also tried to give the EggRan a sense of comfort and warmth while making sure the vehicle is durable. Since the EggRan requires only the hands for control, it can also be operated by people with disabilities who cannot use their legs.

    Two electric motors drive the wheels, and a load cell installed in each handle of the EggRan senses body movements. The force induced by the child’s body movements is transmitted to the handles through the arms, and the load cells measure the amount of force. The direction and velocity are determined by the difference between the forces on the two handles and by the child’s weight. For instance, when a child tilts his or her body to the left, a greater pulling force is transmitted to the right handle, which causes the EggRan to turn left. When a child pulls or pushes both handles with the same force, the EggRan moves forward or backward, respectively. The resulting speed depends on the pulling or pushing force and the child’s weight. A control-law sketch follows below.
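
    A minimal C sketch of such a control law is shown below. The gains and the weight normalization are illustrative assumptions; the actual EggRan firmware is not reproduced here.

        /* Sketch: map two handle forces onto a differential drive. The
           common-mode force sets forward/backward speed; the differential
           force sets the turn. */
        #include <stdio.h>

        #define K_SPEED 0.02   /* (m/s) per Newton of common-mode force   */
        #define K_TURN  0.04   /* turn command per Newton of differential */

        /* f_left, f_right: handle forces in N (positive = pulling);
           weight: child's weight in N, used to normalize responsiveness. */
        void eggran_control(double f_left, double f_right, double weight,
                            double *left_wheel, double *right_wheel)
        {
            double scale = 300.0 / weight;   /* lighter child -> livelier */
            double speed = K_SPEED * scale * (f_left + f_right) / 2.0;
            double turn  = K_TURN  * scale * (f_right - f_left);

            /* Differential drive: more right-handle force -> turn left, so
               the right wheel must spin faster than the left one. */
            *left_wheel  = speed - turn;
            *right_wheel = speed + turn;
        }

        int main(void)
        {
            double lw, rw;
            /* Child leans left: the right handle is pulled harder. */
            eggran_control(5.0, 15.0, 300.0, &lw, &rw);
            printf("left wheel: %.2f, right wheel: %.2f  (turning left)\n", lw, rw);
            return 0;
        }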

    The emergency stop function is implemented with a magnetic contact switch placed beneath the EggRan. The magnetic switch is connected to a long line that looks like a tail. If the line is pulled out and detached from the EggRan’s body, the whole control system stops. The vehicle comes to a complete stop or restarts simply by detaching or reattaching the ‘tail,’ as sketched below.
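
    In code, the kill switch reduces to a guard in the drive loop, as in this minimal sketch (the switch polarity and the I/O access are assumptions for illustration):

        /* Sketch: force both motor outputs to zero when the tail switch
           opens. */
        #include <stdio.h>

        static int tail_attached = 1;   /* stands in for reading the switch GPIO */

        int read_kill_switch(void) { return tail_attached; }

        void drive_step(double left_cmd, double right_cmd)
        {
            if (!read_kill_switch()) {
                left_cmd = 0.0;         /* emergency stop: cut both motors */
                right_cmd = 0.0;
            }
            printf("motors: L=%.2f R=%.2f\n", left_cmd, right_cmd);
        }

        int main(void)
        {
            drive_step(0.5, 0.7);       /* tail attached: commands pass through */
            tail_attached = 0;          /* tail pulled out                      */
            drive_step(0.5, 0.7);       /* both motors forced to zero           */
            return 0;
        }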

    The EggRan is a novel personal mobility vehicle that children can control easily, with a shape and surface sturdy enough for them to enjoy on their own. For this reason, it could become a new kind of ride in places such as commercial facilities for kids (amusement parks, theme parks) and rehabilitation facilities for people with lower-body disabilities.

    We presented the EggRan at the Busan Motor Show 2012. After that, the EggRan was displayed and demonstrated at many events and exhibitions. When we evaluated the real product with kids and women at various events, we found they were interested in its cute design and easy operation.

  • SUR40 (Surface2)

    40” multi-touch LCD display, Samsung Electronics, 2009~2010

    Related video: https://youtu.be/8wYJeujIJQU

    The SUR40, or Surface2, is a multi-touch, table-like computer developed by Samsung and Microsoft. It features a 1080p 40-inch LCD display, runs the Microsoft Surface platform, and comes with PixelSense, which allows it to recognize up to 50 simultaneous touch points. PixelSense lets the display recognize fingers, hands, and objects placed on the screen, enabling vision-based interaction without cameras: the individual pixels in the display see what is touching the screen, and that information is immediately processed and interpreted.
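
    A toy C sketch of the vision-based sensing idea: threshold the per-pixel sensor image and count connected components to find touch “blobs.” The actual PixelSense pipeline is proprietary; this only illustrates the principle.

        /* Sketch: find touch blobs in a tiny sensor image by thresholding
           and flood-filling connected components. */
        #include <stdio.h>

        #define W 8
        #define H 6
        #define THRESH 128

        static unsigned char img[H][W] = {
            {0,0,0,0,0,0,0,0},
            {0,200,220,0,0,0,0,0},
            {0,210,230,0,0,150,160,0},
            {0,0,0,0,0,155,0,0},
            {0,0,0,0,0,0,0,0},
            {0,0,0,0,0,0,0,0},
        };
        static int label[H][W];

        /* Depth-first flood fill assigning one label per connected blob. */
        static void fill(int y, int x, int id)
        {
            if (y < 0 || y >= H || x < 0 || x >= W) return;
            if (label[y][x] || img[y][x] < THRESH) return;
            label[y][x] = id;
            fill(y + 1, x, id); fill(y - 1, x, id);
            fill(y, x + 1, id); fill(y, x - 1, id);
        }

        int main(void)
        {
            int blobs = 0;
            for (int y = 0; y < H; y++)
                for (int x = 0; x < W; x++)
                    if (img[y][x] >= THRESH && !label[y][x])
                        fill(y, x, ++blobs);
            printf("detected %d touch blob(s)\n", blobs);  /* expect 2 */
            return 0;
        }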

    PixelSense technology also enabled Samsung and Microsoft to reduce the thickness of the product from 56 cm in the previous version to 10 cm in the new one. The size reduction lets the product be placed horizontally and adds the capability to be mounted vertically, while retaining the ability to recognize fingers, tags, and blobs and to utilize raw vision data.

    Additionally, the system is designed to interact with several people at the same time so that content can be shared without the limitations of a single-user device. These combined characteristics place the Microsoft Surface platform in the category of so-called natural user interface (NUI), the apparent successor to the graphical user interface (GUI) systems.

    The technology allows non-digital objects to be used as input devices. In one example, a normal paint brush was used to create a digital painting in the software. This function is made possible by the fact that, thanks to the PixelSense technology, the system does not rely on restrictive properties required of conventional touchscreen or touchpad devices such as the capacitance, electrical resistance, or temperature of the tool used.

    Samsung produced the hardware, and Microsoft provided the software platform for the SUR40. The SUR40 was released to customers in 2012 and won the “Best of Innovation” award at CES 2012.

  • Doldori

    A facial robot for Human-Robot Interaction, KAIST, 2005~2007

    Related videos: https://youtu.be/vMZTtDD4_G8 ,
    https://youtu.be/VNllt3clpuU

    Developing human-friendly robot systems that enable robots to show their emotions to people is necessary. To achieve this goal, we must consider how to design facial robots that feel friendly to humans and how to control them effectively, and well-organized research on these issues is required. This was the main theme of my Ph. D. dissertation.

    I proposed the Linear Dynamic Affect-Expression Space Model for emotional expressions. To prove the idea practical for emotional robots, I developed a facial robot, Doldori, and implemented all of the electrical systems that control it. Doldori was modeled on the facial proportions of a 5~6-year-old child. It has a nose, mouth, and eyelids within a hard outer form. The surface of the outer form has complex curves, and the form can be separated into three parts: a front face, upper head, and back head.

    When a human makes a facial expression, the expression is generally composed of an upper part (around the eyes) and a lower part (around the mouth), so the framework is designed in two separate parts. This structure has advantages: it adapts easily to various outer forms, and adjusting the positions of the eyes or the mouth is simple. Ten small DC motors were chosen to obtain large torque in a small volume.

    A control system was designed and implemented to simultaneously control all ten DC motors and two LEDs using a micro control unit (MCU) and a complex programmable logic device (CPLD). A power amplifier is also included to drive the DC motors. The digital controller is small enough to fit inside Doldori’s outer form. The developed robot is capable of showing rich expressions with the LEDs and a dynamic neck motion driven by two brushless DC motors. An operator can easily control the various facial expressions and LEDs using only three parameters based on the linear dynamic affect-expression model, as sketched below.
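
    The three-parameter control idea can be illustrated with a small C sketch in which each motor target is an affine function of the affect vector. The matrix values below are invented for illustration; the dissertation’s actual expression-space mapping is not reproduced here.

        /* Sketch: drive ten facial motors from three affect parameters via a
           linear model, clamped to each motor's mechanical range. */
        #include <stdio.h>

        #define NUM_MOTORS 10
        #define NUM_PARAMS 3   /* the model's three affect coordinates */

        /* Hypothetical per-motor rows: target = W*affect + neutral pose. */
        static const double W[NUM_MOTORS][NUM_PARAMS] = {
            { 20, -5, 0}, {-20, -5, 0}, { 10, 15, 5}, {-10, 15, 5}, { 0, 25, 0},
            { 15,  0, 8}, {-15,  0, 8}, {  5, -20, 0}, { -5, -20, 0}, { 0, 10, 12}
        };
        static const double neutral[NUM_MOTORS] = {
            90, 90, 90, 90, 90, 90, 90, 90, 90, 90   /* mid-range, degrees */
        };

        static double clamp(double v, double lo, double hi)
        {
            return v < lo ? lo : (v > hi ? hi : v);
        }

        /* affect: three parameters in [-1, 1]; targets: motor angles (deg). */
        void affect_to_motors(const double affect[NUM_PARAMS],
                              double targets[NUM_MOTORS])
        {
            for (int m = 0; m < NUM_MOTORS; m++) {
                double t = neutral[m];
                for (int p = 0; p < NUM_PARAMS; p++)
                    t += W[m][p] * affect[p];
                targets[m] = clamp(t, 0.0, 180.0);
            }
        }

        int main(void)
        {
            double affect[NUM_PARAMS] = {0.8, 0.5, 0.0};  /* one affect point */
            double targets[NUM_MOTORS];
            affect_to_motors(affect, targets);
            for (int m = 0; m < NUM_MOTORS; m++)
                printf("motor %d -> %.1f deg\n", m, targets[m]);
            return 0;
        }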
