Using dynamic time warping for online temporal fusion in multisensor systems
Sensor fusion is concerned with gaining information from multiple sensors by fusing across raw data, features or decisions. Traditionally, these fusion processes have concerned only fusion at specific points in time. Recently, however, there has been growing interest in inferring the behavioural aspects of environments or objects monitored by multisensor systems, rather than just their states at specific points in time. To infer environmental behaviours, it may be necessary to fuse data acquired from (i) geographically distributed sensors at specific points in time and (ii) specific sensors over a period of time. Fusing multisensor data over a period of time (also known as temporal fusion) is a challenging task, since the data to be fused consists of complex sequences that are multi-dimensional, multimodal, interacting, and time-varying in nature. Performing temporal fusion efficiently in real time is a further challenge because of the large amounts of data to be fused. To address this, we propose a robust and efficient framework that uses dynamic time warping (DTW) as the core recognizer to perform online temporal fusion on either the raw data or the features. We evaluate the online temporal fusion system on two real-world datasets: (1) accelerometer data acquired from performing two hand gestures, and (2) a benchmark dataset acquired by carrying a mobile device and performing predefined user scenarios. Performance of the DTW-based system is compared with that of a Hidden Markov Model (HMM) based system. The experimental results on both datasets show that the proposed system outperforms the HMM-based system and can perform online temporal fusion efficiently and accurately in real time.
| Main Authors: | Ko, Ming; West, Geoff; Venkatesh, Svetha; Kumar, Mohan |
|---|---|
| Format: | Journal Article |
| Published: | Elsevier, 2008 |
| DOI: | 10.1016/j.inffus.2006.08.002 |
| Online Access: | http://hdl.handle.net/20.500.11937/24407 |
| Access: | Restricted |
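The abstract describes DTW as the core recognizer applied to windows of multi-dimensional sensor data (e.g. 3-axis accelerometer signals) for online recognition. The sketch below illustrates that idea only: a banded multi-dimensional DTW distance and a sliding-window nearest-template classifier. The function names, the Sakoe-Chiba band parameter, and the windowing scheme are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch (not the paper's implementation): banded multi-dimensional
# DTW used as a nearest-template recognizer over a sliding window of sensor data.
import numpy as np


def dtw_distance(query: np.ndarray, reference: np.ndarray, band: int = 10) -> float:
    """DTW distance between two sequences of shape (time, channels).

    A Sakoe-Chiba band of width `band` restricts the warping path, keeping the
    cost roughly linear in sequence length (one way to make DTW cheap enough
    for online use).
    """
    n, m = len(query), len(reference)
    band = max(band, abs(n - m))  # ensure the end point (n, m) stays inside the band
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - band), min(m, i + band) + 1):
            d = np.linalg.norm(query[i - 1] - reference[j - 1])  # per-frame Euclidean cost
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return float(cost[n, m])


def classify_window(window: np.ndarray, templates: dict) -> str:
    """Label the window with the gesture template giving the smallest DTW distance."""
    return min(templates, key=lambda label: dtw_distance(window, templates[label]))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 3-axis accelerometer templates for two hand gestures.
    templates = {
        "gesture_A": rng.standard_normal((60, 3)),
        "gesture_B": rng.standard_normal((80, 3)),
    }
    stream = rng.standard_normal((600, 3))   # stand-in for a live sensor stream
    window_len, hop = 70, 35                 # overlapping windows for online operation
    for start in range(0, len(stream) - window_len + 1, hop):
        window = stream[start:start + window_len]
        print(start, classify_window(window, templates))
```

In an online setting, each incoming window is matched against all templates as it arrives; the band width and window/hop sizes trade recognition latency against accuracy.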