
A Dance Motion Analysis Method Using Motion Capture and Musical Information - Ikeuchi Laboratory


The Structure Analysis of Dance Motions using Motion Capture and Musical Information

Takaaki SHIRATORI†, Atsushi NAKAZAWA††, and Katsushi IKEUCHI†


1. Introduction

… [1] … (metrics) …

† Institute of Industrial Science, The University of Tokyo
†† Cybermedia Center, Osaka University

… [2] … [3] …

X Vol. Jxx–X No. xx pp. 1–10 xxxx xx 1


… Self Motion Element … HMM [4] … [5] … Dynamic Time Warping … [6] … Labanotation [7] … Flash et al. [3] …

2. …

… [8] … Kim et al. [9] …

Fig. 1 Algorithm overview

The rest of this paper is organized as follows: Section 3 …, Section 4 …, Section 5 …, and Section 6 ….

3. Estimation of Musical Rhythm

… [10] …

3. 1 …

3. 2 Onset Component Extraction

The onset component d(t, f) at time t and frequency f is defined from the power spectrum as in Eq. (1):

$$d(t,f)=\begin{cases}\max(p(t,f),\,p(t+1,f)) - p' & (\min(p(t,f),\,p(t+1,f)) \geq p')\\[2pt] 0 & (\text{otherwise})\end{cases}\quad (1)$$

$$p' \equiv \max(p(t-1,f),\,p(t-1,f\pm 1))\quad (2)$$

where p(t, f) is the power at time t and frequency f. … (Fig. 2)

Fig. 2 Onset component extraction

Summing the onset components over frequency gives the onset function D(t) = Σ_f d(t, f) at each time t.
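Eqs. (1)-(2) can be sketched directly in code. Below is a minimal NumPy version, assuming p is a power spectrogram indexed as p[t, f]; the function names and the clipping of f ± 1 at the spectrogram borders are our assumptions, not the paper's:

```python
import numpy as np

def onset_components(p):
    """Onset measure d(t, f) of Eqs. (1)-(2) for a power spectrogram p[t, f]."""
    T, F = p.shape
    d = np.zeros((T, F))
    for t in range(1, T - 1):
        for f in range(F):
            # p' = max power at t-1 over the neighboring bins f-1, f, f+1
            lo, hi = max(f - 1, 0), min(f + 1, F - 1)
            p_prev = max(p[t - 1, lo], p[t - 1, f], p[t - 1, hi])
            # Eq. (1): a rise sustained over frames t and t+1 counts as an onset
            if min(p[t, f], p[t + 1, f]) >= p_prev:
                d[t, f] = max(p[t, f], p[t + 1, f]) - p_prev
    return d

def onset_function(p):
    """D(t) = sum over f of d(t, f)."""
    return onset_components(p).sum(axis=1)
```

A sudden power rise across all bins produces a spike in D(t) at that frame, while stationary frames contribute nothing.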

3. 3 Rhythm Estimation

The autocorrelation R_DD of the onset function D(t) obtained in 3.2 is computed (Eq. (3)):

$$R_{DD}(\tau) = \frac{1}{T}\sum_{t=1}^{T} D(t)\,D(t+\tau)\quad (3)$$

R_DD(τ) is the correlation between D(t) and its shifted copy D(t + τ). … The shift τ that maximizes it is taken as the rhythm interval (t_rhythm). …

Next, a predicted beat signal P(t) is matched against D(t) through the cross-correlation R_DP (Eq. (4)):

$$R_{DP}(\tau) = \frac{1}{T}\sum_{t=1}^{T} D(t)\,P(t+\tau)\quad (4)$$

… The shift t_max that best aligns D(t) with P(t + t_max) gives the beat start time (t_st).
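Eqs. (3)-(4) can be sketched as follows: the lag maximizing the autocorrelation R_DD gives the rhythm interval t_rhythm, and cross-correlating D(t) with a predicted beat signal P(t) gives the beat offset t_st. The impulse-train form of P and the lag search range are our assumptions; the paper's exact definitions were lost in extraction:

```python
import numpy as np

def estimate_rhythm(D, min_lag, max_lag):
    """Return the lag tau maximizing the autocorrelation R_DD (Eq. (3))."""
    T = len(D)
    best_tau, best_r = min_lag, -np.inf
    for tau in range(min_lag, max_lag):
        r = np.dot(D[:T - tau], D[tau:]) / T  # R_DD(tau)
        if r > best_r:
            best_tau, best_r = tau, r
    return best_tau

def estimate_beat_offset(D, t_rhythm):
    """Cross-correlate D with an impulse train P of period t_rhythm (Eq. (4))
    and return the shift maximizing R_DP, i.e. the beat start time t_st."""
    T = len(D)
    best_off, best_r = 0, -np.inf
    for off in range(t_rhythm):
        P = np.zeros(T)
        P[off::t_rhythm] = 1.0   # predicted beat signal (our assumption)
        r = np.dot(D, P) / T     # R_DP at this shift
        if r > best_r:
            best_off, best_r = off, r
    return best_off
```

On a clean periodic onset function the two estimates recover the period and phase exactly; real D(t) is noisy, so peak picking would need smoothing.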

3. 4 Beat Tracking Considering Slight Changes of Rhythm

… 3. 3 … D(t) … (Fig. 3)

Fig. 3 Beat tracking considering slight changes of rhythm

Fig. 4 Body center coordinate system

4. Motion Structure Analysis

…

4. 1 Body Center Coordinate System

… the x axis …, the y axis …, and the z axis … (Fig. 4). …
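The surviving fragments of 4.1 only name the x, y, z axes of a body-center coordinate system (Fig. 4). As an illustration of the general idea, such a frame can be anchored at the pelvis and aligned with the hip markers; the marker choice and axis conventions below are our assumptions, not the paper's definition:

```python
import numpy as np

def body_frame(l_hip, r_hip, up=np.array([0.0, 0.0, 1.0])):
    """Build a body-centered frame from two hip markers.
    Returns (origin, 3x3 rotation whose rows are the body axes)."""
    origin = (l_hip + r_hip) / 2.0
    x = l_hip - r_hip                # left-right axis
    x /= np.linalg.norm(x)
    y = np.cross(up, x)              # facing direction, orthogonal to x
    y /= np.linalg.norm(y)
    z = np.cross(x, y)               # vertical axis completing the frame
    return origin, np.stack([x, y, z])

def to_body_coords(p_world, origin, R):
    """Express a world-space marker position in the body-centered frame."""
    return R @ (p_world - origin)
```

Expressing all marker trajectories in this frame removes whole-body translation and turning, so limb speeds can be compared across repetitions of a motion.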

4. 2 Extraction of Segmentation Candidates

In the body-center coordinate system of 4.1, … frames where the speed … falls to 0 … (Fig. 5). … 0 … (Fig. 6).
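The surviving fragments of 4.2 suggest that frames where the speed of the hands, center of mass, or feet drops to zero become segmentation candidates (Figs. 5, 6). A minimal sketch under that reading; the threshold eps and the finite-difference velocity are our assumptions:

```python
import numpy as np

def segmentation_candidates(positions, fps, eps):
    """Return frame indices where an end effector's speed falls to (near)
    zero and is a local minimum; positions is an (N, dim) trajectory."""
    vel = np.diff(positions, axis=0) * fps   # finite-difference velocity
    speed = np.linalg.norm(vel, axis=1)
    return [t for t in range(1, len(speed) - 1)
            if speed[t] < eps
            and speed[t] <= speed[t - 1] and speed[t] <= speed[t + 1]]
```

Running this per effector (each hand, each foot, the center of mass) yields the per-body-part candidate sets that the following subsection filters with the musical rhythm.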

4. 3 Candidate Refinement with Musical Rhythm

Fig. 5 Segmentation candidates extraction - Hands & CM

Fig. 6 Segmentation candidates extraction - Feet

… within (t_rhythm/10) … (Fig. 7) …

Fig. 7 Candidates refinement with musical rhythm
(Broken line : segmentation candidate, Dotted line : estimated music rhythm)
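The only parameter surviving from 4.3 is the tolerance (t_rhythm/10). Reading it as "keep a candidate only if it lies within t_rhythm/10 of an estimated beat", a sketch follows; the beat grid t_st + k·t_rhythm comes from 3.3, while the keep/discard rule itself is our interpretation of the fragment:

```python
def refine_with_rhythm(candidates, t_st, t_rhythm, n_beats):
    """Keep only segmentation candidates lying within t_rhythm/10 of an
    estimated beat time t_st + k * t_rhythm."""
    beats = [t_st + k * t_rhythm for k in range(n_beats)]
    tol = t_rhythm / 10.0
    return [c for c in candidates
            if any(abs(c - b) <= tol for b in beats)]
```

Candidates caused by incidental pauses between beats are discarded, while stops synchronized with the music survive as segment boundaries.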

4. 4 …

… 4. 3 …

5. Experiments

5. 1 Experimental Setup

… The accompanying music was recorded through a USB … to a PC as wav data (32000 Hz, 16 bit/sample). Motion data were captured with an optical motion capture system by Vicon Motion Systems. …

Each captured sequence is about 20 s long (2400 frames at 120 fps, or 4000 frames at 200 fps). … The dance manuals give 9 key poses for the Aizu-bandaisan dance and 12 for the Jongara-bushi dance (Fig. 9).

5. 2 Results of Beat Tracking

The FFT used a window of 1024 samples with a shift of 256 samples. … The estimated beat intervals were 0.704 s (85 M.M.) and 0.576 s (104 M.M.) for the two pieces (M.M., Mälzel's Metronome, is the number of beats per minute). Figs. 8(a) and 8(b) show the results. …
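With the parameters reported here (window 1024, shift 256, and the 32000 Hz sampling rate from 5.1), the onset analysis has a time resolution of 256/32000 = 8 ms per frame, and a beat period in seconds converts to M.M. as 60 divided by the period. A quick check of the reported figures:

```python
fs = 32000              # audio sampling rate [Hz], from 5.1
shift = 256             # FFT shift [samples]
frame_dt = shift / fs   # time per onset-analysis frame [s]

def to_mm(beat_period_s):
    """Beat period in seconds -> M.M. (beats per minute)."""
    return 60.0 / beat_period_s
```

The reported intervals are consistent: 60/0.704 ≈ 85 M.M. and 60/0.576 ≈ 104 M.M.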

5. 3 Results of Motion Structure Analysis

… [8] …

5. 3. 1 Aizu-bandaisan Dance

… (Fig. 10) …

(a) Aizu-bandaisan dance
(b) Jongara-bushi dance

Fig. 8 Results of beat tracking
Top : amplitude, Middle : spectrum, Bottom : D(t) as described in 3.2
(Straight line in spectrum : music rhythm)

… 4. 4 …


Table 1 Result of Aizu-bandaisan dance - a Female Dancer
13, 89% (8/9), 5
9, 100% (9/9), 0

Table 2 Result of Aizu-bandaisan dance - a Male Dancer
3, 33% (3/9), 2
9, 100% (9/9), 0

… 4. 4 …

5. 3. 2 Jongara-bushi Dance

… (Fig. 11) …

Table 3 Result of Jongara-bushi dance
6, 50% (6/12), 0
9, 75% (9/12), 0

… 4. 4 …

6. Conclusion

…

Acknowledgments … (CREST) …

Fig. 9 Key poses extracted from manuals: (a) Aizu-bandaisan dance, (b) Jongara-bushi dance

Fig. 10 Results of key poses extraction - Aizu-bandaisan of a female dancer
Top : speed sequence, Middle : key poses extracted by our method, Bottom : key poses extracted by dancers
(Broken line : segmentation candidate, Straight line : music rhythm)

Fig. 11 Results of key poses extraction - Jongara-bushi dance
Top : speed sequence, Middle : key poses extracted by our method, Bottom : key poses extracted by dancers
(Broken line : segmentation candidate, Straight line : music rhythm)

References
[1] D. M. Gavrila: "The visual analysis of human movement: a survey", Journal of Computer Vision and Image Understanding, 73, 1, pp. 82-98 (1999).
[2] D. Miyazaki, T. Oishi, T. Nishikawa, R. Sagawa, K. Nishino, T. Tomomatsu, Y. Takase and K. Ikeuchi: "Great Buddha project: Modeling cultural heritage through observation", In Proc. of the Sixth International Conference on Virtual Systems and MultiMedia (VSMM 2000), pp. 138-145 (2000).
[3] T. Flash and N. Hogan: "The coordination of arm movements: an experimentally confirmed mathematical model", J. Neuroscience, pp. 1688-1703 (1985).
[4] T. Inamura, Y. Nakamura, H. Ezaki and I. Toshima: "Imitation and primitive symbol acquisition of humanoids by the integrated mimesis loop", In Proc. of International Conference on Robotics and Automation (ICRA 2001), pp. 4208-4213 (2001).
[5] O. C. Jenkins, M. J. Matarić and S. Weber: "Primitive-based movement classification for humanoid imitation", In Proc. of First IEEE-RAS International Conference on Humanoid Robotics (Humanoids 2000) (2000).
[6] …: "…", …, 15, 8, pp. 878-885 (2000) (in Japanese).
[7] …: "… labanotation … labanreader … labaneditor …", …, 38-6, pp. 61-68 (1998) (in Japanese).
[8] A. Nakazawa, S. Nakaoka, K. Ikeuchi and K. Yokoi: "Imitating human dance motions through motion structure analysis", In Proc. of International Conference on Intelligent Robots and Systems (IROS 2002), pp. 2539-2544 (2002).
[9] T. Kim, S. I. Park and S. Y. Shin: "Rhythmic-motion synthesis based on motion-beat analysis", In Proc. of ACM SIGGRAPH 2003 (2003).
[10] M. Goto: "An audio-based real-time beat tracking system for music with or without drum-sounds", Journal of New Music Research, 30, 2, pp. 159-171 (2001).

… xx xx xx …

Takaaki Shiratori … 2002 … 2004 …

Atsushi Nakazawa … 1997 … 2001 … 2003 …

Katsushi Ikeuchi … 1973 … 1978 … MIT … CMU … 1996 … 2000 … D. Marr Prize (ICCV, 1990), IEEE … (CVPR, 1991), … (AI Journal, 1992), … Fu … (IEEE Trans. R&A, 1998), IEEE Distinguished Lecturer (SPS 2000-2001, CS 2004-2006), IEEE Fellow.


Abstract This paper describes a dance motion analysis method that takes musical rhythm into account. Many important intangible cultural properties are being lost for lack of successors, so we have started a digital archiving project for intangible cultural heritage. Our goal in archiving dance motions is to detect the "primitive motions" of which dance motions are composed. We believe these primitives must be synchronized to the musical rhythm, so the proposed method automatically segments the original motion in accordance with the estimated musical rhythm to detect the primitive motions.
Key words Motion Capture, Primitive Motions, Music, Beat Tracking
